[Synthesis] networked cameras; potential response to activity vs actual activity

Given Tain, Cooper, and other folks’ findings, IMHO we should use wifi networked cameras, but not GoPros.  (Can we borrow two from somewhere short term to get the video network going?)  Comments on networked cameras?  
But let’s keep the focus on portals for sound and gesture / rhythm.

Keep in mind the point of the video portal is not to look AT an image in a rectangle, but to suture different regions of space together. 

Cooper’s diagram (see attached) offers a chance to make another distinction about Synthesis research strategy, carrying on from the TML, that is experientially and conceptually quite different from representationalist or telementationalist uses of signs.  

(Another key tactic: to eschew allegorical art.)

A main Synthesis research interest here is not about looking-at an out-of-reach image far from the bodies of the inhabitants of the media space, 
but how to either:
(1) use varying light fields to induce senses of rhythm and co-movement
(2) animate puppets (whether made of physical material, light, or light on matter).

The more fundamental research concern is not the actual image or sound, but the artful design of the potential responsivity of the activated media to action.
This goes for projected video, lighting, acoustics, synthesized sound, as well as kinetic objects.
All we have to do this summer is build out enough kits, or aspects of the media system, to demonstrate these ways of working, in the format of some attractive installations in the ordinary space.

Re. the proto-research goals this month: we do NOT have to pre-think the design of the whole end-user experience, but rather build some kits that would allow us to demonstrate the promise of this approach to VIP visitors in July and students in August, and to work with in September when everyone’s back.

Cheers for the smart work to date,
Xin Wei

Re: Go pro fact finding research

Hi all, 

I would also like to add some thoughts on using GoPro.  To begin with, I am a huge fan of GoPro cameras, and I was testing the GoPro Hero 3+ Black Edition (the latest model) last April.  It would be great if AME purchased new GoPro cameras, but maybe they are not the best choice for the Brickyard projects.


[Image 1] Six GoPros on a special mount made with a 3D printer.  It didn't really work in the Arizona weather.  Photos taken near Tempe.

As far as I remember, the idea of using GoPro cameras started because we wanted to locate cameras outside of the building without any wires connected to the computers.  

GoPro cameras are small, lightweight, feature-loaded, and designed to be used under extreme conditions.  However, since they are not built for long-term use, they have battery, overheating, and Wi-Fi range issues.  The battery lasts less than an hour, and if we connect an external power source the camera overheats and shuts off once in a while.

I also found ways to stream full video into a web browser or VLC media player, and a Mac app for controlling the camera and doing low-resolution live previews.  Unfortunately, most of these solutions are limited to low-resolution previews.  Trying GoPro cameras in Max/Jitter might still be a very interesting and challenging project, but we can also consider different options.

For example, wifi network cameras could be another good choice.  They are designed specifically for long-term live streaming without being connected to a computer.  As long as the camera is connected to a power source and the network, it doesn't even need to be located near the computers.  The image below shows a previous project of mine that used two network cameras in an outdoor environment.

[Image 2] The network camera was located outside of the building.  All the machines controlling the LED ceiling were in a different building, so we couldn't use wires to connect the camera to a computer.
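For what it's worth, many wifi network cameras serve their stream as MJPEG over HTTP, which is easy to consume without vendor software. Here is a minimal, hedged sketch of splitting such a byte stream into individual JPEG frames (the function name is mine; this hasn't been tested against any specific camera model):

```python
# Sketch: split an MJPEG-over-HTTP byte stream into individual JPEG frames.
# Assumes the camera serves multipart MJPEG, as many network cameras do.
SOI, EOI = b"\xff\xd8", b"\xff\xd9"  # JPEG start-of-image / end-of-image markers

def extract_frames(buf: bytes):
    """Return (complete JPEG frames found in buf, leftover bytes)."""
    frames = []
    while True:
        start = buf.find(SOI)
        end = buf.find(EOI, start + 2) if start != -1 else -1
        if start == -1 or end == -1:
            break
        frames.append(buf[start:end + 2])
        buf = buf[end + 2:]  # keep any trailing partial frame for the next read
    return frames, buf
```

In practice one would read chunks from the camera's URL (e.g. with urllib) and feed them through `extract_frames`, handing each complete frame to a decoder or on to Max/Jitter.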

Please let me know your ideas and thoughts.

Much appreciated,
Cooper


On Fri, Jun 27, 2014 at 9:21 AM, Sha Xin Wei <shaxinwei@gmail.com> wrote:
Standard HTTP would very likely be a bad way to send video, since it was designed for sending bytes of characters or a few embedded media objects.   But Max's jit.* objects used to be able to read RTSP video streams as well (with the rtsp:// syntax?).

Such long latencies would make the GoPro impossible for synchronous action, but we should also think about how to intercalate activity based on durational rhythm, not point-synced instantaneous events.

Also, time-delaying the feed by 3 hours would be an interesting first step toward intercalating Montreal and Phoenix.
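A 3-hour offset amounts to a fixed-delay buffer in front of playback. A toy sketch of the idea (names are mine; a real version would spool frames to disk rather than holding them in RAM):

```python
from collections import deque

class DelayedRelay:
    """Sketch of a fixed-delay relay: frames go in with a timestamp and come
    out only after `delay` seconds (e.g. 3 * 3600 for a 3-hour offset)."""

    def __init__(self, delay: float):
        self.delay = delay
        self.buf = deque()  # (timestamp, frame) pairs, oldest first

    def push(self, t: float, frame):
        self.buf.append((t, frame))

    def pop_ready(self, now: float):
        """Return all frames whose delay has elapsed as of `now`."""
        out = []
        while self.buf and now - self.buf[0][0] >= self.delay:
            out.append(self.buf.popleft()[1])
        return out
```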

Xin Wei

__________________________________________________________________________________
Professor and Director • School of Arts, Media and Engineering • Herberger Institute for Design and the Arts / Director • Synthesis / ASU
Founding Director, Topological Media Lab / topologicalmedialab.net/  /  skype: shaxinwei / +1-650-815-9962
__________________________________________________________________________________






On Jun 26, 2014, at 9:41 PM, Garrett Johnson <gljohns6@asu.edu> wrote:

There is a browser object in Max. Not sure how it works with video, but it sounds like it's worth a shot. 

Garrett L. Johnson
MA musicology 
Synthesis Center research team 
Arizona State University


Am Jun 26, 2014 um 15:15 schrieb Tain Barzso <Tain.Barzso@asu.edu>:

Should have forwarded to the group initially:

Doing some initial research which I will present to the group.  On the Go Pro:

While the wireless feature is designed specifically for controlling the camera and doing low-resolution live previews in the GoPro app, there is a way to stream full video into a web browser via the camera's built-in web server.  If any of the applications Synthesis is using can access this kind of stream (via http://), we should be able to grab video.  A caveat is that the live-streamed video has very significant lag (seems to be > 1 second).  This is likely dependent on resolution; I would guess 720p video would have less lag than 1080p, but we'd have to try it to see.

Also, it looks like there is a cable that allows us to both plug in an external (3.5mm) microphone AND still use the camera's USB port as a power input, so as to avoid running off the battery.

Here is a video that a YouTube user posted covering the video-streaming capability:  https://www.youtube.com/watch?v=Y1XaBJZ8Xcg

Tain

streaming video from small cameras in a local net: Go pro fact finding research

There are many solutions for a suite of video cams.  It’s not a bad idea to repurpose some cheaper commercial solutions for a network of video+audio cameras.  Analog solutions have much less latency than these digital solutions (even after an A2D converter), and they may still be a lot cheaper at an equivalent resolution.   

Is there (still?) a greater variety of lens options available for analog cameras?

Be sure to get wide angle lenses for whatever camera solution you come up with.

Also, threaded lenses may make it easier to affix IR pass filters as necessary.

Cheers,
Xin Wei

__________________________________________________________________________________
Professor and Director • School of Arts, Media and Engineering • Herberger Institute for Design and the Arts / Director • Synthesis / ASU
Founding Director, Topological Media Lab / topologicalmedialab.net/  /  skype: shaxinwei / +1-650-815-9962
__________________________________________________________________________________

Re: Brickyard media systems architecture; notes for Thursday BY checkin

Hey All, 

Julian and I are here. Are you guys on Skype or Google+? 


(sent from a phone)

On Jun 26, 2014, at 5:42 PM, Byron Lahey <byron.lahey@asu.edu> wrote:



On Thu, Jun 26, 2014 at 10:51 AM, Katie Jung <katiejung1@gmail.com> wrote:
Hi All,
I will beam in online.
See you then!


On Thu, Jun 26, 2014 at 11:48 AM, Peter Weisman <peter.weisman@asu.edu> wrote:
> Yes. Lets all meet at Brickyard.
>
> Pete
>
>
> Sent from my Galaxy S®III
>
>
> -------- Original message --------
> From: Garrett Laroy Johnson
> Date:06/26/2014 8:41 AM (GMT-07:00)
> To: Tain Barzso
> Cc: Peter Weisman , Sha Xin Wei , Katie Jung , Julian Stein , "Byron Lahey
> (Student)" , Assegid Kidane , Garrett Laroy Johnson , Kevin Klawinski ,
> "Matthew Briggs (Student)" , Xin Wei Sha , post@synthesis.posthaven.com
> Subject: Re: Brickyard media systems architecture; notes for Thursday BY
> checkin
>
> 2.30 is good for me too. Meet at BY?
>
>
> Garrett
>
>
> On Jun 26, 2014, at 8:34 AM, Tain Barzso <Tain.Barzso@asu.edu> wrote:
>
> 2:30 works for me, as well.
>
> From: Peter Weisman <peter.weisman@asu.edu>
> Date: Wed, 25 Jun 2014 21:07:34 -0700
> To: Sha Xin Wei <shaxinwei@gmail.com>, Peter Weisman
> <peter.weisman@asu.edu>, Katie Jung <katiejung1@gmail.com>, Julian Stein
> <julian.stein@gmail.com>, "Byron Lahey (Student)" <Byron.Lahey@asu.edu>,
> Assegid Kidane <Assegid.Kidane@asu.edu>, Tain Barzso <Tain.Barzso@asu.edu>,
> Garrett Laroy Johnson <gljohns6@asu.edu>, Kevin Klawinski
> <kklawins@asu.edu>, "Matthew Briggs (Student)" <Matthew.Briggs@asu.edu>
> Cc: Xin Wei Sha <Xinwei.Sha@asu.edu>, "post@synthesis.posthaven.com"
> <post@synthesis.posthaven.com>
> Subject: RE: Brickyard media systems architecture; notes for Thursday BY
> checkin
>
> I am free at 2:30pm AZ time on Thursday.  Can we all meet then?
>
> Pete
>
>
> Sent from my Galaxy S®III
>
>
> -------- Original message --------
> From: Sha Xin Wei
> Date:06/25/2014 8:14 PM (GMT-07:00)
> To: Peter Weisman , Katie Jung , Julian Stein , "Byron Lahey (Student)" ,
> Assegid Kidane , Tain Barzso , Garrett Laroy Johnson , Kevin Klawinski ,
> "Matthew Briggs (Student)"
> Cc: Xin Wei Sha , post@synthesis.posthaven.com
> Subject: Brickyard media systems architecture; notes for Thursday BY checkin
>
> Hi Everyone,
>
> This note is primarily about media systems, but should connect with the
> physical interior design, furniture and lighting etc.
>
> Thanks Pete for the proposed media systems architecture summary.  I started
> to edit the doc but then bogged down because the architecture was
> computer-centric, and it’s clearer as a diagram.
>
> Instead of designing around the 2 Mac Minis and the Mac Pro computers I’d
> like to design the architecture around relatively autonomous i/o devices :
>
> Effectively a network of:
> Lamps (for now addressed via DMX dimmers), floor and desk lamps as needed to
> provide comfortable lighting
> 5 Wifi GoPro audio-video cameras (3 in BY, 1 Stauffer, 1 iStage)
> xOSC-connected sensors and stepper motors,  ask Byron
> 1 black Mac Pro on internet switch
> processing video streams, e.g. extracting video feature emitted in OSC
> 1 Mac Minis on internet switch
> running lighting master control Max patch, ozone media state engine (later)
> 1 Mac Mini on internet switch
> processing audio, e.g. extracting audio features emitted in OSC
> 1 iPad running Ozone control panel under Mira
> 1 Internet switch in Synthesis incubator room for drop-in laptop processing
> 8 x transducers (or Genelecs — distinct from use in lab or iStage, I prefer
> not to have “visible” speaker boxes or projectors in BY — perhaps flown in
> corners or inserted inside Kevin’s boxes, but not on tables or the floor )
>
> The designer-inhabitants should be able to physically move the lamps and
> GoPros and stepper motor’ed objects around the room, replugging into few
> cabled interface devices (dimmer boxes) as necessary.
>
> Do the GoPro’s have audio in?   Can we run contact or boundary or air mic
> into one ?
>
> We could run ethernet from our 3 computers to the nearest wall ethernet
> port?
>
> Drop-in researchers should be able to receive via OSC whatever sensor data
> streams (e.g. video and audio feature streams cooked by the fixed black
> MacPro and audio Mini)
> and also emit audio / video / control OSC into this BY network.
>
> DITTO iStage.
>
> Can we organize an architecture session Thursday sometime ?
> I’m sorry since I have to get my mother into post-accident care here in
> Washington DC I don’t know when I’ll be free.
>
> Perhaps Thursday sometime between 2 and 5 PM Phoenix?  I will make myself
> available for this because I’d like to take up a more active role now thanks
> to your smart work.
>
> Looking forward to it!
> Xin Wei
>
>
> PS. Please let’s email our design-build to post@synthesis.posthaven.com  so
> we can capture the design process and build on accumulated knowledge in
> http://synthesis.posthaven.com
>
> __________________________________________________________________________________
> Sha Xin Wei
> Professor and Director • School of Arts, Media and Engineering • Herberger
> Institute for Design and the Arts / Director • Synthesis / ASU
> Founding Director, Topological Media Lab / topologicalmedialab.net/  /
> skype: shaxinwei / +1-650-815-9962
> __________________________________________________________________________________
>
>
>
>
>
>



--
***
Katie Jung
(514) 502 0205


Brickyard media systems architecture; notes for Thursday BY checkin

Hi Everyone,

This note is primarily about media systems, but should connect with the physical interior design, furniture and lighting etc.

Thanks Pete for the proposed media systems architecture summary.  I started to edit the doc but then got bogged down because the architecture was computer-centric; it’s clearer as a diagram.   
 
Instead of designing around the 2 Mac Minis and the Mac Pro computers, I’d like to design the architecture around relatively autonomous i/o devices:

Effectively a network of:
- Lamps (for now addressed via DMX dimmers): floor and desk lamps as needed to provide comfortable lighting
- 5 wifi GoPro audio-video cameras (3 in BY, 1 in Stauffer, 1 in the iStage)
- xOSC-connected sensors and stepper motors (ask Byron)
- 1 black Mac Pro on the internet switch, processing video streams, e.g. extracting video features emitted as OSC
- 1 Mac Mini on the internet switch, running the lighting master-control Max patch and, later, the Ozone media state engine
- 1 Mac Mini on the internet switch, processing audio, e.g. extracting audio features emitted as OSC
- 1 iPad running the Ozone control panel under Mira
- 1 internet switch in the Synthesis incubator room for drop-in laptop processing
- 8 transducers (or Genelecs; distinct from their use in the lab or iStage. I prefer not to have “visible” speaker boxes or projectors in BY; perhaps flown in corners or inserted inside Kevin’s boxes, but not on tables or the floor)

The designer-inhabitants should be able to physically move the lamps, GoPros, and stepper-motored objects around the room, replugging into the few cabled interface devices (dimmer boxes) as necessary.

Do the GoPros have audio in?   Can we run a contact, boundary, or air mic into one?

We could run ethernet from our 3 computers to the nearest wall ethernet port?

Drop-in researchers should be able to receive via OSC whatever sensor data streams are available (e.g. video and audio feature streams cooked by the fixed black Mac Pro and the audio Mini), and also emit audio / video / control OSC into this BY network. 

DITTO iStage.
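For a drop-in laptop, consuming those feature streams is mostly a matter of decoding OSC off a UDP socket. A hedged sketch of a minimal decoder for messages whose arguments are all int32 (illustrative only; the real Ozone streams would likely carry floats and other types too):

```python
import struct

def decode_osc_int_message(data: bytes):
    """Decode a minimal OSC 1.0 message whose arguments are all int32.
    Returns (address, [ints]). `data` would typically come from a UDP
    socket's recvfrom(); this sketch only handles the 'i' type tag."""
    def read_padded_string(buf, i):
        end = buf.index(b"\x00", i)
        s = buf[i:end].decode()
        i = end + 1
        i += (4 - i % 4) % 4  # skip padding to the next 4-byte boundary
        return s, i

    addr, i = read_padded_string(data, 0)
    tags, i = read_padded_string(data, i)  # e.g. ",ii" for two int32 args
    vals = [struct.unpack_from(">i", data, i + 4 * k)[0]
            for k in range(len(tags) - 1)]
    return addr, vals
```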

Can we organize an architecture session sometime Thursday?  
I’m sorry, but since I have to get my mother into post-accident care here in Washington, DC, I don’t know when I’ll be free.

Perhaps Thursday sometime between 2 and 5 PM Phoenix?  I will make myself available for this because I’d like to take up a more active role now thanks to your smart work.

Looking forward to it!
Xin Wei


PS. Please let’s email our design-build to post@synthesis.posthaven.com  so we can capture the design process and build on accumulated knowledge in http://synthesis.posthaven.com 

__________________________________________________________________________________
Professor and Director • School of Arts, Media and Engineering • Herberger Institute for Design and the Arts / Director • Synthesis / ASU
Founding Director, Topological Media Lab / topologicalmedialab.net/  /  skype: shaxinwei / +1-650-815-9962
__________________________________________________________________________________




MIRA TouchOSC, lighting in BY

Hi all, 
   Xin Wei + tech team, I'll carry on the discussion re: TouchOSC vs. Mira here, as it relates to the lighting thread.

Matt and I have discussed taking on everyday lighting, starting now but carrying on through the Fall.  

A la TML/iStage, I wanted to set up a Mira controller for the conference/lunch area with various presets: conference, lunch, party mode, mood lighting, dynamic. I had Mira in mind for its smooth user interface, but I hadn't thought about using TouchOSC (or OSC Control, which is the free one I found; TouchOSC appears to cost $4.99). Either is suitable for scaling up. Using Julian's tml.dmx_op (+ or -), we can have multiple controllers going into the same DMX universe without discrepancy. 
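Since TouchOSC, Mira, and the Max patch all ultimately speak OSC, it's worth noting how little is needed on the wire. A sketch that hand-encodes an OSC message with int32 arguments per the OSC 1.0 spec (the /dmx address and channel/level arguments are hypothetical, not an actual address in our patches):

```python
import struct

def osc_message(address: str, *ints: int) -> bytes:
    """Encode a minimal OSC 1.0 message with int32 arguments, e.g.
    osc_message("/dmx", 12, 255) for a hypothetical channel/level pair.
    Address and type-tag strings are NUL-terminated and padded to
    4-byte boundaries, as the spec requires."""
    def pad(s: bytes) -> bytes:
        return s + b"\x00" * (4 - len(s) % 4)
    msg = pad(address.encode()) + pad(("," + "i" * len(ints)).encode())
    for v in ints:
        msg += struct.pack(">i", v)
    return msg
```

The resulting bytes could then go out via `socket.sendto` to whatever UDP port the lighting patch listens on (port and address are assumptions).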

Here are some links to some encouraging models, some of them more theatrical or black box space-y, but all using conventional, everyday lighting apparatuses:

 

I'm not familiar with the Riemannian geometry work, but Leibniz's monads do seem to resonate with this kind of work. The monad, as a unit of a kind of proto-systems theory, provides an interesting model for assemblages of lighting, clusters and pairings of energies. I've seen the monad used, as Leibniz himself did, as a mode of viewing Western musical structures. Most interesting, however, are attempts to account not only for structure, but also for those timbral sonic parameters neglected by Western musical notation. 

A parallel in lighting could perhaps serve as a starting point. Musical structure might be akin to the spatial placement (topological or geometrical) of each light bulb. How the lights are structured, how they're bound to one another, how (and whether) they hang, what kind of bulb we're using, the wattage: these are all important timbral considerations that define how each monad functions within the larger ecology. 

This is also defined by movement/rhythm, a seemingly critical aspect of identifying relationships between monads (lights). This sort of work has also been done at the TML; my hope is that we can build on it together as we go forward over the next few months. 

Matthew has done some great research into proper threaded cables and light bulbs for the space, and has spec'd what promises to make the environment here a much warmer space. He has, with Luke, ordered supplies to create 20-30 lights. We should have them in ~7 business days, and then we can start throwing lights up. Ideally we will pair this with the grid beam / desks. 

Other lighting possibilities are non-traditional; Katie purchased an old-school overhead projector from Surplus, which I've hooked up to DMX. It might be interesting to find some materials (tissue paper?) that provide an interesting degree of translucence. Perhaps they could be manipulated by some servos (Byron?). Robotic shadow puppets could also be an interesting endeavor. Byron also mentioned sand...

I would also like to continue working with RGB LEDs, although there is a general fear of disco @ BY :-) Although the testing with LEDs has admittedly been "disko-orientiert," it would be nice to build them into window boxes. Using a simple pixel selector (thanks Julian), we can use incoming video feeds to detect the sky's hue and reproduce it in artificial windows (rectangular, oval, asymmetrical, as we please), which do not have to be on walls. 
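The sky-hue idea reduces to averaging a sampled region and converting to HSV. A stdlib-only sketch (the function name is mine, and the region selection is assumed to happen upstream, e.g. via Julian's pixel selector):

```python
import colorsys

def average_sky_hue(pixels):
    """Average the RGB of sky-region pixels (0-255 triples), then return the
    hue (0.0-1.0) of the average colour, to drive the LED 'window'.
    Averaging in RGB before converting avoids hue wrap-around artifacts."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / (255.0 * n)
    g = sum(p[1] for p in pixels) / (255.0 * n)
    b = sum(p[2] for p in pixels) / (255.0 * n)
    return colorsys.rgb_to_hsv(r, g, b)[0]
```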

All best,
Garrett 


On Sun, Jun 22, 2014 at 9:24 AM, Sha Xin Wei <shaxinwei@gmail.com> wrote:
Hi, 

I’d like to propose a research sub-thread on lighting design for the Brickyard commons this Fall
inspired and guided by specific practical and aesthetic intents.

For foveal work, when someone is at a work site:
lighting either work zones or ceilings with floor lamps.
BUT the research goal is NOT to beam light onto objects or work surfaces a la electric-era lighting design,
but to emulate and learn from skylight and shafts of sunbeams (preferably natural light*) directed by agencies 
summing both the individual as well as the ambient
(e.g. outdoor sky colour temp => colour temp of interior washes, 
amount of activity in other parts of the floor => …, 
sound timbre => ….)
Criteria: enlivening, and able to serve 
either focused foveal work or support reverie, depending on
the inhabitant’s phenomenological disposition, which can flicker.
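As a toy illustration of the "outdoor sky colour temp => colour temp of interior washes" mapping: a linear sketch, where the 2700 K and 6500 K endpoints are my assumptions, not measured values.

```python
def wash_mix(outdoor_kelvin: float, lo: float = 2700.0, hi: float = 6500.0):
    """Map an outdoor colour temperature to a (warm, cool) dimmer pair for an
    interior wash: the cooler the sky, the more of the cool channel.
    Values outside [lo, hi] are clamped."""
    t = min(max((outdoor_kelvin - lo) / (hi - lo), 0.0), 1.0)
    return (1.0 - t, t)
```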

For night and the default empty state (when people leave the commons): 
use a cloud of bulbs in amorphous, bunched constellations,
softened by animation (ramped brightening / darkening) and by slightly intermingling clusters.

Clusters can brighten over the heads of people where they gather. 
(Use  motion to initiate, presence to stay on…)

Message and actuality: even when empty of people, the room is still working; 
research is still happening with non-human agencies. 

Below I include a snip from The Fountain to give a sense of the volumetric density
and constant shifting of perspective that we can perhaps induce as people move through the space.
This has a direct, poetic connection with the conceptual work referencing
Deleuze & Guattari, Leibniz’s monadology, and
Riemannian geometric approaches to design (a special volume is being prepared).


I’d like to clearly distinguish this from the iStage blackbox atmosphere; we can use
theatrical technologies with non-blackbox results.

Of course this will connect with everyone’s work in the BY, 
but who will be responsible for the research direction and push the aesthetic-technical work with me?
Who can own this as a focus?

Cheers,
Xin Wei

* It seems ethico-aesthetically ugly to use electrical light when we have so much sunlight to draw from.
It takes careful thought to see how to respond to this, subject to our practical time+budget constraints.



-- 
Garrett L. Johnson
Music History TA 
Arizona State University 
School of Music

[Synthesis][TML]: table-portal test (Was: [Portal research stream] opening up the audio-video portals between Synthesis Brickyard and TML studios )

Hi,

This is a good example of the sort of poetic portal behaviours and forms we might explore, in great variety, over a brief period of time.

Bravo Evan! :)

Portal = porthole experiment

What I’d like to see us experiment with as well are
multiple pairs of small (20 cm) disk portals with similar wipe => reveal behaviours,
set up on 24x7 live streams.  Use cameras and/or sub-regions of video streams that yield restricted views
of only a small field of view.

Then we can explore the invention of social protocols for tailoring the topology of gaze/portal by physically angling the cameras and pico-projectors on the spot, in live, in situ tests.  
Let’s unbolt projectors and cameras from fixed locations and fixed perspectives!
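The core of the wipe => reveal behaviour on a small disk portal is just an animated circular mask over the video frame. A toy sketch (resolution and coordinates are arbitrary):

```python
def disk_mask(w: int, h: int, cx: float, cy: float, radius: float):
    """Boolean mask (h rows of w columns) that is True inside a disk.
    Animating `radius` from 0 up to the full porthole size produces the
    wipe => reveal; the mask gates which pixels of the live stream show."""
    return [[(x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
             for x in range(w)] for y in range(h)]
```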

The point: Deleuze & Guattari’s rhizomes are not simply networks of identical nodes, as the common mechanical interpretation has it, but manifolds of perspectival monads that vary continuously.   And ethico-aesthetic reflexivity demands that we permit us inhabitants to tailor the topological gaze identification ourselves.   This is not for the programmer-engineer-designer to over-think and "solve" ahead of the dwelling.   I’ll relay Luke’s comments in a subsequent email.

Xin Wei

__________________________________________________________________________________
Professor and Director • School of Arts, Media and Engineering • Herberger Institute for Design and the Arts / Director • Synthesis / ASU
Founding Director, Topological Media Lab / topologicalmedialab.net/  /  skype: shaxinwei / +1-650-815-9962
__________________________________________________________________________________


Begin forwarded message:

From: Michael Montanaro <michael.montanaro@concordia.ca>
Subject: Test table
Date: June 21, 2014 at 12:20:39 PM PDT
To: Xin Wei Sha <Xinwei.Sha@asu.edu>, Sha Xin Wei <shaxinwei@gmail.com>, "Evan Montpellier" <evan.montpellier@gmail.com>

Have a look. We will test the live portal tomorrow with Niko in Greece. 



take care
m


Michael Montanaro, Chair
Department of Contemporary Dance
Concordia University
Artist/Researcher /  HEXAGRAM 
Centre for Research-Creation in Media Arts and Technologies