getting rhythms into Ozone OSC via Julian's rhythm kit

I’d say the exercise would be to plug M Bateman’s drum kit — or any MIDI drum pad that AME may have in its DC pile — into a Mac set aside for rhythm work, as Garrett suggested.

M Bateman: Run my scratch patch on the Mac that Ozzie (or Pete) assigns for rhythm work in iStage. Tweak my dummy code to send the MIDI to some of Julian’s lighting control patch’s parameters. You can ask Pete or any of the ultra-friendly AME / SC media artist-researchers for tips, or one of us (Pete or Garrett or ...) can do that for you. Mapping MIDI note (or channel) to DMX channel, and the scaled velocity to light intensity, would be a good and very easy start. (If the hardware is plugged into the Ozone lighting control, it should be a 5-10 minute exercise.)
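This isn’t the actual patch, just a minimal Python sketch of that mapping, assuming the mido package for MIDI input and a placeholder send_dmx() standing in for whatever output the Ozone lighting control actually exposes (the note-to-channel numbers are illustrative):

# Minimal sketch: MIDI drum pad -> DMX light intensity.
# `mido` handles MIDI input; send_dmx() is a placeholder for the real
# DMX output of the Ozone lighting control.
import mido

def send_dmx(channel, value):
    """Placeholder: write one DMX channel value (0-255) to the lighting interface."""
    print(f"DMX ch {channel} -> {value}")

# Map drum-pad MIDI notes to DMX channels (illustrative numbers only).
NOTE_TO_DMX = {36: 1, 38: 2, 42: 3, 46: 4}    # kick, snare, closed hat, open hat

with mido.open_input() as port:               # default MIDI input port
    for msg in port:
        if msg.type == 'note_on' and msg.note in NOTE_TO_DMX:
            # Scale MIDI velocity (0-127) to DMX intensity (0-255).
            send_dmx(NOTE_TO_DMX[msg.note], min(255, msg.velocity * 2))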


Then we can talk —

I’d take you through some simple ways to take a sequence and play it back —
with delay, reverse, echo, loop, etc. Hint: [ zl group ] and some other [ zl ] objects for Max’s standard list manipulation, together with Max’s delay object, etc., could be a start…
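Outside Max, the same logics look roughly like this toy Python sketch, which treats a recorded sequence as a list of (time, note, velocity) events; purely illustrative, since in the patch this would be [zl] objects plus Max’s delay objects:

# Toy transforms on a recorded rhythm: each event is (time_sec, note, velocity).
def delay(seq, dt):
    return [(t + dt, n, v) for t, n, v in seq]

def reverse(seq):
    end = max(t for t, _, _ in seq)
    return sorted((end - t, n, v) for t, n, v in seq)

def echo(seq, dt, decay=0.5, repeats=3):
    out = list(seq)
    for i in range(1, repeats + 1):
        out += [(t + i * dt, n, int(v * decay ** i)) for t, n, v in seq]
    return sorted(out)

def loop(seq, times):
    length = max(t for t, _, _ in seq)     # crude loop length: last onset time
    return sorted((t + i * length, n, v) for i in range(times) for t, n, v in seq)

beat = [(0.0, 36, 100), (0.5, 38, 90), (1.0, 36, 100), (1.5, 42, 70)]
print(echo(reverse(beat), dt=0.25))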

But I’m sure Julian’s got many such logics already — so it may be most efficient simply to schedule a G+ (with screenshare) for Bateman, Julian, and one local Max artist like Garrett or you.


Again, the key is simply to do the easy step of connecting Bateman’s drum kit to the iStage lights. Everything else will follow in small easy steps, with other folks doing the other mappings to sound and video etc. with existing instruments.

So we would have five ways of getting rhythms into the iStage network (Ozone):

1) video cam feed,
2) mic feed,
3) drum (percussion MIDI) feed,
4) accelerometers from Chris Z’s iPods, maybe?
5) accelerometers from Mike K’s kit


Omar, can you and Julian make sure you can use everything in Julian’s latest shiny kit, so you can bring Julian’s refinements to Phoenix?

I’d like to dedicate Monday morning to plugging cables and code together, to verify that we can get these different ways of getting rhythm into the iStage Ozone network mapped to the lights.

Then try it out briefly with Mike K and dancers at 5:30, after Chris Z is finished.

This is just a test parallel to the Lighting Ecology: Darkener and Lighteners game. 

BTW, TouchDesigner would get in the way of getting rhythm exploration going.  And we don’t need it since we’re working collectively with other talents at AME + Synthesis + TML.

Onward!
Xin Wei


On Nov 9, 2014, at 6:33 PM, Peter Weisman <peter.weisman@asu.edu> wrote:

Hi All,
 
I will need a little guidance for this Thursday. I think I know how. But, I am open to any suggestions.
 
I will make arrangements to be available next Monday at 5:30pm.
 
I will be hanging the lights as soon as they get in this week. I will try to make myself available as much as I can.
 
Pete
AME Technical Director
(480)965-9041(O)
 
From: Xin Wei Sha 
Sent: Sunday, November 09, 2014 8:19 AM
To: Michael Krzyzaniak; Julie Akerly; Jessica Rajko; Varsha Iyengar (Student); Ian Shelanskey (Student); Michael Bateman; Peter Weisman; Garrett Johnson
Cc: synthesis-operations@googlegroups.com; Julian Stein; Assegid Kidane; Sylvia Arce
Subject: Adrian Freed: fuelling imagination for inventing entrainments
 
Garrett, Mike K, Julie, Jessica, Varsha, Ian, Bateman, Pete, 
 
Can we schedule time to create and try out some entrainment exercises during the LRR, Nov 17-26?
 
How about we make our first working session as a group on
 
Monday Nov 17 5:30 in the iStage.
 
I’ll work with people in smaller groups.
 
Tuesday afternoon Nov 12, I’ll putter around Brickyard,
or by appointment in the iStage.
 
Ian, Mike Bateman, Pete: let’s run through the lighting controls as Julian gives them to us.
Let’s look at Julian’s Jitter interface together.
I asked Omar to prep a video-to-video mapper patch that he, you and I can
modify during the workshop to implement different rhythm-mappings.
 
Pete can map Bateman's percussion MIDI into the lights (it’s a snap via our Max :)
Thursday 11/14, at 3:30 in iStage?
 
Garrett, Mike K: can you work with Pete (or Ozzie re. video) to map the mic and video inputs in iStage into Julian's rhythm kit:
percussion MIDI,
video camera on a tripod,
overhead video,
 
1 or more audio microphones on tripods,
wireless lapel microphones. 
Julian has code that maps video into rhythm.
Can we try that out Wed 11/12 or Thurs 11/13 in the iStage?
 
 
Let’s find rich language about rhythm, entrainment, temporality, and processuality in movement and sound, relevant to entrainment and co-ordinated rhythm, as the Lighting and Rhythm workshop looms.
 
On Nov 8, 2014, at 11:54 AM, Xin Wei Sha <Xinwei.Sha@asu.edu> wrote:
I would like to enrich our imagination before we lock too much into talking about “synchronization” 
and regular periodic clocks in the Lighting and Rhythm workshop.
 
 
On Nov 8, 2014, at 9:41 PM, Adrian Freed <Adrian.Freed@asu.edu> wrote:
 

Adrian Freed: fuelling imagination for inventing entrainments

Here’s a note relevant to entrainment and co-ordinated rhythm, as the Lighting and Rhythm workshop looms.

Adrian Freed’s collected a list of what he calls “semblance typology of entrainments.”
Notice that he does NOT say “types” but merely a typology of semblances, which helps us avoid reification errors.

Let’s think of this as a way to enrich our vocabulary for rhythm, entrainment, temporality, processuality in movement and sound.
Let’s not use this — or any other list of categories — as an absolute universal set of categories sans context. 
See the comments below.


On Nov 8, 2014, at 11:54 AM, Xin Wei Sha <Xinwei.Sha@asu.edu> wrote:
I would like to enrich our imagination before we lock too much into talking about “synchronization” 
and regular periodic clocks in the Lighting and Rhythm workshop.




On Nov 8, 2014, at 10:01 PM, Adrian Freed <Adrian.Freed@asu.edu> wrote:
I haven't thought about this in a while, but the move to organize the words using the term "together", which I did for the talk at Oxford, is interesting because it allows a formalization in mereotopology à la Whitehead. But I would have to provide an interpretation of enclosure and overlap that involves correlation metrics in some structure, for example CPCA
(Correlational Principal Component Analysis): http://www.lv-nus.org/papers%5C2008%5C2008_J_6.pdf



On Nov 9, 2014, at 7:05 AM, Sha Xin Wei <shaxinwei@gmail.com> wrote:


__________________________________________________________________________________
Sha Xin Wei, Ph.D. • xinwei@mindspring.com • skype: shaxinwei • +1-650-815-9962
__________________________________________________________________________________



Thanks Adrian,

Then I wonder if rank statistics — ordering vs. cardinal metrics — could be a compromise approach.
David Tinapple and Loren Olson here have invented a web system for peer critique called CritViz
that has students rank each other’s projects.  It’s an attention-orienting thing…

Of course there are all sorts of problems with it — the most serious being
herding toward mediocrity, or at best herding toward spectacle.
And it is a bandage invented out of the necessity of dealing with high student/teacher ratios in studio classes.

The theoretical question is whether we can approximate a mereotopology on a space of
Whiteheadian or Simondonian processes using rank ordering,
which may do away with the requirement for coordinate loci.

The Axiom of Choice gives us a well-ordering on any set, so that’s a start,
but there is no effectively decidable way to compute such an ordering for an arbitrary set.
I think that’s a good thing.   And it should be drilled into every engineer.
This means that the responsibility for ordering 
shifts to ensembles in milieu rather than individual people or computers.
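To make the “ordering vs. cardinal metrics” contrast concrete, here is a toy check (assuming numpy and scipy are available): a monotone, order-preserving distortion of the data changes the cardinal (Pearson) correlation but leaves the rank (Spearman) correlation untouched.

# Cardinal vs. rank: Pearson depends on the values, Spearman only on their ordering.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = x + 0.3 * rng.normal(size=200)     # two correlated streams
y_warped = np.exp(3 * y)               # monotone (order-preserving) rescaling

r1, _ = pearsonr(x, y)
r2, _ = pearsonr(x, y_warped)
s1, _ = spearmanr(x, y)
s2, _ = spearmanr(x, y_warped)
print("Pearson :", round(r1, 3), "->", round(r2, 3))   # changes under warping
print("Spearman:", round(s1, 3), "->", round(s2, 3))   # unchanged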

Hmmm, so where does that leave us?

We turn to our anthropologists, historians, 
and to the canaries in the cage — artists and poets…

There’s a group of faculty here, including Cynthia Selin, who are doing what they call scenario [ design | planning | imagining ]
as a way for people to deal with wicked, messy situations like climate change or developing economies.  They seem very prepared for
narrative techniques applied to ensemble events but don’t know anything about theater or performance.
It seems like a situation ripe for exploration, if we can get past the naive phase of slapping conventional narrative genres from
community theater or gamification or info-visualization onto this.

Very hard to talk about, so I want to build examples here.

Xin Wei

"correlation" experiments with Mike K et al ( & Xin Wei ) during Lighting and Rhythm Workshop

Dear Mike, Julie, Varsha, Pavan,

Everyone: Beginning to carry out some "correlation” experiments as core scientific research of the LRR is a very high priority for Synthesis. So thanks to Mike for doing this. (We’ll want to continue this through the year until we generate some publishable insights. :)

I would like to make sure that Pavan is in the loop here with his group.  The goal for the "correlation experiments” during the LRR Nov 17-26 is pretty concrete.  We want to discover some non-isomorphic relations in co-movement, movement by ensembles (or individuals), that those people themselves claim

(1) is “correlated”, mutually responding in some way, or
(2) anticipates or retrospects activity,

either among the humans or the light fields.

(1) Julie, Varsha and Mike can go through some movement “games” / studies responding to the questions laid out in the Lighting and Rhythm workshop.  See the Etudes Motivating Questions.

(2) Xin Wei, Pavan and Pavan’s group can witness this work, talk with Julie, Mike and Varsha,
review videos simultaneously with the streams of data and the correlations,
and look at Mike’s data from accelerometers and his correlation computations.
Mike has recorded data streams concurrent with video of the activity.

(Hint: QT screen-recording off of Jitter patches does the job:
a lesson for TMLabbers who would like to do parallel research in this domain :)
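For anyone reproducing this, a minimal numpy sketch of the kind of running correlation in question, assuming two accelerometer magnitude streams already resampled to a common rate; the window length and the synthetic data below are purely illustrative:

# Running (windowed) correlation between two dancers' accelerometer streams.
import numpy as np

def running_correlation(a, b, window=120):          # e.g. 2 s of data at 60 Hz
    out = np.full(len(a), np.nan)
    for i in range(window, len(a) + 1):
        wa, wb = a[i - window:i], b[i - window:i]
        if wa.std() > 0 and wb.std() > 0:           # avoid dividing by zero
            out[i - 1] = np.corrcoef(wa, wb)[0, 1]
    return out

# Synthetic stand-ins for two recorded streams (30 s at 60 Hz).
t = np.linspace(0, 30, 30 * 60)
dancer_a = np.sin(2 * np.pi * 1.5 * t) + 0.2 * np.random.randn(len(t))
dancer_b = np.sin(2 * np.pi * 1.5 * t + 0.4) + 0.2 * np.random.randn(len(t))
print(np.nanmean(running_correlation(dancer_a, dancer_b)))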


Chris R: I’ll make myself available after 5:30 or on the weekends too, to fit Julie, Mike and Varsha’s availability.

Xin Wei

________________________________________________________________________________________
Sha Xin Wei • Professor and Director • School of Arts, Media and Engineering + Synthesis
Herberger Institute for Design and the Arts + Fulton Schools of Engineering • ASU
skype: shaxinwei • mobile: +1-650-815-9962
Founding Director, Topological Media Lab •  topologicalmedialab.net/
_________________________________________________________________________________________________



On Nov 7, 2014, at 3:29 PM, Julie Akerly <j.akerly@live.com> wrote:

Yes, I would really like to be a part of this, but things during the weekdays do not work for me.
Can you schedule some workshop times that are after 5:30 or on the weekends?

Julie Akerly
Co-Director of NueBox
Artistic Director of J.A.M.

The dancer's body is simply the luminous manifestation of the soul.
~Isadora Duncan




Date: Fri, 7 Nov 2014 15:12:52 -0700
Subject: Meetings with Xin Wei during Rhythm and Lighting Workshop
From: mkrzyzan@asu.edu
To: cmrober2@asu.edu; j.akerly@live.com; varshaiyengar@hotmail.com; shaxinwei@gmail.com

Chris,

Xin Wei wanted me to ask you if we can schedule time together in iStage during the Rhythm and Lighting workshop to work on dancer correlation. He suggested the Tuesday and Thursday afternoons (Nov 18, 20, 25, 27), and possibly on the afternoon of the 24th when Michael Montanaro is here.

Julie and Varsha, what are your schedules like at these times? Julie, I know you work during the day, but we can work around your schedules, both of you...

Mike 

-- 
Sine coffa, vita nihil est!!!

Re: Door Portals : prototypes delivered by Nov 15, bomb tested by Nov 21, and deployed by Nov 24 for WIP event

To be clear:
what happens inside the frame of the Door in terms of visual effect, beyond the DEFAULT behaviours

(0) Mirror
(1) Live streams: Stauffer Reception, iStage, Brickyard Commons

is really up to the ingenuity of the realtime video artists, starting with Prof. David Tinapple.
David can curate other realtime video “instruments” for Dec 2 or beyond.
All we need for Dec 2 is the realtime feeds with a blend between them.

Anything beyond that, such as David’s clever “gaze” tracking, is icing on the cake.  That would be fantastic.

David (or Ozzie or Garrett):
Can you look at Ozzie’s code for the Stauffer display and tweak it to serve as a robust SHELL program?
We need a good shell program that will switch among video instruments on a bang
(e.g. from a handclap, so we should put in audio input with calibration).  The video instruments will
be other students’ works.
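As a sketch of what that shell logic might look like, in Python rather than Max, with placeholder instruments and an uncalibrated clap threshold (none of this is Ozzie's code):

# SHELL sketch: cycle among video "instruments" whenever a bang (clap) arrives.
import itertools, random, time

def mirror(frame):         return frame            # placeholder instruments;
def stauffer_feed(frame):  return frame            # students' patches would go here
def brickyard_feed(frame): return frame

instruments = itertools.cycle([mirror, stauffer_feed, brickyard_feed])
current = next(instruments)
CLAP_THRESHOLD = 0.8                               # would be set during audio calibration

def mic_level():
    return random.random()                         # stand-in for the real mic RMS level

while True:
    if mic_level() > CLAP_THRESHOLD:               # the "bang": switch instruments
        current = next(instruments)
        print("switched to", current.__name__)
        time.sleep(0.5)                            # debounce so one clap = one switch
    time.sleep(0.01)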

Xin Wei

________________________________________________________________________________________
Sha Xin Wei • Professor and Director • School of Arts, Media and Engineering + Synthesis
Herberger Institute for Design and the Arts + Fulton Schools of Engineering • ASU
skype: shaxinwei • mobile: +1-650-815-9962
Founding Director, Topological Media Lab •  topologicalmedialab.net/
_________________________________________________________________________________________________



On Nov 1, 2014, at 10:44 AM, Xin Wei Sha <Xinwei.Sha@asu.edu> wrote:

Dear Pete, David, Nicole, (technical advisor Ozzie)

I would like Pete to be in charge of installations for WIP Dec 2, following my event design. (For the dramaturgy I’ll of course rely on experienced folks — Chris Z as well as Pete, Garth, Todd, Ed, David.)

For Dec 2, I would like monitors mounted in portrait on some walls in Stauffer and iStage: hence the name DOOR PORTALS, DoorPorts

BEHAVIOR

Each single-sided Door Portal has a camera facing out from its front.

In iStage: when you stand 20’ away from the Door Portal you see a live feed of the receptionist desk at Stauffer.  When there is no one present, the Door Portal should show past people, faintly.

As you occlude the Door Portal (background subtract, use tml.jit.Rokeby), it turns into a mirror.
As you walk toward this Door Portal, at 15' the mirror fades to the live feed from another Door Portal B.
At 10’ that live feed fades to the live feed from Door Portal C.
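A small sketch of that distance logic, using the 20'/15'/10' breakpoints above; the linear crossfades in between (and the weights themselves) are an assumption, not a spec:

# Map viewer distance (feet) to blend weights among the portal's layers.
def portal_blend(distance_ft, occluding):
    if not occluding or distance_ft >= 20:
        # far away / nobody occluding: remote live feed (or faint "past people")
        return {"stauffer": 1.0, "mirror": 0.0, "feed_B": 0.0, "feed_C": 0.0}
    if distance_ft >= 15:                          # occluding, 20'-15': mirror
        return {"stauffer": 0.0, "mirror": 1.0, "feed_B": 0.0, "feed_C": 0.0}
    if distance_ft >= 10:                          # 15'-10': mirror fades to feed B
        b = (15 - distance_ft) / 5.0
        return {"stauffer": 0.0, "mirror": 1 - b, "feed_B": b, "feed_C": 0.0}
    c = min(1.0, (10 - distance_ft) / 10.0)        # inside 10': feed B fades to feed C
    return {"stauffer": 0.0, "mirror": 0.0, "feed_B": 1 - c, "feed_C": c}

for d in (25, 18, 12, 6, 1):
    print(d, portal_blend(d, occluding=True))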


LOCATIONS, SIZES

(1) In iStage in the location that we discussed — next to the door to offices, flush with the floor.  Largest size.

(2) On a wall in Stauffer B, facing the reception desk (in addition to, or instead of, the display behind the receptionist’s head).
Cables must be hidden — we cannot have exposed cable in the receptionist area.  Let the Space committee take note.

(3) Mounted on a wall in the dark Commons of the BY.

All these should be PORTRAIT — no landscape, please — let’s avoid knee-jerk screenic response!   Bordered images are boring.

NICOLE, your idea for prototyping is very smart.  If you want, please feel free to make paper mockups — just
ask Ozzie or Pete for dimensions of our monitors, cut paper to 1:1 scale, and bring some mock doors that we can pin to the walls in the BY and iStage.

Re. scale, I’m ok with the doors not being human height — 1:1 body size is too literal (= boring).  Plus I want child-sized apparatus in our office spaces.  Our office spaces are all designed for 5-6’ tall people, nominally adults — this is socially sterile.

NOTE: I think we do NOT need a big screen in the foyer — it is too expensive and a waste of a monitor (unless CSI buys one :)
For that Brickyard foyer, if people — e.g. Chris R — want a porthole (≠ portal), then let’s put a pair of iPads, one on each side of that foyer sub-wall, and stream live video between them.


TIMETABLE (ideal)

Nov 15
prototypes delivered

Nov 21
bomb tested, 

Nov 24
Deployed for WIP event
(If that is documentation day for the LRR, then this is when we should invite Sarah H to send her team to shoot the Door Portal along with the best of the LRR for Herberger (gratis!))


________________________________________________________________________________________
Sha Xin Wei • Professor and Director • School of Arts, Media and Engineering + Synthesis
Herberger Institute for Design and the Arts + Fulton Schools of Engineering • ASU
skype: shaxinwei • mobile: +1-650-815-9962
Founding Director, Topological Media Lab •  topologicalmedialab.net/
_________________________________________________________________________________________________


Matt Briggs: lighting instrument experiments in everyday space (i.e. Brickyard)



I may have found a way to incorporate Byron's mechatronics to utilize natural light.
I have many laser cut "icicles" of dichromatic acrylic (picture attached).

These "icicles" do a great job of capturing light. I could hang many of these from a panel 
which either may be recessed in place of a ceiling panel or hung from a ceiling
panel. This panel could be made of a flexible material that already exists:
( )
or out of a material that is laser cut to become flexible. The panel could be animated from
a number of control points using Byron's mechatronics. This would create an effect similar to the
combination of:
+
= potential light installation
We then could create an array of panels or clusters of panels which could take in rhythm data
via xOSC and output them into movement of the "icicles" positions in space. 
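As a rough illustration of that last step (rhythm data arriving over OSC and driving a control point), here is a minimal Python sketch; the /rhythm/intensity address, the port, and the set_servo() stub are made up for the example, not an existing patch:

# Receive rhythm data via OSC and map it to a panel control-point position,
# roughly what an x-OSC board driving Byron's mechatronics would do.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def set_servo(angle_deg):
    print(f"move control point to {angle_deg:.1f} deg")   # placeholder actuator

def on_rhythm(address, intensity):
    # intensity 0..1 maps to a gentle sweep of the panel's control point
    intensity = max(0.0, min(1.0, intensity))
    set_servo(10 + 70 * intensity)

dispatcher = Dispatcher()
dispatcher.map("/rhythm/intensity", on_rhythm)
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()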

It is important to have a certain amount of light in the space to create a visual equilibrium and
relieve tension in the eyes. These "icicles" will not provide a significant introduction of light
into the space. However, they will refract the existing light in a dynamic and non-
intrusive way. If this light is paired with Josh's light tube system or the already existing natural light,
I believe it could create a small amplification of that light but, more importantly, a psychological effect
that may have equal weight on the users of the space. I think it is important to have subtle but
aesthetically pleasing sources of light in the workspace, and this may function in that regard.

Any thoughts or criticisms?

I could fabricate these panels and install a small cluster by the workshop date.
Byron do you think it would be possible to animate them by this time?
I will leave the bag of "icicles" on your desk at synthesis for reference.

Side note: I leave Monday for the AR2U conference in Ames, Iowa.
So I won't be able to continue on this project until I get back on the 10th.

A follow up,

I just spoke with Byron and hypothesized some potential applications of the "icicles."

In the application above we could create movement based on the refraction of the
natural light or interject movement based on applied rhythm.

We also spoke of using this light refraction as a feedback loop, as a form of rhythm.
This could be applied to movement in forms beyond the panels, using
the natural light as a parameter.

A suggested form could be a desk lamp or a corner extension. The "icicles" 
could be stacked from top to bottom and rotated from the base to create a vertical
or horizontal deflection. The "icicles" could also be installed in a flexible panel in
a more rigid application to give direct control over their movement.

We discussed the fact that the application of the "icicles" needs to be architectural
rather than sculptural. I think this can be achieved in all of these applications
but would like some input on them. Unfortunately I cannot find
any examples of the other applications, but I can create sketches if it is unclear.

Byron also assured me that the mechatronics would work in these scenarios.

PS (I've attached the images in this email)

thanks,

-matthew briggs
 undergraduate researcher @synthesiscenter

lighting instrument experiments in everyday space (i.e. Brickyard)

I’d like to (re)introduce you and Omar Faleh, who is coming from the TML for the LRR in November.

In parallel to the iStage work, we should push forward on the animation of lighting in the Brickyard by
• Bringing natural sunlight into the BY commons (Josh)
• Modulating that natural light using Byron’s mechatronics
• Making motorized desk lamps using Byron’s mechatronics
• Designing rhythms (ideas from Omar, using Julian’s rhythm kit)

Can we see what you have come up with this coming week?

Meanwhile let me encourage you to communicate with Omar to see what you can create by the time of his arrival circa Nov 17.

To be clear, this is distinct from the LRR in the iStage, but just as strategically important, in that it is the extension of that work into everyday space, which is one of Synthesis’ main research directions this year.

I’d like to write this work into a straw proposal for external sponsorship of your work… so it’d be great to have some pix soon!

Cheers, Xin Wei

LIGHT-ECOLOGY GAME: Chiaroscuro instrument

I know that Garrett, Omar, or any of the cc’d Max-perts can do this, but

Ian and Michael, I propose that you do this in Max as an exercise,
consulting with Pete, Garrett and Julian;
document it, and enter it into our GitHub so we can play with it.

Omar or Garrett can write and share the code implementing the logic described below, extracting “rhythm" from the movement detected from a fiducial.  (Ask Qiao if that is possible.)

We’ve discussed the lighting hardware instruments. 
Let’s now propose some behaviours AKA instruments and games:

LIGHT-ECOLOGY GAME: Chiaroscuro instrument

The array of overhead lights is kept at a medium intensity, flickering with a pattern derived by downsampling a video of glowing embers in closeup.  [Julian’s code should have a patch to downsample video to DMX, but it takes minutes to write this in Max/Jitter.]
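A minimal sketch of that downsampling step in Python/numpy rather than Jitter, assuming a greyscale frame and a small grid of overhead lamps; the grid size and the medium-intensity band are illustrative:

# Downsample one video frame (e.g. glowing embers) to one DMX value per lamp
# by averaging brightness over a coarse grid.
import numpy as np

def frame_to_dmx(frame, rows=2, cols=4, lo=60, hi=160):
    """frame: 2-D greyscale array, 0-255.  Returns rows*cols DMX values."""
    h, w = frame.shape
    cells = frame[: h - h % rows, : w - w % cols]          # trim so the grid divides evenly
    cells = cells.reshape(rows, h // rows, cols, w // cols).mean(axis=(1, 3))
    # scale mean brightness into a medium-intensity flicker band
    return np.clip(lo + (hi - lo) * cells / 255.0, 0, 255).astype(int).ravel()

frame = np.random.rand(240, 320) * 255    # stand-in for one frame of the ember video
print(frame_to_dmx(frame))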

People who walk onto the floor pick up a trackable fiducial. Depending on which one they pick up, they become a Lightener or a Darkener.  [MoCap in Stauffer B123 supports this — I don't know about iStage.  How many fiducials can be tracked at a time, at most?]

Their presence brightens or darkens the lamp under which they stand — adding a +/- offset to the lamp value.

Waving the trackable magnifies their effect — multiply the offset by a ratio that is a function of the averaged speed.  Calibrate so that it does not take much effort, and make the slide up very short but the slide down many samples — so the decay happens over, say, 60-300 sec.  Maybe the onset of movement should increment (or decrement) immediately, while continued movement extends the relaxation back to the default average level.
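And a sketch of that offset behaviour for one lamp: immediate onset, waving-scaled magnitude, and a slow 60-300 sec relaxation back to the flicker baseline. Everything other than those time constants is illustrative:

# Chiaroscuro offset for one lamp: Lightener adds, Darkener subtracts; waving
# magnifies the offset; attack is immediate, relaxation back to baseline is slow.
class LampOffset:
    def __init__(self, sign=+1, decay_sec=120.0, rate_hz=30.0):
        self.sign = sign                                 # +1 Lightener, -1 Darkener
        self.offset = 0.0
        self.relax = 1.0 - 1.0 / (decay_sec * rate_hz)   # per-sample relaxation factor

    def update(self, present, avg_speed, base_offset=40.0):
        if present:
            target = self.sign * base_offset * (1.0 + avg_speed)   # waving magnifies
            if abs(target) > abs(self.offset):
                self.offset = target                     # very short slide up
        self.offset *= self.relax                        # long slide down toward 0
        return self.offset                               # add this to the ember-flicker value

lamp = LampOffset(sign=+1)
for step in range(5):
    print(round(lamp.update(present=(step < 2), avg_speed=0.5), 2))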

Re: [Synthesis] lighting instrument experiments in everyday space (i.e. Brickyard)

Hi everyone,

I spent a few minutes playing (!) with the dichromatic shafts that Matt created. These have a lot of interesting potential. I attached a few photos and a couple of very rough videos (one-handed/no-handed capture on an old phone) just to give a few hints at this potential. It goes without saying that these perceptual experiences are far richer in person. The static images show the effects on the wall of light reflected off of a random pile of these pieces. This static configuration produces a projection that changes shape and color as the sun moves across the sky. While this is cool, it is certainly nothing new. The videos show a couple of quick experiments moving the shafts. Rotating them, one can get shafts of light that move like clock hands. Imagine a large cloud of these on a wall, all rotating at their own frequencies (all of which could be variable). Most interesting of all (at least to me) are the effects created by bending the shafts. This allows the light to be focused and spread in organic shapes. With several of these simultaneously bent (not well documented), the effect is quite striking since they all bend slightly differently and distort in a broadly cohesive, but individually independent way. This process could produce the "shafts of light through trees" effect that Xin Wei suggested. I would be very interested in seeing how this works with the light guide that Josh is designing.

Cheers,
Byron



On Thu, Oct 30, 2014 at 1:41 AM, Xin Wei Sha <Xinwei.Sha@asu.edu> wrote:
Following Matt and Byron --

How about 

(0)  Shafts of sunlight beaming through the space as if through a forest canopy — how can we focus diffuse sunlight from Josh’s mylar tunnels?

(1)  Accordioned paper lanterns, floor-standing, the size of a body?
Motorized to turn their heads slowly, tracking the sun like a sunflower, but spilling light…

(2) A layer of flat clouds of round curved “chips” about 5-10 cm wide, made of frosted diffuser material — or even better cut into intricate lace?
Each cloud could be pretty compact — no more than say 1m wide, so each array would be about 100 pieces at the very most.
Each piece could be suspended on a string.   Some strings are on motors.


Diverse images, none of which capture what I’m imagining.






Lenka Novakova

Suspended 1m wide cast glass disks into which she projected video of rivers







