getting rhythms into Ozone OSC via Julian's rhythm kit

I’d say the exercise would be to plug M Bateman’s drum kit — or any MIDI drum pad that AME may have in its DC pile — into a Mac set aside for rhythm work, as Garrett suggested.

M Bateman: Run my scratch patch on the Mac that Ozzie (or Pete) assigns for rhythm work in iStage. Tweak my dummy code to send the MIDI to some of Julian’s lighting control patch’s parameters. You can ask Pete or any of the ultra-friendly AME / SC media artist-researchers for tips. Or one of us (Pete or Garrett or ...) can do that for you. Mapping MIDI note (or channel) to DMX channel and the scaled velocity to light intensity would be a good and very very easy start. (If the hardware is plugged into the Ozone lighting control, it should be a 5-10 minute exercise.)
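(For anyone who wants to see the logic outside Max first, here is a minimal Python sketch of that mapping, using mido for MIDI input and python-osc to send to the lighting control. The OSC address /ozone/dmx and the host/port are made-up placeholders, not Ozone’s real ones; check Julian’s patch for the actual parameters.)

# Minimal sketch: MIDI drum hits -> DMX-style light messages over OSC.
# Assumes: pip install mido python-rtmidi python-osc
import mido
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7000)    # placeholder host/port for the lighting control

with mido.open_input() as port:                # default MIDI input, i.e. the drum kit
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            dmx_channel = msg.note % 512                  # MIDI note -> DMX channel (toy mapping)
            intensity = int(msg.velocity / 127 * 255)     # scale velocity 0-127 to intensity 0-255
            client.send_message("/ozone/dmx", [dmx_channel, intensity])

In Max it is the same three moves: [notein] gives you pitch, velocity, channel; scale the velocity; route the result into Julian’s lighting control parameters.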


Then we can talk —

I’d take you through some simple ways to take a sequence and play it back —
with delay, reverse, echo, loop, etc. Hint: [ zl group ] and some other [ zl ] objects for Max’s standard list manipulation, together with Max’s delay object, etc., could be a start ….
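For the non-Max folks, here is a hedged Python sketch of the same list logic (a “sequence” is just a list of (time, value) events; this is only an illustration of the idea, not Ozone code):

def delay(seq, dt):                      # shift every event later by dt seconds
    return [(t + dt, v) for t, v in seq]

def reverse(seq):                        # play the events back-to-front
    end = max(t for t, _ in seq)
    return sorted((end - t, v) for t, v in seq)

def echo(seq, dt, decay=0.5):            # original plus a quieter copy dt later
    return sorted(seq + [(t + dt, v * decay) for t, v in seq])

def loop(seq, period, n):                # repeat the sequence n times, period seconds apart
    return [(t + i * period, v) for i in range(n) for t, v in seq]

hits = [(0.0, 1.0), (0.5, 0.8), (0.75, 0.6)]     # toy drum hits: (seconds, intensity)
print(loop(echo(hits, 0.25), period=1.0, n=2))

[ zl rev ], [ zl group ] and friends do the same kind of list shuffling inside Max.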

But I’m sure Julian’s got many many such logics already — so it may be most efficient to simply schedule a G+ (with screen share) for Bateman, Julian, and one local Max artist like Garrett or you.


Again, the key is simply to do the easy step of connecting Bateman’s drum kit to the iStage lights. Everything else will follow in small, easy steps, with other folks doing the other mappings to sound, video, etc. with existing instruments.

So we would have five ways of getting rhythms into the iStage network (Ozone):

1) video cam feed,
2) mic feed,
3) drums (Bateman’s MIDI drum kit),
4) accelerometers from Chris Z’s iPods, maybe?
5) accelerometers from Mike K’s kit


Omar, can you and Julian make sure you can use everything in Julian’s latest shiny kit, so you can bring Julian’s refinements to Phoenix?

I’d like to dedicate Monday morning to plugging cables and code together, to confirm that these different ways of getting rhythm into the iStage Ozone network can all be mapped to the lights.

Then try it out briefly with Mike K and dancers at 5:30, after Chris Z is finished.

This is just a test parallel to the Lighting Ecology: Darkener and Lighteners game. 

BTW, TouchDesigner would get in the way of getting rhythm exploration going.  And we don’t need it since we’re working collectively with other talents at AME + Synthesis + TML.

Onward!
Xin Wei


On Nov 9, 2014, at 6:33 PM, Peter Weisman <peter.weisman@asu.edu> wrote:

Hi All,
 
I will need a little guidance for this Thursday. I think I know how, but I am open to any suggestions.
 
I will make arrangements to be available next Monday at 5:30pm.
 
I will be hanging the lights as soon as they get in this week. I will try to make myself available as much as I can.
 
Pete
AME Technical Director
(480)965-9041(O)
 
From: Xin Wei Sha 
Sent: Sunday, November 09, 2014 8:19 AM
To: Michael Krzyzaniak; Julie Akerly; Jessica Rajko; Varsha Iyengar (Student); Ian Shelanskey (Student); Michael Bateman; Peter Weisman; Garrett Johnson
Cc: synthesis-operations@googlegroups.com; Julian Stein; Assegid Kidane; Sylvia Arce
Subject: Adrian Freed: fuelling imagination for inventing entrainments
 
Garrett, Mike K, Julie, Jessica, Varsha, Ian, Bateman, Pete, 
 
Can we schedule time to create and try out some entrainment exercises during the LRR Nov 17 -26?
 
How about we make our first working session as a group on
 
Monday Nov 17 5:30 in the iStage.
 
I’ll work with people in smaller groups.
 
Tuesday afternoon Nov 12, I’ll putter around Brickyard,
or by appointment in the iStage.
 
Ian, Mike Bateman, Pete: let’s run through the lighting controls as Julian gives them to us.
Let’s look at Julian’s Jitter interface together.
I asked Omar to prep a video-to-video mapper patch that he, you and I can
modify during the workshop to implement different rhythm-mappings.
 
Pete can map Bateman's percussion midi into the lights (it’s a snap via our Max :)
Thursday 11/14, at 3:30  in iStage ?
 
Garrett, Mike K: can you work with Pete (or Ozzie re. video) to map the mic and video inputs in iStage into Julian's rhythm kit:
percussion midi
video camera on a tripod,
overhead video,
 
1 or more audio microphones on tripods,
wireless lapel microphones. 
Julian has code that maps video into rhythm.
Can we try that out Wed 11/12 or Thurs 11/13 in the iStage?
 
 
Let’s find rich language about rhythm, entrainment, temporality, and processuality in movement and sound, relevant to entrainment and co-ordinated rhythm, as the Lighting and Rhythm workshop looms.
 
On Nov 8, 2014, at 11:54 AM, Xin Wei Sha <Xinwei.Sha@asu.edu> wrote:
I would like to enrich our imagination before we lock too much into talking about “synchronization” 
and regular periodic clocks in the Lighting and Rhythm workshop.
 
 
On Nov 8, 2014, at 9:41 PM, Adrian Freed <Adrian.Freed@asu.edu> wrote: