RE: network of motors for "Theater of Things" project

From what I know so far, I am thinking of using Teensy/XBee-based hardware, where data processing can be partially distributed to the arm-control objects and wireless connectivity will be accomplished using the ZigBee protocol. This way, if and when the project grows to add more arms and the programming involves more intricate movements, we will not have to be concerned about bandwidth issues at the coordinator module. Instead of sending large, detailed direct commands from the PC, we can send short codes that the individual objects will interpret. Julian, what OSC control format would you prefer to use?
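To make the short-code idea concrete, here is a minimal sketch of one possible compact command scheme, written in Python for illustration. The 4-byte packet layout, opcode table, and field widths are all assumptions for discussion, not a settled protocol; the firmware on each Teensy would implement the equivalent of decode() below.

```python
# Hypothetical short-code scheme for the Teensy/XBee network described above:
# instead of streaming verbose per-step commands from the PC, the coordinator
# relays a fixed 4-byte packet that each arm object expands locally.
# The opcode table and field widths are illustrative assumptions.
import struct

OPCODES = {"goto": 0x01, "speed": 0x02, "behavior": 0x03}  # assumed opcodes

def encode(motor_id: int, op: str, arg: int) -> bytes:
    """Pack motor id, opcode, and a 16-bit argument into 4 bytes."""
    return struct.pack(">BBH", motor_id, OPCODES[op], arg)

def decode(packet: bytes) -> tuple:
    """Inverse of encode(); what each object's firmware would do on receipt."""
    return struct.unpack(">BBH", packet)

# Example: tell motor 3 to move to position 512.
pkt = encode(3, "goto", 512)
assert decode(pkt) == (3, 0x01, 512)
print(pkt.hex())  # -> 03010200
```

The point is only that a few fixed bytes per command keep traffic through the coordinator bounded, however intricate the movement each object interprets from them.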

Assegid Kidané
Engineer
School of Arts, Media and Engineering
Herberger Institute for Design and the Arts | Fulton Schools of Engineering
Arizona State University
Box 875802
Tempe, AZ 85287-5802
V 480.965.6127 M 480.306.2686
F 480.965.4345
assegid@asu.edu
http://ame.asu.edu


From: Xin Wei Sha
Sent: Thursday, September 17, 2015 7:33 PM
To: Assegid Kidane
Cc: Byron Lahey; Christopher Roberts; Julian Stein; Christopher Zlaket; post@synthesis.posthaven.com
Subject: network of motors for "Theater of Things" project

Hi Ozzie,

OK to follow up on our chat today.

It turns out that Zlaket’s Synthesis research time is already dedicated to working with Chris Ziegler on Ziegler's motorized lights for forest3. You may already have been consulted on that :)

However, I think we must not wait for that project to finish before getting a move on the “Place and Atmosphere: Space Design: Theater of Things” research project.

We can start with a provisional mechatronic foundation for the robotic animation of small lightweight objects like pieces of cardboard, ribbon, flowers, mirrors. This would extend the Ozone platform to mechatronics that can be animated using logics from the media processing.

Would you be interested and available to do the first step: develop a small test network of 3 small motors that can be addressed from the Ozone system using an OSC protocol that Julian can decide (in consultation with you and Todd)? I think it would be smart for us to consult with Byron because I know he’s already thinking about streamlining his hardware.
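As one concrete starting point for that decision, here is a minimal sketch of what addressing the three motors over OSC could look like, using the python-osc package. The /motor/<id>/... namespace, host, and port are placeholder assumptions; the actual format is Julian's call.

```python
# A possible OSC namespace for the three-motor test network; the addresses,
# host, and port below are placeholders pending Julian's actual format.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 9000)  # assumed gateway host/port

# Target one motor: a normalized speed and a position in degrees.
client.send_message("/motor/1/speed", 0.5)
client.send_message("/motor/1/position", 90.0)

# Or trigger a named behavior on all motors at once.
client.send_message("/motor/all/behavior", "delay")
```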

Later we can discuss good “default” or initial behaviors, some of which Byron already implemented two summers ago: imitation, delay, antisymmetry, repeat, reverse, etc.
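For discussion, those behaviors can be read as simple transformations of a sampled position stream from a leader motor. The sketch below is an illustrative guess at what “delay” and “antisymmetry” could mean in code; it is not a reconstruction of Byron's implementation.

```python
# Two of the candidate default behaviors as stream transformations.
# Illustrative only; names and the zero-padding convention are assumptions.
from collections import deque

def delayed(stream, n):
    """Follow the leader n samples late (imitation with delay)."""
    buf = deque([0.0] * n)  # zero-padded history
    for x in stream:
        buf.append(x)
        yield buf.popleft()

def antisymmetric(stream, center=0.5):
    """Mirror the leader's motion about a center position."""
    for x in stream:
        yield 2 * center - x

leader = [0.0, 0.2, 0.5, 0.9, 0.5, 0.2]
print(list(delayed(iter(leader), 2)))     # [0.0, 0.0, 0.0, 0.2, 0.5, 0.9]
print(list(antisymmetric(iter(leader))))  # ≈ [1.0, 0.8, 0.5, 0.1, 0.5, 0.8]
```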

I’m imagining this should not take very much of your time, but it is of foundational research importance, as a new element of our responsive environment toolkit.
Of course, this assumes that you have the time to take this on before the establishment of the new electronics labs downstairs in the former TV Studios.

We can touch base on the first goals maybe with Byron before the AME General Friday or at a Tea...
Xin Wei


________________________________________________________________________________________
Sha Xin Wei • Professor and Director • School of Arts, Media and Engineering + Synthesis
Herberger Institute for Design and the Arts + Fulton Schools of Engineering • ASU
Fellow: ASU-Santa Fe Center for Complex Biosocial Systems
Affiliate Professor: Future of Innovation in Society; Computer Science; English
Founding Director, Topological Media Lab
skype: shaxinwei • mobile: +1-650-815-9962
_________________________________________________________________________________________________

Re: network of motors for "Theater of Things" project

I'm happy to discuss this tomorrow. I share Xin Wei's enthusiasm for seeing physical motion as a fundamental component of our systems. The main thing we need to do is decide what features to prioritize; then we can select hardware for prototypes.

Cheers,
Byron



Email from Chris R.

Just to fill you in on Pete’s and my discussion regarding the physical setup.

We believe that, given the spatial and material constraints of the room and our system, we can set up two immersive/interactive projection surfaces within the truss, and one more large projection outside of it. This will require buying ~$300 worth of cloth to project onto.

Our thinking is that we will have to project onto the floor and one wall of the truss, with images that can fill the whole “screen”. We also think we can mount a third projector, showing data and code, projected onto a wall outside of the truss system. This is achievable with the technology we currently have, and it will keep us within the spatial limitations of the room we were assigned.

Does this sound alright?

Pete, I realized when I was walking to the car that the lounge has an IR Dragonfly, with an emitter (and, I thought, a fish-eye lens) that we were using for ToC. Do you think they are still up there? Are they being used?

Chris

Kristi, please send to Posthaven.


Christopher M Roberts
Assistant Research Professor
School of Arts, Media + Engineering
Arizona State University

Plans for the upcoming complex systems conference

First off, thank you very much for coming out today and helping us make plans for the upcoming complex systems conference.  

As XW has mentioned, given the tight timeline we need to work in parallel with each other, not on top of each other. So I am sketching out a basic action plan to keep things running smoothly.

Can we please follow this plan of action for the next few days:

  1. I would like to divide the group up into three teams: 1) Code (XW (lead), Josh, Connor, Julian); 2) Physical (Pete (lead), Megan, Zlaket); 3) Concept (ChrisR (lead), Cooper, XW).
  2. Team Code. Can you start implementing your discussion today on phase changes etc. as soon as is convenient. Please use the full iStage system to start your implementation and do not wait for the truss to be completed. I wasn’t there for most of this, but I am hoping that this will involve working on instruments that will be interactive and projected on the floor and one “wall” of the truss. In the iStage this means using the floor and the scrim left from SERRA. We will optimize later. I believe we agreed on doing a demo on Monday, so let’s make that the target date for showing me and XW what you have accomplished.
  3. Team Physical. Design, implement, and fine-tune the truss system to the best of your ability with the resources we have at hand. At minimum this means installing a floor, speakers, projectors, an IR emitter, cameras, lights, and (I selfishly hope) a microphone into the truss system, along with a series of cables, etc. Pete, let’s you and I touch base about equipment sometime in the morning to make sure we have what we need.
  4. Team Conceptual. Cooper, let’s you and I talk about unifying the whole experience, come up with some ideas, and present them to XW for approval so that he can focus on helping Team Code. Our goal will be to put compelling aesthetics into the materials Team Code creates, which will be displayed on Team Physical’s apparatus. This means I will need you to come to the Monday demo (which is good with your schedule, right?), and if you could be present at 3:30 tomorrow to shadow us in a second meeting, that would be nice. Touch base with me tomorrow for more details and context.
Let me know if you have any questions or concerns. 

Thanks again

Chris

Christopher M Roberts
Assistant Research Professor
School of Arts, Media + Engineering
Arizona State University