OSC vs Web-RTC reflector in AWS cloud

A couple of things about OSC and Web-RTC.  This may seem like a purely technical consideration, but the lived experience of communities of creators, researchers, and performers weighs in here…

(1)  For realtime responsive media in live events, we work under the assumption that our processes handle live gesture & activity as adequately dense streams of data. We want the lowest possible latency and stream rates as high as emitted; we don't care about dropped frames or irregular timing, and we let downstream processes, including the human performer, deal with it. This works well when there is adequately high density (frames of data per unit time) and live human-in-the-loop action.  (… thanks to Joel Ryan, using wireless sensor platforms for live performance between Vienna and Brussels.)  (Yes, re. Vangelis, we can lean on John Maccallum's experience :)
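To make (1) concrete, here is a minimal sketch of that fire-and-forget posture, assuming the python-osc package; the reflector hostname, port, OSC address, and the simulated sensor are placeholders, not our actual rig.

```python
# Minimal sketch: forward live frames over OSC/UDP at whatever rate they are
# emitted: no buffering, no retransmission, no global clock.
# Assumes the python-osc package; host, port, and the simulated sensor are placeholders.
import math
import time

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("reflector.example.net", 9000)  # hypothetical cloud reflector

def read_sensor_frame():
    # Stand-in for a wireless sensor platform: emit a smoothly varying triple.
    t = time.time()
    return math.sin(t), math.cos(t), math.sin(0.5 * t)

while True:
    x, y, z = read_sensor_frame()
    # Fire and forget: if the datagram is lost or arrives late, downstream
    # processes (and the performer) absorb the variation.
    client.send_message("/gesture/accel", [x, y, z])
    time.sleep(0.005)  # ~200 frames/s, standing in for the sensor's native rate
```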

The benefit of UDP is that packets can be dropped, skipped, or left undelivered, since the data is not "mandatory": for things like picture frames or pieces of sound, a loss only results in reduced quality, which is "expected" behaviour in the streaming world. WebSockets will ensure that data is delivered, and the "price" for that is machinery which, in the end, slows down the queue of packets in order to guarantee ordered, complete delivery.
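On the receiving end the same attitude is equally simple: use whatever arrives, keep the latest frame, and never stall waiting for a missing one. A sketch, again assuming the python-osc package; the port and OSC address are illustrative.

```python
# Minimal sketch of a drop-tolerant receiver: whatever arrives is used, whatever
# is lost is simply skipped, and nothing waits for retransmission or reordering.
# Assumes the python-osc package; port and OSC address are illustrative.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

latest_frame = None  # downstream processes read the freshest value, not a queue

def on_accel(address, *args):
    global latest_frame
    latest_frame = args  # latest frame wins; late or missing frames are ignored

dispatcher = Dispatcher()
dispatcher.map("/gesture/accel", on_accel)

# A WebSocket (TCP) channel would instead hold back newer frames until every
# older one is delivered in order: exactly the head-of-line cost noted above.
server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```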

A general principle for live responsive media: let the human deal with variation in the streaming media flow as part of the materiality of the medium (in this case, network transport dynamics) rather than hide that materiality under homogeneous behavior at the cost of slower processing, global latencies, or syntactic complexity.

(2) Syntactic complexity
For backward compatibility with a large body of code and coding practice, it is very important not to require application programmers to insert OSC-to-WebSocket glue code into legacy frameworks. Technicalities or extra idioms whose adoption may seem trivial to an expert programmer can be an insurmountable barrier to creative coding in communities of researchers and artists who just want to get on with shaping the event.
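To illustrate the kind of glue I mean: a legacy OSC patch sends a message in one line, while wrapping the same message for a WebSocket transport drags in an async client, byte framing, and connection lifecycle. A rough sketch, assuming the python-osc and websockets packages; hosts and URLs are made up.

```python
# What legacy OSC code already does: one line, transport handled by the library.
# Versus the glue needed once the same message must ride a WebSocket.
# Assumes the python-osc and websockets packages; hosts and URLs are illustrative.
import asyncio

import websockets
from pythonosc.osc_message_builder import OscMessageBuilder
from pythonosc.udp_client import SimpleUDPClient

# Legacy path: this is the whole program.
SimpleUDPClient("reflector.example.net", 9000).send_message("/gesture/accel", [0.1, 0.2, 0.3])

# WebSocket path: event loop, byte framing, connection lifecycle, and a server
# that unwraps the bytes again: the "glue" a legacy patch would have to grow.
async def send_over_websocket():
    builder = OscMessageBuilder(address="/gesture/accel")
    for value in (0.1, 0.2, 0.3):
        builder.add_arg(value)
    packet = builder.build().dgram  # raw OSC bytes to tunnel through the socket

    async with websockets.connect("wss://reflector.example.net/osc") as ws:
        await ws.send(packet)

asyncio.run(send_over_websocket())
```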

(3)
As much as possible, I'd like to talk to my cloud services independently of the web (httpd). Our apps should talk over the internet without going through any browser. OSC was invented for streaming sensor data between apps and was designed to be transport-agnostic. And all the work we've been doing with responsive media for live events is aimed at avoiding the chunky, clunky document paradigms introduced by HTML / httpd (from the era of file-based OSes familiar to CERN engineers :).
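For concreteness, here is roughly what such a reflector amounts to on a cloud host: a plain UDP socket that fans incoming OSC datagrams out to the other peers, with no HTTP and no browser in the loop. A minimal sketch using only the Python standard library; the port and the register-by-sending scheme are assumptions, not a spec.

```python
# Minimal sketch of an OSC reflector on a cloud host: receive UDP datagrams and
# fan them out to every other peer that has sent us something. Pure stdlib,
# no HTTP, no browser. The port and peer-registration scheme are assumptions.
import socket

LISTEN_ADDR = ("0.0.0.0", 9000)  # illustrative port; open it in the security group

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(LISTEN_ADDR)

peers = set()  # any address that sends us a datagram becomes a peer

while True:
    data, sender = sock.recvfrom(4096)  # one OSC packet per datagram
    peers.add(sender)
    for peer in peers:
        if peer != sender:
            # Best effort, fire and forget: a lost datagram is just a lost frame.
            sock.sendto(data, peer)
```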

One rejoinder may be that contemporary network transport quality renders (1) and (3) moot, but this assumption fails when we work in parts of the world that are farther from tightly managed network infrastructure.

And (2) is vital.