“Live Coding extends(Vision Factory)” Workshop

7 January 2009 at 11:56 am (3D, First ones, Hypermedia, Live Coding, openframeworks, Processing.org, School of Art)

2008 ended in style: a full-week workshop around Julien V3GA’s Vision Factory API.

LiveCoding is an extension of the Vision Factory framework (used by Julien V3GA in his studio for professional work and, on a personal level, for VJing) and exists in two different “beta” versions:
● On a single computer;
● On up to five computers (up to four running Processing clients, plus one extra machine where the API actually runs).

In order to use Vision Factory, you edit a JavaScript file containing your code. Whenever you save the file, SpiderMonkey re-interprets the script and sends a “bug report” to Vision Factory. If no bugs were found, the new script is validated and displayed; otherwise the previous script keeps running and Vision Factory waits for a new, bug-free version of the JavaScript file.
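
In rough terms (this is not Vision Factory’s actual code, and every name in it is hypothetical), the cycle amounts to something like:

    // Hypothetical gist of the reload cycle, not Vision Factory's code.
    var activeScript = null;

    function onScriptFileSaved(source) {
        try {
            // SpiderMonkey compiles the new version first...
            var candidate = new Function(source);
            // ...and only a cleanly compiled script replaces the old one.
            activeScript = candidate;
        } catch (err) {
            // The "bug report": the previous script keeps running untouched.
            // (print is the SpiderMonkey shell's output function.)
            print("script rejected: " + err.message);
        }
    }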

We use pre-programmed Vision Factory functions as the base structure of the JavaScript files. The most basic ones are equivalent to Processing’s or openFrameworks’ “void setup()”, “void update()” and “void draw()” functions. init(), the equivalent of setup(), runs only once, but the programmer can re-trigger it at any moment. Both update() and render(), the latter being the equivalent of draw(), loop every frame and work as in openFrameworks. Here you can find some new features recently added by Julien V3GA.
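
Only those three function names come from the workshop; a minimal script skeleton around them (with placeholder bodies of my own) would look like this:

    // Minimal Vision Factory script skeleton; the bodies are placeholders.
    var angle = 0;

    function init() {
        // Equivalent of setup(): runs once on load, but can be
        // re-triggered by the programmer at any moment.
        angle = 0;
    }

    function update() {
        // Runs every frame before render(), as in openFrameworks.
        angle += 0.01;
    }

    function render() {
        // Equivalent of draw(): the layer's drawing calls go here.
    }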


Video from Julien V3GA

Another characteristic of Vision Factory is that it is built around a layer concept, and it has built-in OSC protocol management: it is simple to access a layer’s properties and change them through OSC messages. In the network version, each computer is assigned its own layer (its JavaScript drives that specific layer) and, as in Photoshop, the higher layers mask the lower ones. Each computer runs a Processing client that sends its script whenever it is saved; Vision Factory receives the scripts and treats them exactly as in the non-network version. Finally, we can control Vision Factory’s final render: either splitting the screen to give each JavaScript its own viewport, or compositing the different layers into one single screen, superimposing the output of the different scripts.
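
To make the OSC side concrete, here is a sketch of how a single-float message, say a layer-opacity change, is laid out on the wire according to the OSC 1.0 spec. The address /layer/2/opacity is my invention (the post doesn’t document Vision Factory’s actual address scheme), and the UDP transport is left out:

    // Pad a string with NULs to a 4-byte boundary, as OSC requires.
    function oscPad(str) {
        var out = str + "\0";
        while (out.length % 4 !== 0) out += "\0";
        return out;
    }

    // Build a one-float OSC message: padded address, padded type tags
    // (",f"), then the argument as a big-endian IEEE-754 float32.
    function oscFloatMessage(address, value) {
        var view = new DataView(new ArrayBuffer(4));
        view.setFloat32(0, value, false);
        var arg = "";
        for (var i = 0; i < 4; i++) {
            arg += String.fromCharCode(view.getUint8(i));
        }
        return oscPad(address) + oscPad(",f") + arg;
    }

    // Hypothetical message: fade layer 2 to half opacity.
    var packet = oscFloatMessage("/layer/2/opacity", 0.5);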

Vision Factory is currently not released.

LiveCoding

Julien V3GA also showed us an iPhone/iPod touch application called Mrmr that he uses for his VJing work. The application lets you configure an interface and send its data over Open Sound Control. The device’s multitouch screen is well suited to controlling sound and visualization parameters (like a MIDI controller would), with the added advantage that it works wirelessly.
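
As an idea of how such messages could drive a script’s parameters (both the fader addresses and the handler hook below are my own invention, not Mrmr’s or Vision Factory’s documented API):

    // Invented handler: route incoming fader values (0..1) to parameters.
    var params = { opacity: 1.0, speed: 0.25 };

    function onOscMessage(address, value) {
        if (address === "/mrmr/fader/0") params.opacity = value;
        if (address === "/mrmr/fader/1") params.speed = value;
    }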

Considering all this, I am more than willing to start a project to implement Open Sound Control on a BlackBerry mobile phone (if anyone has heard of an open-source project in this direction, please leave me a comment presenting it). The advantage of a BlackBerry over other phones is its full (and comfortable) QWERTY/AZERTY keyboard, which lets you type fast enough for live coding. The first stage is to implement OSC, then to build a programming interface (the built-in notepad?), and finally to add the command features (connect to the server, reload the “setup()” function of the script, disconnect, open a script, etc.).
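
The OSC encoding sketched above extends naturally to that command channel via string arguments (type tag “,s”); the /livecoding/… addresses are invented, and on the phone itself this would of course be written in Java rather than JavaScript:

    // Pad a string with NULs to a 4-byte boundary, as OSC requires.
    function oscPad(str) {
        var out = str + "\0";
        while (out.length % 4 !== 0) out += "\0";
        return out;
    }

    // One-string OSC message (type tag ",s") for commands.
    function oscStringMessage(address, text) {
        return oscPad(address) + oscPad(",s") + oscPad(text);
    }

    // Invented command addresses; nothing here is a real protocol yet.
    var connectCmd = oscStringMessage("/livecoding/connect", "server");
    var reloadCmd  = oscStringMessage("/livecoding/reload", "init");
    var openCmd    = oscStringMessage("/livecoding/open", "myscript.js");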

Well, I believe that I have some work to do now…
