Realtime visuals – open up the cage with Syphon
I had a hard time finding a headline for this post. The thing is that something is happening in the VJ/visuals world that is getting me very excited. That thing is a new technology on the Mac called Syphon. At this point I think only a small circle of programmers and techies are aware of Syphon as it hasn’t yet been released. I think it’s going to have a tremendous impact on the way realtime visual apps (think VDMX, Modul8, Resolume) and visual programming environments are going to work in the future. Let me try to explain.
Update: As of today, November 3rd, there is a public beta available for download, with support for Quartz, FreeFrame, Unity 3D and Jitter.
Today, each visual or VJ app is like a closed cage. You create your graphics, which are then sent to your monitor, your video projector or wherever. The output of the app is locked. It cannot be altered after it leaves the app.
Imagine you had two apps, each producing a live image. Up until now there has been no good way to combine the outputs of those two apps into one image in real time. Say you created a background in Resolume and a nice particle effect in Modul8. You would have to save one of them as a movie file and then import it into the other app. Or you could use a screen grabber to capture the image from Resolume and feed it into Modul8 in real time, but that eats up a lot of your computer's resources and tends to slow down the other apps if you are not careful.
So in reality, most people working with real time visuals just get used to preproducing clips and importing them into their favourite VJ app. And in general you have just one visual app open when you perform live. But why should it be like this? Why can't you, for example, have one specialized app producing a nice 3D background and bring that into your favourite VJ app to play with? Or why can't you bring the output from your VJ app into another specialized app that puts some final touches on the image before sending it to the screen?
Syphon seems to be the answer to this. It is a sort of bridging technology – Mac only – that allows applications to ‘talk to each other’ and exchange graphics in real time, without any delay or loss of image quality. It only works between apps on a single machine at this point (I think – Syphon people, correct me if I’m wrong), but it still brings a lot of potential for those who perform with visuals live.
One example of this is the MadMapper software, which is also currently in development and has not been released yet. It lets you use the output of any Syphon-capable app as source material for a mapping installation: MadMapper splits the image into areas that can be projected onto an object. In this way you can produce your images live in another app of your choice and then let MadMapper handle the final alignment of the image onto the object.
I hope you can share my excitement here. This is just one example of why we should ‘open up the cage’ and set the visuals free. If Syphon catches on, I think everybody is going to benefit from this ability to exchange visuals between applications. My bet is that we will see more specialized applications for live visuals instead of giant VJ apps that try to include every possible feature. This is a good thing in my opinion, as I think both the overall quality and the number of options available will improve as a result.