October 2nd, 2009

I’m feeling productive tonight but don’t quite want to take on any coding projects, so I thought I would write up some thoughts that have been running through my head on performance, VJing, and user interface.

I had a short-notice gig the other night and was planning on using VDMX, since it's my usual go-to tool. I'm a heavy user of its Quartz Composer integration (who would have guessed) and had noticed there seems to be a weird issue where color fidelity/accuracy is lossy when using a bunch of QC effects in a row. Something happens internally based on how things are rendered, and the net result is that the color shifts and lifts. This is usually not an issue for me at all, but it made some of the things I was trying to do for this performance difficult (or, rather, very ugly and artifacty), and I realized that Quartz Composer natively does not suffer from this. So I ended up using a custom version of an application I helped co-develop for my lady friend Outpt, which she named Orbitr. The point is that Orbitr and VDMX pretty much sum up two diametrically opposed approaches to VJing and performance.

Let me show you two images. The top is the VDMX setup I had planned to use. The bottom is my custom build of Orbitr, which I used instead.

I think the difference will be self-evident, at least philosophically. Some of the technical differences:

  • VDMX can handle as many layers/channels as your system can handle. Orbitr has 2 channels. Hard coded.
  • VDMX has multiple, programmable and customizable blend modes for mixing. Orbitr has 4 blend modes. Hard coded.
  • VDMX can handle as many effects per channel as your system can handle. Orbitr has 8 non-rearrangeable, serialized effects. Hard coded.
  • VDMX can listen to audio, OSC, MIDI, HID, and keyboard input. Orbitr listens to a single Xbox controller. Controls are hard coded.
  • VDMX exposes multiple controls per effect. Orbitr pretty much only lets you toggle effects on or off. (Seriously.) In my custom build I changed it so there were two global effects parameters (one per channel) that adjusted effects. One. Per channel. For all effects. Simultaneously. Hard coded.
  • VDMX lets you load banks of, trigger, and preview arbitrary numbers of QuickTime movies and other types of sources. Orbitr lets you load a folder per channel and go to the next/previous movie. That's it. No clip preview or triggering grid.
  • VDMX is so flexible it allows for pretty much any method of controlling, sequencing, and playing the app. Orbitr allows only one: a hand-tuned control scheme using the Xbox controller that you can memorize in minutes, so you never have to look at the screen, or even think twice about it. No LFOs to configure, no tap tempo, no layer controls, almost no nothing.

I don't quite recall the exact genesis of Orbitr, but somehow I convinced Outpt that learning Quartz Composer would be beneficial for her and her work. She started out performing with custom software written in Processing, and moved on to 8-bit-scene-friendly devices like the Game Boy Advance and GP2X – handheld gaming platforms hacked for performing visuals.

Orbitr comes from that background and is designed to work similarly. It is basically a front end for a single QC composition Outpt created. It is built entirely from stock QC objects and really only uses Cocoa as a front end and rendering driver for full screen and previews. Everything internal is a QC patch: the QC HID patch listens to the Xbox controller, QC's Movie Importer (yeah, not even mine!) handles playback, and so on. It does not even use a Display Link to drive the rendering. In short, it's pretty trivial, coding-wise.
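As an aside, the spirit of that fixed control scheme is easy to sketch in ordinary code. This is a purely hypothetical Python illustration of a hard-coded button-to-effect table – the button names and slot assignments here are made up, not Orbitr's actual bindings:

```python
# Hypothetical sketch of a hard-coded controller mapping, in the spirit of
# Orbitr's fixed Xbox control scheme. Nothing here is user-configurable.

NUM_CHANNELS = 2   # Orbitr has exactly two channels. Hard coded.
NUM_EFFECTS = 8    # Eight serialized effect slots per channel. Hard coded.

# On/off state per (channel, effect slot).
state = [[False] * NUM_EFFECTS for _ in range(NUM_CHANNELS)]

# Fixed button -> (channel, effect slot) table. These names and
# assignments are illustrative only.
BUTTON_MAP = {
    "A": (0, 0), "B": (0, 1), "X": (0, 2), "Y": (0, 3),
    "LB": (1, 0), "RB": (1, 1), "BACK": (1, 2), "START": (1, 3),
}

def on_button_press(button):
    """Toggle the effect wired to this button; unknown buttons are ignored."""
    if button in BUTTON_MAP:
        ch, fx = BUTTON_MAP[button]
        state[ch][fx] = not state[ch][fx]
    return state
```

The whole point is that the table never changes: muscle memory does the rest, and there is nothing to configure at the gig.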

In about 4 hours of work I usurped the Orbitr composition and code and replaced many of the hard-coded effects with effects based on my own plugins, blurs, and specific recipes. I tweaked the UI and added that one single control parameter per channel. That was pretty much it. The net result is the most limiting (in many regards) performance environment I have used since my first few Max/MSP patches, 8 years or so ago.

It also was quite liberating, eye opening, and somewhat maddening.

So what's the point? The point is that when Outpt gets a gig, she concentrates on generating new content, not on setting up her app (there are pretty much infinite combinations in VDMX I have to contend with). She does not worry about setting up effects (or really even choosing them). She concentrates on creating content and mastering the application. When she is at the gig, she can concentrate on the performance interface and actually playing it, not setting, resetting, reconfiguring, tweaking, and fiddling with a complicated UI. She can watch the crowd and the musicians, and never the screen.

I realized that I may be getting too effects-oriented, using similar, if not the same, footage. I am hooked on realtime effects, where I bend frames, mask them, shred them, and manipulate them, all right now. Many of Outpt's clips are offline renders made in Processing, Photoshop, or whatnot, played back as rendered movies. Could you do it in realtime using QC? Yes. Why? Well… there might be valid reasons, but that's just another thing to think about while you should be playing. In other words, all those realtime effects mean more controls, which means you have to think about them, rein them in, and make sure everything looks good and fits compositionally and color-wise, as well as matching the mood and working artistically. All while the music and mood around you change. Your reaction time suffers when you juggle more, and that's bad. It also means less chance of improvisational playing, and a tendency toward presets that you know will work, making your performance less in the moment than it might otherwise be.

A lot of this is not just about simplification, or even limitation, but about getting back to the point of actually performing the visuals. It's about elegant approaches that make you work smarter and get better results. I'm convinced Orbitr does a lot of this really well, and is an approach really worth thinking hard about, especially when designing new workflows, interfaces, and applications like I have been trying to do with v002.app. It makes me at least pause and consider whether VDMX, in all of its flexibility, is ironically very limiting. Now, don't get me wrong: Orbitr is limiting on purpose, and could definitely use some additional features, but at what point do you end up making VDMX all over again, stressing about the trees when you should be worried about the forest?

Anyway, this was just a ramble while things are fresh in my head. For many artists, musicians, and VJs this has been obvious for a while, but it is an approach that is rather new to me first hand, and being so limiting and different, it struck a chord. I really don't know what I'll end up designing or using, but it's certainly interesting, challenging, and somewhat scary to use an app like that. Thanks Outpt <3!

On a side note, I am now pretty convinced that game controllers are the best thing since sliced bread for VJing. You don't have to put them down, and they are easy to master and work with muscle memory, so you can stand up tall and pay attention to the environment, looking around you and reacting rapidly. The plethora of controls and combinations offers a lot of flexibility in setting up a sensible control scheme for how you want to work. You never have to hunt for a key, MIDI knob, or Wacom pen/pad (or trackpad/cursor).

Electric Zoo

September 6th, 2009

Joshua Goldberg got me hooked on VJing when he showed me his app Dervish and gave me the source code to its Max/MSP/Jitter patch back in 2001 or so – before I knew anything deep about Jitter at all. Dervish was a real eye opener for many reasons, but most importantly, if it were not for him, I would not be doing anything remotely like what I am doing today with my art & programming. So it was a great compliment that he asked me to play alongside him at Electric Zoo in NYC this Labor Day.

Here are some screenshots of my output during our roughly 4-hour set, with Josh and I playing along with Robbie Rivera, Roger Sanchez, and Benny Benassi. It was an extraordinary amount of fun. Like 36,000-people fun. Video and photos coming soon.


August 30th, 2009

While fixing bugs in the v002 plugins and experimenting with Quartz Composer 4.0, I stumbled onto this. I think it's worth exploring some new aesthetic directions.

CRT emulation

July 18th, 2009

These images are not taken from a photo of a CRT screen; they are synthesized entirely within Quartz Composer. I've been working on a CRT emulator for Open Emu to help make playing older console games truer to the original experience. The plugin emulates proper scan lines (not just a mask drawn over everything, but proper illumination: scan lines are less visible over bright pixels, since the light bleeds over the scan line) and 'sub-pixel' phosphor patterns on the CRT tube. Since Open Emu uses Quartz Composer, we can feed anything to the plugins. Experimenting with the v002 glitch plugins yielded the results below:
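The brightness-dependent scan-line idea can be sketched in plain code. This is a rough Python/NumPy approximation of the principle only – the actual plugin is QC/GLSL, and the blend formula and strength constant here are my own illustrative choices:

```python
import numpy as np

def apply_scanlines(frame, strength=0.5):
    """Darken alternate rows, but less so over bright pixels, roughly
    approximating light bleeding across the scan line.

    frame: float array in [0, 1], shape (H, W) or (H, W, 3).
    strength: how dark a scan line is over a fully black pixel.
    """
    frame = np.asarray(frame, dtype=float)
    h = frame.shape[0]
    # Mask is 1.0 on lit rows, (1 - strength) on the gaps between them.
    mask = np.where(np.arange(h) % 2 == 0, 1.0, 1.0 - strength)
    mask = mask.reshape(-1, *([1] * (frame.ndim - 1)))
    # Brightness reduces the effective darkening: a fully bright pixel
    # "bleeds" over the scan line and shows almost no gap at all.
    bleed = frame if frame.ndim == 2 else frame.mean(axis=-1, keepdims=True)
    effective = mask + (1.0 - mask) * bleed
    return frame * effective
```

Over a fully bright frame the mask vanishes entirely, while dark regions show the full scan-line gap – which is the visible difference between this approach and simply compositing a static line mask over everything.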

Works in progress…

July 2nd, 2009

I've been busy lately with many projects. The v002 application, which has been hinted at, is coming along very nicely thanks to help from 'bangnoise', aka Tom Butterworth. Tom wrote the lovely 'real' datamosh plugin for Quartz Composer, and has been amazingly helpful in many regards. I had the pleasure of meeting up with Tom in London. Lovely guy, and I am really happy to have his help.

I also have some newer plugins for QC, and have been doing some graphics pipeline improvements to Open Emu. Along the way I've been posting small updates via Twitter; if you care, you can find me @_vade on Twitter, or email me at vade [ at ] vade [ dot ] info if you are interested in beta testing things.

Here is a taste of an upcoming Open Emu feature, and stay tuned for some updates to the v002 Movie Player, which has some nice improvements and bug fixes thanks to Tom as well. Here's some silly fun with Open Emu's new Quartz Composer filtering pipeline…

Open Emu – New Filters/Scaler system from vade on Vimeo.

Ongoing OpenEmu development has seen a lot of changes recently. One of the exciting new features is leveraging Quartz Composer for post-processing frames from the various console emulators using OpenGL. These filters allow high-quality scaling using GLSL and whatnot. But a lot more fun can be had with the Quartz Composer backend…

This filter aims to emulate what gaming is like on an older CRT monitor… literally. We borrow from Stella's new CRT emulation code and build on it by adding iSight-powered reflections in the CRT and distortion to match the curvature of the tube.
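For the curious, tube-curvature distortion is typically done with a simple radial coordinate remap. This Python sketch shows the general idea only – the constant and formula are illustrative, not what the actual filter uses:

```python
def barrel_distort(u, v, k=0.1):
    """Push normalized coordinates (u, v) in [-1, 1] outward by a simple
    radial barrel distortion, r' = r * (1 + k * r^2), approximating the
    curvature of a CRT tube.

    k is an illustrative constant, not the filter's actual value. In a
    shader this same remap would be applied per-fragment to the texture
    lookup coordinates.
    """
    r2 = u * u + v * v          # squared distance from screen center
    f = 1.0 + k * r2            # grows toward the edges, 1.0 at center
    return u * f, v * f
```

The center of the screen is untouched while the corners are pushed outward the most, which is exactly the bulge you see on a real tube.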

This is just a relatively simple example of what can be done using Quartz Composer as a filter system; there are a lot more opportunities to have fun and adjust the output image to suit hard-core tweakers and experimenters :)


Just a sneak peek at a currently unoptimized filter/feature.


June 3rd, 2009

Some images and video from LPM 2009. In short, awesome.

LPM 2009 VJ Jam Excerpt 1 of 3 – View in HD

LPM 2009 VJ Jam Excerpt 2 of 3

LPM 2009 VJ Jam Excerpt 3 of 3

Live Performers Meeting

May 24th, 2009

Outpt and I will be performing at Live Performers Meeting 2009, and I will be dropping in on the Quartz Composer workshop to talk about some advanced topics (programming plugins, integrating Quartz Composer into a custom application), and potentially with some surprises.

If you will be at LPM, let me know.

Live Performers Meeting 2009

Orange 2.0

May 1st, 2009

More Urban Imagery:

Experiment with QC Typography

April 27th, 2009

QC Typography from vade on Vimeo.

v002 Movie Player FFT analysis

April 11th, 2009

v002 Movie Player with FFT Audio Analysis Demo from vade on Vimeo.

A user suggested I add audio analysis to the movie player plugin. It turns out Apple has some pretty easy APIs for adding FFT analysis to your QuickTime movie playback code, so I implemented it tonight. It's a bit rough, but it is nice because it grabs audio only from the currently playing movie (meaning you could, in theory, have two of these going and get different FFTs out), and it runs based on the volume in the patch, so if you fade out your audio, the FFT respects that. Kind of neat!
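Conceptually, the volume-respecting analysis looks something like this. A minimal Python/NumPy sketch of the idea only – the plugin itself uses Apple's QuickTime audio APIs, and the band count and averaging scheme here are my own assumptions:

```python
import numpy as np

def fft_bands(samples, volume=1.0, num_bands=16):
    """Return coarse FFT magnitude bands for one buffer of mono audio.

    The buffer is scaled by the current playback volume first, so a
    faded-out movie yields a faded-out analysis (mirroring the plugin's
    behavior of respecting the patch's volume).

    samples: 1-D float array of audio samples.
    volume:  playback volume, 0.0 to 1.0.
    """
    spectrum = np.abs(np.fft.rfft(samples * volume))
    # Group the FFT bins into num_bands coarse bands by averaging,
    # which is the kind of low-resolution output a VJ patch wants.
    bins = np.array_split(spectrum, num_bands)
    return np.array([b.mean() for b in bins])
```

Because the FFT is linear, halving the volume simply halves every band – which is why fading the movie's audio out fades the analysis out with it.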
