I’m feeling productive tonight but don’t quite want to take on any coding projects, so I thought I would write up some thoughts that have been running through my head on performance, VJing, and user interface.
I had a short-notice gig the other night and was planning on using VDMX, since it's my usual go-to tool. I'm a heavy user of its Quartz Composer integration (who would have guessed), and had noted there seems to be a weird issue where color fidelity/accuracy degrades when chaining a bunch of QC effects in a row. Something happens internally in how things are rendered, and the net result is that colors shift and lift. This usually is not an issue for me at all, but it made some of the things I was trying to do for this performance difficult (or, rather, very ugly and artifacty), and I eventually realized that Quartz Composer natively does not suffer from this. So I ended up using a custom version of an application I helped co-develop for my lady friend Outpt, which she named Orbitr. The point is, Orbitr and VDMX pretty much sum up two diametrically opposed approaches to VJing and performance.
Let me show you two images. The top is the VDMX setup I had planned to use. The bottom is my custom build of Orbitr that I used.
I think the difference will be self-evident, at least philosophically. Some technical differences:
- VDMX can handle as many layers/channels as your system can handle. Orbitr has 2 channels. Hard coded.
- VDMX has multiple programmable, customizable blend modes for mixing. Orbitr has 4 blend modes. Hard coded.
- VDMX can handle as many effects per channel as your system can handle. Orbitr has 8 non-rearrangeable, serialized effects. Hard coded.
- VDMX can listen to audio, OSC, MIDI, HID, and keyboard input. Orbitr listens to a single Xbox controller. Controls are hard coded.
- VDMX exposes multiple controls per effect. Orbitr pretty much only lets you toggle effects on or off. (Seriously.) In my custom build I changed it so there were two global effects parameters (one per channel) that adjusted the effects. One. Per channel. For all effects. Simultaneously. Hard coded.
- VDMX lets you load, preview, and trigger banks of arbitrary numbers of QuickTime movies and other types of sources. Orbitr lets you load a folder per channel and go to the next/previous movie. That's it. No clip preview or triggering grid.
- VDMX is so flexible it allows for pretty much any method of controlling, sequencing, and playing the app. Orbitr allows only one: a hand-tuned control scheme using the Xbox controller that you can memorize in minutes and never have to look at the screen, or even think twice about. No LFOs to configure, no tap tempo, no layer controls, almost nothing at all.
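To make that last contrast concrete, here is a toy sketch in Python of what a fully hard-coded control scheme amounts to. This is purely illustrative with hypothetical names; the real Orbitr is a Quartz Composer composition behind a Cocoa shell, not Python. The idea is simply that the entire scheme is one fixed table, so there is nothing to configure or hunt for at a gig:

```python
# Hypothetical hard-coded mapping: Xbox controller buttons -> VJ actions.
# One fixed table for the whole app; nothing is user-configurable.
BUTTON_MAP = {
    "A":     ("channel_1", "next_clip"),
    "B":     ("channel_1", "prev_clip"),
    "X":     ("channel_2", "next_clip"),
    "Y":     ("channel_2", "prev_clip"),
    "LB":    ("channel_1", "toggle_effect"),
    "RB":    ("channel_2", "toggle_effect"),
    "START": ("mixer",     "cycle_blend_mode"),
}

def handle_button(state: dict, button: str) -> dict:
    """Apply the fixed mapping; unknown buttons are ignored on purpose."""
    target, action = BUTTON_MAP.get(button, (None, None))
    if action == "next_clip":
        state[target] += 1
    elif action == "prev_clip":
        state[target] -= 1
    elif action == "toggle_effect":
        key = target + "_fx_on"
        state[key] = not state.get(key, False)
    elif action == "cycle_blend_mode":
        state["blend"] = (state["blend"] + 1) % 4  # 4 blend modes, hard coded
    return state

state = {"channel_1": 0, "channel_2": 0, "blend": 0}
handle_button(state, "A")      # channel_1 advances to the next clip
handle_button(state, "START")  # blend mode cycles 0 -> 1
```

The design point is that the table is the whole interface: memorize seven buttons once and your hands know the instrument, which is exactly the opposite of an app where any control can be routed anywhere.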
I don't quite recall the exact genesis of Orbitr, but somehow I convinced Outpt that learning Quartz Composer would be beneficial for her and her work. She started out performing with custom software written in Processing, and moved on to 8-bit-scene-friendly devices like the Game Boy Advance and GP2X – handheld gaming platforms hacked for performing visuals.
Orbitr comes from that background and is designed to work similarly. It is basically a front end for a single QC composition Outpt created. It is built entirely from QC objects and really only uses Cocoa as a front end and rendering driver for full screen output and previews. Everything internal is a QC patch: the QC HID patch listens to the Xbox controller, QC's Movie Importer (yeah, not even mine!) handles playback, and so on. It does not even use a Display Link to drive the rendering. In short, it's pretty trivial coding-wise.
In about 4 hours of work I usurped the Orbitr composition and code and replaced many of the hard-coded effects with my own, based on my plugins, blurs, and specific recipes. I tweaked the UI and added that one single control param per channel. That was pretty much it. The net result is the most limiting (in many regards) performance environment I have used since my first few Max/MSP patches 8 or so years ago.
It was also quite liberating, eye-opening, and somewhat maddening.
So what's the point? The point is that when Outpt gets a gig, she concentrates on generating new content, not on worrying too much about setting up her app (there are pretty much infinite combinations in VDMX for me to contend with). She does not worry about setting up effects (or really even choosing them). She concentrates on creating content and mastering the application. When she is at the gig, she can concentrate on the performance interface and actually playing it, not setting, resetting, reconfiguring, tweaking, and fiddling with a complicated UI. She can watch the crowd and the musicians, and never the screen.
I realized that I may get too effects-oriented, using similar, if not the same, footage. I am hooked on realtime effects, where I bend frames, mask them, shred them, and manipulate them all right now. Many of Outpt's clips are offline renders made with Processing, Photoshop, or whatnot, played back as rendered movies. Could you do it in realtime using QC? Yes. Why? Well… there might be valid reasons, but that's just another thing to think about while you should be playing. In other words, all those realtime effects mean more controls, which means you have to think about them, rein them in, and make sure everything looks good and fits compositionally and color-wise, as well as matching the mood and working artistically. All while the music and mood around you change. Your reaction time suffers when you juggle more, and that's bad. It also means less chance of improvisational playing, and a tendency toward presets you know will work, making your performance less in the moment than it might otherwise be.
A lot of this is not just about simplification, or even limitation, but about getting back to the point of actually performing the visuals. It's about elegant approaches that make you work smarter and get better results. I'm convinced Orbitr does a lot of this really well, and it is an approach really worth thinking hard about, especially in designing new workflows, interfaces, and applications like I have been trying to do with v002.app. It makes me at least pause and consider whether VDMX, in all of its flexibility, is ironically very limiting. Now, don't get me wrong, Orbitr is limiting on purpose, and could definitely use some additional features, but at what point do you end up making VDMX all over again, stressing about the trees when you should be worried about the forest?
Anyway, this was just a ramble while things are fresh in my head. For many artists, musicians, and VJs this has been obvious for a while, but this approach is rather new to me first hand, and being so limiting and different, it struck a chord. I really don't know what I'll end up designing or using, but it's certainly interesting, challenging, and somewhat scary to use an app like that. Thanks Outpt <3!
On a side note, I am pretty convinced now that game controllers are the best thing since sliced bread for VJing. You don't have to put them down; they are easy to master and work with muscle memory, so you can stand up tall and pay attention to the environment, looking around you and reacting rapidly. The plethora of controls and combinations offers a lot of flexibility in setting up a sensible control scheme for how you want to work. You never have to hunt for a key, MIDI knob, or Wacom pen/pad (or trackpad/cursor).