I’m feeling productive tonight but don’t quite want to take on any coding projects, so I thought I would write up some thoughts that have been running through my head on performance, VJing, and user interface.

I had a short-notice gig the other night and was planning on using VDMX, since it’s my usual go-to tool. I’m a heavy user of its Quartz Composer integration (who would have guessed), and had noticed a weird issue where color fidelity degrades when chaining a bunch of QC effects in a row: something about how the stages are rendered internally causes the colors to shift and lift. This usually is not an issue for me at all, but it made some of the things I was trying to do for this performance difficult (or, rather, very ugly and artifacty), and I realized Quartz Composer on its own does not suffer from this. So I ended up using a custom version of an application I helped co-develop for my lady friend Outpt, which she named Orbitr. The point is that Orbitr and VDMX pretty much sum up two diametrically opposed approaches to VJing and performance.

Let me show you two images. The top is the VDMX setup I had planned to use. The bottom is my custom build of Orbitr that I used.

I think the difference will be self-evident, at least philosophically. Some technical differences:

  • VDMX can handle as many layers/channels as your system can handle. Orbitr has 2 channels. Hard coded.
  • VDMX has multiple, programmable, customizable blend modes for mixing. Orbitr has 4 blend modes. Hard coded.
  • VDMX can handle as many effects per channel as your system can handle. Orbitr has 8 non-rearrangeable, serialized effects. Hard coded.
  • VDMX can listen to audio, OSC, MIDI, HID, and keyboard input. Orbitr listens to a single Xbox controller. Controls are hard coded.
  • VDMX exposes multiple controls per effect. Orbitr pretty much only lets you toggle effects on or off. (Seriously.) In my custom build I changed it so there were two global effect parameters (one per channel) that adjusted the effects. One. Per channel. For all effects. Simultaneously. Hard coded.
  • VDMX lets you load banks of, trigger, and preview arbitrary numbers of QuickTime movies and other types of sources. Orbitr lets you load a folder per channel and go to the next/previous movie. That’s it. No clip preview or triggering grid.
  • VDMX is so flexible it allows for pretty much any method of controlling, sequencing, and playing the app. Orbitr allows only one: a hand-tuned control scheme using the Xbox controller that you can memorize in minutes and never have to look at the screen, or even think twice about. No LFOs to configure, no tap tempo, no layer controls, almost no nothing.
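To make the contrast concrete, the entire control surface of an app like that can live in one fixed table. Here is a minimal Python sketch of the idea — a hard-coded button-to-action mapping plus a single global effect parameter per channel. The button names and actions are my own illustration, not Orbitr’s actual bindings:

```python
# Sketch of an Orbitr-style hard-coded control scheme: a fixed
# button-to-action table compiled into the app, with no mapping UI.
# Button names and actions are illustrative assumptions, not
# Orbitr's real bindings.

BINDINGS = {
    "dpad_left":     ("a", "prev_clip"),
    "dpad_right":    ("a", "next_clip"),
    "button_x":      ("a", "toggle_effect"),
    "button_b":      ("b", "toggle_effect"),
    "left_trigger":  ("a", "global_param"),  # one analog param per channel
    "right_trigger": ("b", "global_param"),
}

class Channel:
    """One of the two hard-coded channels."""
    def __init__(self):
        self.clip = 0
        self.effect_on = False
        self.global_param = 0.0

    def handle(self, action, value=None):
        if action == "next_clip":
            self.clip += 1
        elif action == "prev_clip":
            self.clip = max(0, self.clip - 1)
        elif action == "toggle_effect":
            self.effect_on = not self.effect_on
        elif action == "global_param":
            # a single value drives every effect on this channel at once
            self.global_param = value

channels = {"a": Channel(), "b": Channel()}

def on_input(button, value=None):
    """Dispatch a controller event through the fixed binding table."""
    if button in BINDINGS:
        chan, action = BINDINGS[button]
        channels[chan].handle(action, value)
```

The design choice is the whole point: remapping anything means editing the table and rebuilding, so there is nothing to set up or fiddle with at a gig.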

I don’t quite recall the exact genesis of Orbitr, but somehow I convinced Outpt that learning Quartz Composer would be beneficial for her and her work. She started out performing with custom software written in Processing, and moved on to 8-bit-scene-friendly devices like the Game Boy Advance and GP2X – handheld gaming platforms hacked for performing visuals.

Orbitr comes from that background and is designed to work similarly. It is basically a front end for a single QC composition Outpt created. It is built entirely from stock QC objects and really only uses Cocoa as a front end and rendering driver for full screen and previews. Everything internal is a QC patch: the QC HID patch listens to the Xbox controller, QC’s Movie Importer (yea, not even mine!) handles playback, and so on. It does not even use a display link to drive the rendering. In short, it’s pretty trivial, coding-wise.

In about 4 hours of work I usurped the Orbitr composition and code and replaced many of the hard-coded effects with effects based on my own plugins, blurs, and specific recipes. I tweaked the UI and added that one single control parameter per channel. That was pretty much it. The net result is the most limiting (in many regards) performance environment I have used since my first few Max/MSP patches 8 or so years ago.

It also was quite liberating, eye opening, and somewhat maddening.

So what’s the point? The point is that when Outpt gets a gig she concentrates on generating new content, not on setting up her app (there are pretty much infinite combinations in VDMX I have to contend with). She does not worry about configuring effects (or really even choosing them). She concentrates on creating content and mastering the application. When she is at the gig, she can concentrate on the performance interface and actually playing it, not setting, resetting, reconfiguring, tweaking, and fiddling with a complicated UI. She can watch the crowd, the musicians, and never the screen.

I realized that I may be getting too effects-oriented, reusing similar, if not the same, footage. I am hooked on realtime effects, where I bend frames, mask them, shred them, and manipulate them all live. Many of Outpt’s clips are offline renders made with Processing, Photoshop, or whatnot, played back as rendered movies. Could you do it in realtime with QC? Yes. Why would you? Well… there might be valid reasons, but that’s just another thing to think about while you should be playing. In other words, all those realtime effects mean more controls, which means you have to think about them, rein them in, and make sure everything looks good and fits compositionally and color-wise, as well as matching the mood and working artistically. All while the music and mood around you change. Your reaction time suffers when you juggle more, and that’s bad. It also means fewer chances for improvisational playing, and a tendency toward presets that you know will work, making your performance less in the moment than it might otherwise be.

A lot of this is not just about simplification, or even limitation, but about getting back to the point of actually performing the visuals. It’s about elegant approaches that make you work smarter and get better results. I’m convinced Orbitr does a lot of this really well, and it is an approach really worth thinking hard about, especially when designing new workflows, interfaces, and applications, as I have been trying to do with v002.app. It makes me at least pause and consider whether VDMX, in all of its flexibility, is ironically very limiting. Now, don’t get me wrong: Orbitr is limiting on purpose, and could definitely use some additional features, but at what point do you end up making VDMX all over again and start stressing about the trees when you should be worried about the forest?

Anyway, this was just a ramble while things are fresh in my head. For many artists, musicians, and VJs this has been obvious for a while, but it is an approach that is rather new to me first hand, and being so limiting and different, it struck a chord. I really don’t know what I’ll end up designing or using, but it’s certainly interesting, challenging, and somewhat scary to use an app like that. Thanks Outpt <3!

On a side note, I am pretty convinced now that game controllers are the best thing since sliced bread for VJing. You don’t have to put them down; they are easy to master and work with muscle memory, so you can stand up tall and pay attention to the environment, looking around you and reacting rapidly. The plethora of controls and combinations offers a lot of flexibility in setting up a sensible control scheme for how you want to work. You never have to hunt for a key, MIDI knob, or Wacom pen/pad (or trackpad/cursor).

10 Responses to “Ramblings”

  1. outpt Says:

    I really like the fact that when I first started working on it, I had already played with a few different apps (VDMX, Pikix, SYSTM) and decided what I liked/hated. And by played I mean performed — proper stress-testing, not just saying “oh this looks cool.” I had a really short wish list of features, some of which still aren’t in Orbitr. And all of the additions since the first version have come from “oh shit, I desperately need ___” rather than just adding superfluous stuff. But the controller also prevents us from going into feature-bloat, because there are only so many button combinations possible before it becomes ridiculous. VJ philosophy: One must be able to keep playing for at least 30 seconds using one hand (while holding a drink with the other). Bonus points for mixing a drink with the other hand.

    Minor correction: “Orbitr pretty much only lets you toggle effects on or off.”
    I have *one* whole effect that can be modified (colorize has 8 hue options). Which is an exciting freedom over Pikix’s 3 tints!

    I was thinking today of writing up a page for Orbitr on my site; I will eventually get around to it. But the genesis was basically this.
    I had settled into mostly using Pikix for shows, but I disliked:
    * very limited file size
    * crash-happy app necessary to make proprietary compressed files
    * didn’t like some of the effects
    * couldn’t remap controls, so I was stuck with some that felt unintuitive to me
    * awkward handling of two channels
    * no preview
    All that complaining and that was the one I *liked*! So I decided something new had to be done. I realized, quite reluctantly, that QC should be involved. I think that was a mere two months after being in Rome and not paying attention in the workshop Shakinda and you did, because I *knew* I would never use QC. Hah! And then I broke my arm and spent my time in the waiting room building the effects pipeline. Ta da!

  2. outpt Says:

    More thoughts 🙂

    You actually played just with Orbitr, which is something I wouldn’t have thought to do. I’m used to thinking of it as being one of three inputs going into the mixer, so there’s a bit more to work with than just the app. If all else fails, there’s always rocking out like it’s 1965 with feedback. But it was very nice to see it stand on its own.

  3. Tim Keeling Says:

    hi _vade, i’ve been following your accomplishments with great interest over the last few months. As a new VJ, I have been looking through the VJ network to see what’s happening, who is pushing the medium, and found your website.

    Your thoughts on limitation reminded me of the essay “The Paradox of Choice: Why More Is Less” by Barry Schwartz, where he argues that the consumer is paralysed by choice.

    great article

  4. scott Says:

    a reply is nothing without a cliche, so: less is more (sorry)
    what outpt is doing is reminiscent of a presentation i saw of monolake and his use of Live- having one continuously evolving set, with changes in content. outpts & monolakes use of their tools seems to me to open more possibilities than having a gazillion options/effects/layers, and the reduction of materials and processes has been a tried and true method of artistic practice since ochre and a cave wall- with the technology available it’s just easier to chuck in another effect, route in another source, chuck on another effect while letting the intention from the performer get lost in the gazillion options. but what about the viewer? does the saturation or overuse of the technology muddy the experience? is it more interesting for someone to see an amazing show built out of reduced kit, a few select pieces, or tables of gear with each piece barely used? australian farmers have a saying- bigger the hat, smaller the property hehe.

    the monolake presentation is here: http://createdigitalmusic.com/2008/03/21/abletons-robert-henke-and-why-sometimes-less-bitrate-is-more/

  5. vade Says:

    Hi Tim, Scott.

    Tim, I had read that essay a while back but had forgotten about it. I bet I was subconsciously channelling it. Thanks 🙂

    Scott, I tend to agree in general. I think I was perhaps over-stating my general methodology to make a point, but I think a lot of tools can make performers spend more time twiddling and expending creative energy solving technical problems than creative ones. I know that as a programmer and a performer I tend to attack problems technically, and have gotten somewhat frustrated in that I feel like my work has not ‘leveled up’ creatively. I think Orbitr was a realization on my part that I was fighting battles and not the war, so to speak. Content, content, content gets you there, not just fancy tools.

    Ill check out the monolake presentation later. Thanks for the feedback.

  6. bangnoise Says:

    Hey, really enjoyed this. In a way by diving into Cocoa you’re working with so many *more* options than VDMX gives you, it’s just that you’re forced to have a finished product that’s pretty much a sealed box before you perform. I think that approach is totally the way to go with VDMX or whatever – build the simplest possible setup, then treat it as a finished tool. I really like the way music kit works with simple boxes that plug into each other, each performing one function. I’d love to work with video through a bunch of little steel boxes with just a couple of buttons each, sat on the floor in a tangle of wires.

    As for games controllers, YES! So cheap too.

  7. toby*spark Says:

    yes, yes and no. what works with orbitr is that there is an obvious mapping between engine and interface, and if you don’t quite get it right the iterative design cycle is quick. that breaks down somewhat for pieces that require a sophisticated engine, where it’s a) less obvious what controls you’re going to need at the design stage and b) there’s much more time investment to realise what you need.

    to give a concrete example, orbitr is single screen, full frame. particle is triplehead combining single, double, and triple width sources. so you’ve got collage as well as montage. the easiest (only?) way to rapidly prototype a performance setup to meet the ideas of particle was vdmx+qc, and now that i have something that works, the time investment to orbitr-ise it could easily fall into hubris rather than need. having said that, vade of all people knows how much i want to do that, and to reinforce the point of the post, the process of doing that would be enormously beneficial from a creative standpoint.

  8. scott Says:

    just something that could add to this from mr Eno:

  9. vade Says:

    Scott, lovely addition. Thanks 🙂

  10. Morganb Says:

    Nice write-up Vade. Thanks for sharing a bit of your process. Really nice to see different approaches to these tools and how different artists think about content creation vs software/patch/setup design. I’m really in agreement about the game controller thing. I’ve been using a Logitech gamepad at VJ gigs and it’s great to stand back and not hunch over some controllers. It’s interesting because the audience responds differently: initially they think the VJ is just playing PlayStation, but they can then start to see some of the connections being made. Very effective for opening up the VJ/visualist and audience dialogue. You can just pass around a wireless controller when you want to take a break.
    There’s a whole other discussion on VJ ergonomics, but I’ll leave it at that.

