One of the reasons you’ve been seeing posts from me lately about the graphic arts programs Inkscape and Processing is that I’m in the planning stages of a multi-movement, electroacoustic, multimedia work that I will write for a flute quartet based in Rīga (and possibly a second group in Göteborg). In any case, I chose as my inspirational starting point the subject of Emergence, the study of how complexity arises in various kinds of systems.
I’ve gotten hold of various books on subtopics of the subject: Steven Johnson’s Mind Wide Open: Your Brain and the Neuroscience of Everyday Life; Scientific American’s collection of articles, Understanding Artificial Intelligence; and James Surowiecki’s The Wisdom of Crowds, which I first heard about on the podcast of one of my favorite radio programs, WNYC’s Radiolab.
One of the movements I’m planning will involve projection of an animated, graphic ’score’ that will be realized/performed by the audience in real-time, accompanied by electronics and the flute quartet. I’ve put myself on the learning curves of both Inkscape and Processing in order to prepare those scores. I’ll talk about my plans for that in another post.
Along the lines of artificial intelligence, I thought I’d try to survey what’s happening currently with computer-assisted (or computer-generated) composition, whether algorithmic or not. If I had to define the kind of activity going on in this regard right now, I’d break it down into two categories, each with sub-categories: those that require knowing or learning code (such as Lisp; see, for example, Peter Seibel’s Practical Common Lisp, also available at Amazon) and those that are principally driven through a GUI (Graphical User Interface). The subcategories for each of those are FLOSS or FOSS (Free/Libre Open Source Software) vs. commercial.
I want to talk about my experiences, early impressions, difficulties, or whatever else comes up:
1.) because it will help me process my own thoughts;
2.) if I overcome some technical hurdle (and boy, do they seem to have a way of persistently appearing) I might as well share my solution to save the next poor soul some time; and
3.) to the extent that it’s offered, receive the wisdom and/or expertise of anyone who comes upon what I’m writing and wants to share.
So that brings me to Tiction, a quite beautiful, freeware “nodal music sequencer,” created by Hans Kuder with Processing. I downloaded the program and followed the brief instructions at the website. Tiction doesn’t generate sound on its own, so it needs to be connected to an external MIDI keyboard or an internal software synth.
There are basically three menus in Tiction:
1.) The Help menu, which is basically a list of keyboard shortcuts for setting up the nodal network, N to create a node, C to connect it to the next one, etc. It’s very straightforward.
2.) The Options menu, which allows you to choose 16 specific pitches by their corresponding MIDI note numbers (the default setting is a C major scale/diatonic collection), as well as the MIDI In/Out connections, sync parameters, the ‘bar brightness’, and ‘do physical actions on trigger’.
3.) The Edit menu (reached by selecting a node and typing E), which allows you to select specific parameters for the highlighted node, including MIDI channel, physical actions (such as jiggle, attract, repel), and velocity, among other things.
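To give a feel for what that Options pitch list looks like, here is a minimal sketch (my own illustration, not Tiction’s actual code) of generating 16 MIDI note numbers for a C major diatonic collection, the kind of list the menu expects. Middle C is MIDI note 60, and a major scale repeats the semitone step pattern W-W-H-W-W-W-H.

```python
# Hypothetical sketch: build 16 MIDI note numbers for a C major
# (diatonic) collection starting from middle C (MIDI note 60).
MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]  # semitones between scale degrees

def diatonic_collection(root=60, count=16):
    notes = [root]
    i = 0
    while len(notes) < count:
        notes.append(notes[-1] + MAJOR_STEPS[i % 7])
        i += 1
    return notes

print(diatonic_collection())
# [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79, 81, 83, 84, 86]
```

Swap in a different step pattern (or an arbitrary list of note numbers) and you have the “change the pitch collection” move described below.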
I first connected it to my external MIDI keyboard via my typical Audio MIDI Setup configuration in Mac OS X, selecting it from the Options menu. I created several nodes, connected them, and fired it up. Right away, Tiction made some interesting music, with compelling visuals to go with it. The default behavior dictates that the network of nodes you’ve created drifts around the screen, and depending on where the network drifts along the X/Y axes, both the sounding register and the velocity change. What that means is that the default mode is really rather musical. Set certain nodes to attract or repel, and the activity on the screen and the music generated become more agitated. Change the pitch collection and its potential broadens again.
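I don’t know how Tiction maps screen position to sound internally, but the behavior described above can be sketched like this (all names and ranges here are my assumptions, purely for illustration): the vertical position picks an octave transposition, and the horizontal position scales the MIDI velocity.

```python
# Hypothetical sketch (not Tiction's actual mapping): derive a MIDI note
# and velocity from a node's screen position.
def position_to_midi(x, y, width=800, height=600, base_note=60):
    # Higher on screen (smaller y) -> higher register, roughly -2..+2 octaves.
    octave_offset = int((1 - y / height) * 4) - 2
    note = base_note + 12 * octave_offset
    # Farther right -> louder, clamped to the MIDI velocity range 1..127.
    velocity = int(1 + (x / width) * 126)
    return note, max(1, min(127, velocity))

print(position_to_midi(400, 300))  # center of screen -> (60, 64)
```

A mapping like this is why the drifting network sounds musical by default: register and dynamics follow the motion continuously rather than jumping at random.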
I was so excited, I began thinking that it would be great to look into screencasting software so that I could make a video of Tiction doing its thing and project it for the audience. I would record MIDI into, say, three or four MIDI channels in Logic, add, edit, or modify material as I saw fit, and voilà! One movement done! Since there will be a choreographer and some dancers as part of the project, I thought this would make a perfect accompaniment.
I then wanted to try running Tiction through Apple’s Logic, and here I wound up hitting several hurdles, some that were solvable and some that I haven’t been able to solve yet. First, running Tiction into Logic requires using the IAC (Inter-Application Communication) bus that comes by default with Audio MIDI Setup in OS X. At first it didn’t work. I tried it with Midipipe. Still no. Since Tiction was made with Processing, since Processing compiles to Java, AND since, evidently, there is some lack of support from Apple for Java, I thought the problem might reside within the Java Extensions folder. Looking through the (not particularly current) message board at the Tiction website, I decided to buy Mandolane MIDI SPI, thinking it was a long shot, but since it was cheap, well, okay. And it was. A long shot, that is. Still no. But I was on the right track. It turns out the only extension necessary is mmj (because since OS X 10.4.8 Apple no longer supports some Java MIDI packages). Download mmj and copy both mmj.jar and libmmj.jnilib into /Library/Java/Extensions.
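For anyone following along, that last copy step can be done from Terminal; this assumes you’ve unzipped the mmj download into your Downloads folder (adjust the path to wherever yours landed), and writing to /Library/Java/Extensions requires an admin password:

```shell
# Assumes mmj was unzipped into ~/Downloads/mmj -- adjust to your path.
sudo cp ~/Downloads/mmj/mmj.jar ~/Downloads/mmj/libmmj.jnilib /Library/Java/Extensions/
# Confirm both files are in place:
ls /Library/Java/Extensions
```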
Finally! I got Logic and Tiction talking to each other. But another head (or 3) grew on the hydra:
1.) I can’t set the nodes to play on different MIDI channels. Whenever I hit “E” and edit the MIDI channel number, no matter what number I enter, it always resets itself to channel 1 as soon as I hit “E” again to exit the editor.
2.) I’m having the same “note off” issues that others reported in earlier versions of the software.
3.) I can record MIDI data into Logic from Tiction, but I can’t get their metronomes to sync up. If I select anything other than “Use Internal Clock” in Tiction, it refuses to play for me.
So, it’s not yet necessarily at the deal-breaker stage for me. Though it would be some work, I could still realign the MIDI data to the proper bars and beats to deal with the sync issue. (I don’t know whether there’s some clock drift over time that might make that more complicated than I think.) I could re-orchestrate the MIDI data to whatever channels I want after the fact, though that would be time-consuming, and probably less organic than being able to do it directly from the original. I suppose I could make the MIDI ‘note off’ problem a feature rather than a problem, especially if I choose to involve the flute quartet in some interesting, crunchy way against the held tones. (I could also manually shorten other groups of notes that didn’t turn off.)
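If I do end up cleaning the stuck notes by hand, the logic is simple enough to automate. Here’s a minimal sketch (my own, not part of Tiction or Logic; the event-tuple format is an assumption for illustration) that appends a note-off at the end of a recording for any note-on that never received one:

```python
# Hypothetical cleanup sketch for the stuck-note problem: events are
# (time_in_ticks, kind, note) tuples; any note still sounding at the end
# of the recording gets a note_off appended at end_time.
def close_stuck_notes(events, end_time):
    open_notes = {}
    for t, kind, note in events:
        if kind == "note_on":
            open_notes[note] = t
        elif kind == "note_off":
            open_notes.pop(note, None)
    fixed = list(events)
    for note in sorted(open_notes):
        fixed.append((end_time, "note_off", note))
    return fixed

events = [(0, "note_on", 60), (480, "note_off", 60), (480, "note_on", 64)]
print(close_stuck_notes(events, 960))
# note 64 never ended, so a (960, 'note_off', 64) is appended
```

The same pass could just as easily clamp over-long notes to a maximum duration instead, if I decide the held tones shouldn’t survive at all.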
I posted this issue on the Tiction website. If I get an answer that solves it, I’ll report back. Otherwise, anybody out there already run into and solve this problem?
September 21, update: Problem #1 is solved, with help from Hans Kuder. When changing the MIDI channel in the individual node’s Edit menu, you must use the ENTER key for the change to take effect. The other half of the issue, on the Logic side, is that it is necessary to go to File>Project Settings>Recording and check “Auto Demix by Channel if Multitrack Recording.”
Note Off and Sync issues remain.