Archive for September, 2009

Yesterday I upgraded the OS on my MBP to Snow (Slow) Leopard, Mac OS X 10.6.1, and ran into a problem that seems to have been pervasive among early (earlier) upgraders: the system’s performance slowed to a pathetic crawl. I did the usual first steps: repaired permissions and verified the hard disk (using Disk Utility) and rebooted. Nothing. Reset the PRAM. No improvement. Fired up the Console and hunted down several files that were causing problems for various (and, as it turns out, unrelated) reasons. One of those worth mentioning, however, is Pro Tools. I’ve got Pro Tools LE 7 and am not using it much any more, as I’ve opted for Logic Studio 9 as my DAW, but I’m still miffed to learn that Digidesign has no plans to provide a Snow Leopard update for LE 7, forcing all users to upgrade to Pro Tools 8. That’s a little sleazy.


Anyway, I fired up Activity Monitor (found in /Applications/Utilities) and saw that _coreaudiod was using 2.5 to 3 GB of the 4 GB of available RAM (NOT normal), which explains the extremely sluggish behavior. (Here’s what my Activity Monitor looks like now.) Rather than bore you with the details of the dozen things I tried along the way to finding the solution (for me, anyway), I’ll get right to the two fixes that seem to have solved the problem for most people who had the same Core Audio issue (as I imagine many of the Mac users here will also have).

Solution One. Move the Core Audio launch daemon’s plist out of the way, either manually or from Terminal: sudo mv /System/Library/LaunchDaemons/com.apple.audio.coreaudiod.plist ~/

Solution Two. (Worked for me.) Delete the folder /System/Library/Preferences/Audio.

There were no side-effects, and the OS is indeed faster than before (or it seems that way, at least after several snail-hours).

Comments No Comments »

Continuing my look at generative music software, here’s Nodal, also free for now (it’s currently on a time limit that expires on October 30, at which point an upgrade will be released for a reasonable $25), which, like Tiction, generates MIDI pitches derived from user-placed (and defined) nodes. Where Tiction uses the distance between connected nodes (indicated by a 1, 2, 3, and so on, at the connector’s midpoint) to determine how many ticks of the MIDI clock will pass between soundings of adjacent nodes, Nodal uses a graph to accomplish this.
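Since both programs boil down to the same core mechanism – nodes that hold pitches, connections that hold delays, and a pulse that walks the network – here’s a toy Processing sketch of the idea. To be clear, this is my own bare-bones illustration, not code from Tiction or Nodal; it just steps a pulse around four hard-coded nodes and prints its “note-ons” to the console instead of sending real MIDI.

// Toy nodal sequencer sketch (mine, not Tiction's or Nodal's)
// Each node holds a MIDI pitch; each connection holds a delay in clock ticks.

int[] pitches = {60, 64, 67, 72};   // one pitch per node: C, E, G, C
int[] ticksToNext = {2, 1, 3, 2};   // length of each connection, in ticks
int current = 0;                    // the node about to sound
int countdown = 0;                  // ticks left before the pulse arrives

void setup() {
  size(300, 100);
  frameRate(4);                     // treat each frame as one clock tick
}

void draw() {
  background(0);
  countdown--;
  if (countdown <= 0) {
    println("note on: " + pitches[current]);      // stand-in for a MIDI note-on
    countdown = ticksToNext[current];             // wait this many ticks
    current = (current + 1) % pitches.length;     // follow the connection
  }
  // crude visual: light up the node that sounded most recently
  int last = (current + pitches.length - 1) % pitches.length;
  for (int i = 0; i < pitches.length; i++) {
    fill(i == last ? 255 : 80);
    ellipse(60 + i * 60, 50, 30, 30);
  }
}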

Tiction allows you to determine the “physical” behavior (jiggle, attract, repel, plus another feature called “weight”) and pitch behavior of each node. The node can draw its pitch from a set of 16 pre-defined, user-selectable pitches, where it makes its selection according to where it’s drifting on the X/Y axis, OR you can assign that node a single, fixed pitch from inside or outside the set of 16.
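The attract/repel behaviors are easy to picture with a quick Processing doodle – again, my own rough sketch of the general physics idea (with a made-up weight constant), not Hans Kuder’s code: two nodes pull toward or push away from each other, and any key press flips the behavior.

// A rough sketch of the attract/repel idea (not Tiction's actual code)

float x1 = 120, y1 = 150, x2 = 280, y2 = 150;   // two node positions
float weight = 2.0;       // made-up strength constant
boolean attract = true;   // press any key to flip between attract and repel

void setup() {
  size(400, 300);
}

void draw() {
  background(255);
  float dx = x2 - x1;
  float dy = y2 - y1;
  float d = max(dist(x1, y1, x2, y2), 20);   // keep a minimum distance
  float f = weight * 50 / d;                 // closer nodes act more strongly
  if (!attract) f = -f;                      // flip the sign to repel
  x1 += dx / d * f;  y1 += dy / d * f;       // move each node along the line
  x2 -= dx / d * f;  y2 -= dy / d * f;       // joining them
  stroke(0);
  line(x1, y1, x2, y2);
  fill(0);
  ellipse(x1, y1, 20, 20);
  ellipse(x2, y2, 20, 20);
}

void keyPressed() {
  attract = !attract;
}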

With Nodal, on the other hand, the nodes are immobile. That doesn’t mean you are forced to live with the graph – I took the screenshot so you can see one of the examples that comes with the program (along with several other provocative examples), a beautiful miniature called Nebula that is definitely free of the grid. Nested beside Nebula in the screenshot is the result of my following the tutorial. With Nodal, there’s more flexibility and possibility in pitch choices: each node can be assigned its own pitch list, or a pitch relative to another node’s using +/- n. Just for the record, here’s a short MP3 example of the result of the tutorial – not bad for 6 little nodes:
Nodal Tutorial

Despite their similarities – and while Tiction has the attractive feature of being hypnotically beautiful and is something you can fire up and make music with in minutes (my own technical synchronization/MIDI note-off problems with Tiction aside; I also couldn’t get Tiction to sync with Reason 4.0, whereas I had no such problems with Nodal, which comes with a dedicated MIDI port built in, making IAC unnecessary, I think) – Nodal ultimately presents itself as something else, another way of composing. One of the downloadable PDFs at the Nodal website reads a little like a manifesto at times (but hey, then call me товарищ – comrade):

“The goal of any Artificial Life (AL) or generative composition system should be to offer possibilities and results unattainable with other methods. A number of authors have suggested that the emergence of novel and appropriate macro behaviours and phenomena – arising through the interaction of micro components specified in the system – is the key to achieving this goal. While simple emergence has been demonstrated in a number of AL systems, in the case of musical composition, many systems restrict the ability to control and direct the structure of the composition, conceding instead to the establishment of emergence as the primary goal.

This focus on emergence exclusively, while interesting in terms of the emergent phenomena themselves, has been at the expense of more useful software systems for composition itself. The aim of the work described in this paper is to design and build a generative composition tool that exhibits complex emergent behaviour, but at the same time offers the composer the ability to structure and control processes in a compositional sense. The idea being that the composer works intuitively in a synergetic relationship with the software, achieved through a unique visual mapping between process construction and compositional representation.”

Comments No Comments »

One of the reasons you’ve been seeing posts from me lately about the graphic arts software programs Inkscape and Processing is that I’m in the planning stages of a multi-movement, electroacoustic, multimedia work that I will write for a flute quartet based in Rīga (and possibly a second group in Göteborg). In any case, I chose as my inspirational starting point the subject of Emergence, the study of how complexity arises in various kinds of systems.

I’ve gotten hold of various books on subtopics of the subject, such as Steven Johnson’s Mind Wide Open: Your Brain and the Neuroscience of Everyday Life, Scientific American’s collection of articles Understanding Artificial Intelligence, and James Surowiecki’s The Wisdom of Crowds, which I first heard about while listening to the podcast of one of my favorite radio programs, WNYC’s Radiolab.

One of the movements I’m planning will involve projection of an animated, graphic ’score’ that will be realized/performed by the audience in real-time, accompanied by electronics and the flute quartet. I’ve put myself on the learning curves of both Inkscape and Processing in order to prepare those scores. I’ll talk about my plans for that in another post.

Along the lines of artificial intelligence, I thought I’d try to survey what’s happening with computer-assisted (or computer-generated) composition currently, whether algorithmic or not. If I had to define the kind of activity going on in this regard right now, I’d break it down into two categories, each with sub-categories: those that require knowing or learning code (such as Lisp – see, for example, Peter Seibel’s Practical Common Lisp, also available at Amazon) and those that are principally driven through a GUI (Graphical User Interface). The subcategories for each of those are FLOSS or FOSS (Free/Libre Open Source Software) vs. commercial.

I want to talk about my experiences, early impressions, difficulties, or whatever else comes up:
1.) because it will help me process my own thoughts;
2.) if I overcome some technical hurdle (and boy, do they seem to have a way of persistently appearing) I might as well share my solution to save the next poor soul some time; and
3.) to the extent that it’s offered, receive the wisdom and/or expertise of anyone who comes upon what I’m writing and wants to share.

So that brings me to Tiction, a quite beautiful freeware “nodal music sequencer” created by Hans Kuder with Processing. I downloaded the program and followed the brief instructions at the website. Tiction doesn’t generate sound on its own, so it needs to be connected to an external MIDI keyboard or an internal software synth.

There are basically three menus in Tiction:
1.) The Help menu, which is basically a list of keyboard shortcuts for setting up the nodal network: N to create a node, C to connect it to the next one, and so on. It’s very straightforward.
2.) The Options menu, which lets you choose 16 specific pitches by their corresponding MIDI note numbers (the default is a C major scale/diatonic collection), along with the MIDI In/Out connections, sync parameters, the ‘bar brightness,’ and ‘do physical actions on trigger.’
3.) The Edit menu (reached by selecting a node and typing E), which allows you to set specific parameters for the highlighted node, including MIDI channel, physical actions (such as jiggle, attract, repel), and velocity, among other things.

I first connected it to my external MIDI keyboard via my usual Audio MIDI Setup configuration in Mac OS X, selecting it from the Options menu. I created several nodes, connected them, and fired it up. Right away, Tiction made some interesting music, with compelling visuals to go with it. By default, the network of nodes you’ve created drifts around the screen, and where the network drifts along the X/Y axis affects both the register that is sounded and the velocity. What that means is that the default mode is really rather musical. Set certain nodes to attract or repel, and the activity on the screen and the music generated become more agitated. Change the pitch collection and its potential broadens again.
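If you want a feel for how that X/Y mapping works without firing up Tiction, here’s a little Processing sketch of the principle – my own approximation, not Tiction’s code: a single node drifts and jiggles around the window, its vertical position picks one of 16 pitches (a C major collection, echoing Tiction’s default), and its horizontal position scales the velocity, printed to the console in place of real MIDI.

// My own approximation of the "position picks the pitch" idea (not Tiction's code)

int[] pitchSet = {48, 50, 52, 53, 55, 57, 59, 60,
                  62, 64, 65, 67, 69, 71, 72, 74};   // 16 pitches, C major
float x, y, dx, dy;

void setup() {
  size(400, 400);
  x = width / 2;
  y = height / 2;
  dx = random(-2, 2);
  dy = random(-2, 2);
  frameRate(8);   // one "sounding" per frame, slow enough to watch the console
}

void draw() {
  background(255);
  // drift and jiggle the node, keeping it inside the window
  x = constrain(x + dx + random(-1, 1), 0, width);
  y = constrain(y + dy + random(-1, 1), 0, height);
  if (x <= 0 || x >= width)  dx = -dx;
  if (y <= 0 || y >= height) dy = -dy;

  int index = int(map(y, height, 0, 0, 15.99));    // low on screen = low pitch
  int velocity = int(map(x, 0, width, 40, 120));   // left = soft, right = loud
  println("pitch " + pitchSet[index] + ", velocity " + velocity);

  fill(0);
  ellipse(x, y, 20, 20);
}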

I was so excited, I began thinking that it would be great to look into screencasting software so that I could make a video of Tiction doing its thing and project it for the audience. I would record MIDI into, say, three or four MIDI channels in Logic, add, edit, or modify material as I saw fit, and voilà! One movement done! Since there will be a choreographer and some dancers as part of the project, I thought this would make a perfect accompaniment.

I then wanted to try running Tiction through Apple’s Logic, and here I wound up hitting several hurdles, some that were solvable and some that I haven’t been able to solve yet. First, running Tiction into Logic requires using the IAC (Inter-Application Communication) Bus that comes by default with Audio MIDI Setup in OS X. At first it didn’t work. I tried it with MidiPipe. Still no. Since Tiction was made with Processing, and since Processing sketches are converted into Java, AND since, evidently, there is some lack of support from Apple for Java, I thought the problem might reside within the Java extensions folder. Looking through the (not particularly current) message board at the Tiction website, I decided to buy Mandolane MIDI SPI, thinking it was a long shot, but since it was cheap, well, okay. And it was. A long shot, that is. Still no. But I was on the right track. It turns out the only extension necessary is mmj (because since OS X 10.4.8 Apple no longer supports some Java MIDI packages). Download mmj and copy both mmj.jar and libmmj.jnilib into /Library/Java/Extensions.

Finally! I get Logic and Tiction talking to each other. But another head (or 3) grew on the hydra:

1.) I can’t set the nodes to play on different MIDI channels. Whenever I hit “E” and edit the MIDI channel number, no matter what number I enter, it always resets itself to channel 1 as soon as I hit “E” again to exit the editor.
2.) I’m having the same “note off” issues that others reported in earlier versions of the software.
3.) I can record MIDI data into Logic from Tiction, but I can’t get their metronomes to sync up. If I select anything other than “Use Internal Clock” in Tiction, it refuses to play for me.

So, it’s not necessarily at the deal-breaker stage for me yet. Though it would be some work, I could still realign the MIDI data to the proper bars and beats to deal with the sync issue. (I don’t know whether there’s clock drift over time that might make that more complicated than I think.) I could re-orchestrate the MIDI data to whatever channels I want after the fact, though that would be time-consuming, and probably less organic than being able to do it directly from the original. I suppose I could make the MIDI ‘note off’ problem a feature rather than a problem, especially if I choose to involve the flute quartet in some interesting, crunchy way against the held tones. (I could also manually shorten other groups of notes that didn’t turn off.)

I posted this issue on the Tiction website. If I get an answer that solves it, I’ll report back. Otherwise, anybody out there already run into and solve this problem?
*********
September 21, update: Problem #1 is solved, with help from Hans Kuder. When changing the MIDI channel in the individual node’s Edit menu, you must use the ENTER key for the change to take effect. The other half of the issue, on the Logic side, is that it is necessary to go to File>Project Settings>Recording and check “Auto Demix by Channel if Multitrack Recording.”
Note Off and Sync issues remain.

Comments 2 Comments »

I can’t figure out yet how to make this flying robot loop, or, better, how to let the viewer click the mouse or press a key on the keyboard to restart his flight.

In this one, moving the mouse moves the robot and also changes his eye and tele-belly color. Also, if you push a key on your keyboard (for me, I have to hold it down for a little bit), he has a message for you.
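In case it helps anyone tinkering with the same chapter, here’s a minimal sketch of the loop/restart idea, assuming Processing’s usual event hooks (mousePressed() and keyPressed()) – a stripped-down stand-in, not my actual robot code, and I haven’t managed to fold it into my own sketch yet: the flight lives in a variable, draw() advances and wraps it, a click resets it, and any key toggles a message.

// A minimal loop/restart sketch (a stand-in, not my actual robot code)

float robotX;          // how far the "robot" has flown
boolean showMessage;   // toggled from the keyboard

void setup() {
  size(400, 200);
  textFont(createFont("SansSerif", 16));   // needed before text() can draw
  robotX = 0;
}

void draw() {
  background(255);
  robotX = robotX + 2;         // fly to the right
  if (robotX > width) {
    robotX = -40;              // loop: re-enter from the left edge
  }
  fill(150);
  rect(robotX, 90, 40, 30);    // stand-in for the robot
  if (showMessage) {
    fill(0);
    text("Hello!", 20, 30);
  }
}

void mousePressed() {
  robotX = 0;                  // a click restarts the flight
}

void keyPressed() {
  showMessage = !showMessage;  // any key toggles the robot's message
}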

Comments No Comments »

So, moving on from my robot to mouse interactivity, I thought I’d post my riff on this example from the book I’m using, only to discover that loading Processing “sketches” into WordPress blogs is a difficult affair. Luckily, as usual, one of my many betters has solved the problem and posted his solution. Morten Skogly’s answer to the problem is here. I chose some ugly colors for this, but anyway, if this works in your browser on your computer, I’d love to know about it. Thanks! Oh, and if it’s not obvious, drag your mouse across the green square…

Comments No Comments »

I recently ordered Daniel Shiffman’s Learning Processing: A Beginner’s Guide to Programming Images, Animation, and Interaction for my Kindle. Completely different from using Inkscape, Processing is code-based. One of the early projects in the book is to design a creature/character entirely of rectangles, lines, ellipses, etc., and to figure out how to write the code to bring that creature to life, so to speak. My robot is no match for the Borg or Cyberdyne’s T808 series, alas. The code involved seemed counterintuitive at first, but I got faster with it as time went on. Here’s the robot:
My Processing Robot

And here’s the code:

// My robot

size(400,400);
background(255);
smooth();
ellipseMode(CENTER);
rectMode(CENTER);

// Body
stroke(0);
fill(150);
rect(200,200,75,100);

// Head
fill(155);
rect(200,100,60,60);

// Neck
fill(160);
ellipse(200,140,25,20);

// Eyes
fill(255);
ellipse(185,100,16,25);
ellipse(215,100,16,25);
fill(0);
ellipse(185,105,10,12);
ellipse(215,105,10,12);

//Mouth
line(190,120,210,120);

// Antennae
stroke(0);
line(190,70,180,50);
line(220,50,210,70);

//TV box belly
fill(255);
rect(200,205,35,65);

//Feet
fill(160);
rect(180,263,25,25);
rect(220,263,25,25);
ellipse(180,288,21,21);
ellipse(220,288,21,21);
rect(180,312,25,25);
rect(220,312,25,25);
ellipse(178,330,40,15);
ellipse(222,330,40,15);

//Arms
fill(255);
ellipse(160,165,21,21);
ellipse(240,165,21,21);
fill(160);
rect(165,165,25,25);
rect(235,165,25,25);

Comments No Comments »

The Swedish and EU flags were made from the first two official Inkscape tutorials.

I gave myself the Andy Warhol treatment following this tutorial.

The chrome letter effect was made by following a video tutorial over at YouTube.

Comments No Comments »

I’m new to Inkscape and have been trying a small handful of tutorials I’ve found online. In the middle of a compass-making tutorial, while trying to use “Path to Pattern Effect”, I ran into the following wall:

The fantastic lxml wrapper for libxml2 is required by inkex.py and therefore this extension. Please download and install the latest version from http://cheeseshop.python.org/pypi/lxml/, or install it through your package manager by a command like: sudo apt-get install python-lxml

Confession: I missed the Open Source boat, more or less. I have no taste for code. I consider myself fluent on my Mac, but Java, Python, Terminal – it’s all, well, not exactly Greek to me, but it is rather foreign. At the same time, I see that more and more people are obtaining a certain fluency in these languages, and I also understand that there’s considerable overlap from one to another, so that one language can feel like a dialect of another. One advantage of being an American expat in this regard is that I’ve become adept at listening to a conversation in a foreign language (in my case, Latvian) and picking up the gist of it even though I may lose the details.

As I said, I’m new to Inkscape. I’m working on a multi-movement, electroacoustic multimedia piece, and have decided that for one of the movements I want to try using Inkscape and Processing to create an animated graphic score, projected before the audience, that the musicians will also read.

I spent most of the day working on fixing this problem. There were plenty of forum posts about this issue, but the only one I found that seemed to resolve it suggested an upgrade to the development version of Inkscape (0.47), and that alone did not do the trick for me. So here’s the sum total of everything I did that DID work for me, plus how I would do it now, knowing what I know.

Inkscape. First of all, looking at the contents of the Inkscape application (Control-click on Inkscape.app, choose Show Package Contents, then look in Contents/Resources), I could see that the extensions Inkscape was looking for were not present. I upgraded to the Inkscape development version, 0.47. The necessary extensions were not there either. Strike one.

Python. Maybe since the extension is related to Python (or at least that’s what I thought), I need to upgrade Python? Upgraded to Python 3.1. No change. Clearly, I don’t know what I’m doing.

X11. At some point, somewhere, I read that I should upgrade X11 (necessary for Inkscape to run on a Mac) to XQuartz 2.4.0. No change. Sigh. Hit the desk.

MacPorts. Somewhere along the line, I stumble upon MacPorts. “The MacPorts Project is an open-source community initiative to design an easy-to-use system for compiling, installing, and upgrading either command-line, X11 or Aqua based open-source software on the Mac OS X operating system. To that end we provide the command-line driven MacPorts software package under a BSD License, and through it easy access to thousands of ports that greatly simplify the task of compiling and installing open-source software on your Mac.” Sounds good to me! I install version 1.8 and follow the guide for the first three chapters. I poke around, am able to look at ports, and find the Python lxml and libxml2 ports, but I get an error (I forget now what it was, but I googled it at the time and didn’t feel particularly enlightened) when I try to install them. Back to the guide. By Chapter 4 I feel lost.

Porticus. Again, somewhere along the line I discover Porticus, “a Cocoa GUI (Graphical User Interface) for the MacPorts package manager. MacPorts provides ready to build open-source software packages modified to compile and run on Mac OS X. The MacPorts project provides a TCL command line tool to manage installation, update and activation of the port packages. Porticus provides a GUI front-end to this tool.” Now we’re talking! No code! I try it and find the Python lxml and libxml2 ports, but am still getting an error. For some reason, because I was given the error not in Terminal but in Porticus, I don’t feel so stupid.

Back to macports.org, this time to the FAQ page, because I can’t believe that what they’re calling a guide over there is guiding anyone. There, I read: “You need to install Xcode. Ensure you include both X11SDK and Unix Development. Some ports need newer versions of Xcode than that which ships with the OS, and will fail to install due to that requirement. Xcode is not updated via Software Update, you have to download it manually. To do so, go to http://connect.apple.com/ and log in with your ADC information (the free online account is enough to get access to Xcode). Once you log in, go to Downloads, then select Developer Tools on the right section under Downloads. You can then search for Xcode (there are quite a few versions available, make sure to get the latest for your OS version).”

Sure enough, I do a search on my Mac, and don’t find X11SDK. I head over to Apple Development and download and install Xcode 3.1. If you do the basic install, you get the Unix Development package automatically. I manually installed X11SDK, which is part of the same .dmg file.

I go back to Porticus and try installing the ports again. Voilà! 12 ports show up (the necessary ports and their dependencies). I fire up Inkscape again, and finally, FINALLY! No error message! It worked!

So, if I had to do it again, I would start by going directly to the ADC site and getting Xcode. Download MacPorts if you don’t have it. I realized after the fact that if I had followed the instructions here, I would not have needed Porticus.

This all may become quite moot if you upgrade to Snow Leopard. I don’t know yet. My Snow Leopard disc is in the mail. But if you do, Xcode 3.2 is already at the ADC site for Snow Leopard. In any case, enjoy the stupid compass.

Comments No Comments »

This is not a new debate among us composers, not at all, and like the differences between Apple and PC users, the typical user of either Sibelius or Finale has rarely wanted to bother jumping on the other learning curve after having mastered the program they use. But this has been a year of stepping on learning curves for me.

I’ve been a Finale user for a LONG time, since grad school back in Minnesota. After some failed experiments with SCORE (anyone remember that?) in the early 1990s on an IBM 386 (or those?), I drifted over to Apple and to Finale, and began using Finale for my own work and as a semi-pro copyist while still working on my thesis (around 1994). There is, by the way, a freeware notation program called LilyPond that, at first glance, seems reminiscent of SCORE. Anyway, I’ve gone through many incarnations of Finale. Or perhaps I should say only a few. There have been a woefully small handful of significant upgrades in the decade since they started doing annual upgrades of their software (the introduction of Smart Shapes comes to mind), and I would typically let a year or more go by between upgrades before purchasing one myself. The most recent break was between Finale 2006 and now Finale 2010, for which I completely regret dropping $200 – not to mention the added shipping cost to Latvia (which is a completely different but regularly maddening story). In the four years that passed, they managed to move several menu items or tools to unfamiliar places, and beyond that, I fail to see much difference between the two versions. There are a handful of minor improvements, to be sure, as with the Rehearsal Markings, for example. But nothing that gave me that “cool!” moment.

In fact, one day when I spent quite some time hunting down some tool that had been moved, in frustration I went to Sibelius’ website, and there I had that “cool!” moment. I was impressed with two things in particular: self-adjusting graphic elements and integration with Rewire. I purchased the competitive upgrade of Sibelius 6 for Finale users and await my copy of the software and extra manual as I write (I had to have it shipped to my Dad in North Carolina who will in turn ship it to me).

In Finale, one spends a great deal of time simply moving stuff around. Actually, control over the look of crescendos and decrescendos had been better in earlier versions of Finale. But the incorporation of Rewire into Sibelius was really the deciding factor.

One of the things I had been avoiding for quite some time was engagement with technology. I used Finale, and that was it. The big advances in digital audio were just ramping up as I was just leaving grad school, and most of the work I did in the Analog and Digital sequence at the U of Mn quickly became dated anyway.

But during the past couple of years I have been working up my familiarity with several programs (some with more success than others). So much so that I recently attempted, for the first time, an electroacoustic piece. I used Apple’s Logic 8 Express and Propellerhead’s Reason 4.0 to make the electronic score, and used Finale (without Rewire) to create a notated trumpet part.

The process of composing the trumpet part would have been much simpler if Finale had Rewire, since Reason already slaves easily to Logic. But as it stood, I would:
1.) write some of the trumpet part and notate it in Finale,
2.) save the Finale file as a MIDI file,
3.) import the MIDI file into Reason,
4.) assign a sound to the imported MIDI data, in this case a Combinator trumpet patch,
5.) check how the trumpet part jibed with everything else,
6.) delete the MIDI data in Reason,
7.) make changes to the trumpet part in Finale, and go back to step #2.

Since my next commission is for another electroacoustic piece, this time with flute quartet, I decided to use all the time I would otherwise spend navigating around Finale’s shortcomings to learn Sibelius instead.

Comments 6 Comments »

I haven’t written a proper blog entry in quite some time, for several reasons.

I changed my website to a blog format back in 2005/6 so I could relate my experiences as an American expat living in Latvia, and for about a year or so I was able to write about my experiences here with a kind of virtual (pun intended) anonymity, partially shielded by my writing in English and partially shielded by my outsider status here in Latvia. I wasn’t a known commodity, and was thus below many local radars. My audience (such as it is) was American. Or at least in my mind it was. But over time, I realized that sure enough, people here in Latvia were reading my commentaries. And those that discovered it and could read English were of course willing to translate for others if it seemed appropriate. I said nothing particularly scathing (well, maybe a little, but never personal), but I was still trying to write honestly. As time passed and I’d had good professional experiences as well as bad ones, I began to feel constrained against writing about the warts.

Here’s an SAT style analogy for you: The New Music community in New York is to archipelago as The New Music community in Latvia is to Melrose Place.

Reason number two: Since I moved here, I got remarried and had two kids. Nothing takes time away from things that feel even a little bit peripheral quite like that.

Anyway, I say all that to say this: I have a big project involving lots of tech stuff in the works, where I’d like to open up the process a little bit and also benefit from the experience of others who may care to share their advice or guidance. I’ll leave the actual beginning of that topic to the next entry. Until then.

Comments 1 Comment »