My dear friend Himali Singh Soin recently moved to London. She is currently working on a project that highlights the abundant use of semicolons in Virginia Woolf's "The Waves". In her visual work, she explores wave/particle duality as expressed in literature.
She reached out to me to see if I had any ideas for a soundtrack to her video. My idea was to map the distances between semicolons onto notes in a scale. One of the problems with generative music is that it often doesn't sound musical at all, so I decided to constrain myself to a scale: that way the work would sound as musical as possible while still having a generative, data-driven basis.
The first step was to analyse the source material. Thankfully, a text file of the work is available from Project Gutenberg, which made processing the text very easy. I created a Node.js parser that essentially splits the work into a list of the passages that fall between semicolons. I then took the sizes of these passages and created a JSON file with some metadata.
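The parsing step could be sketched roughly like this (a minimal sketch, not the actual parser; the function and field names are my own):

```javascript
// Split the text on semicolons and record the word count of each
// passage - the "size" that later drives the note mapping.
function parseSegments(text) {
  return text.split(';').map((passage, index) => ({
    index,
    // Count whitespace-separated words in the passage.
    numberOfWords: passage.trim().split(/\s+/).filter(Boolean).length,
  }));
}

// Example on a short excerpt of the novel:
const sample = 'The sun had not yet risen; the sea was indistinguishable from the sky';
const segments = parseSegments(sample);
// The full script would then write { segments } out with
// JSON.stringify() alongside some metadata.
```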
Once that was created, I moved on to building a MIDI instrument with Node.js. I have experience making music with Ableton Live, but I can't afford Max for Live, and I'm not that comfortable with GUI programming languages. So I created a rudimentary MIDI controller in Node.js, using Justin Latimer's node-midi package. It creates a MIDI input and output and is controlled by Ableton, so you can play, stop, and so on. Unfortunately, the MIDI messages don't carry the BPM set in Ableton, which made that a bit hard to work with - I might want to use OSC for that. Anyway.
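The transport hookup could look roughly like the sketch below. The handler is plain JavaScript; the commented-out wiring assumes node-midi's API (`new midi.Input()`, `ignoreTypes`, the `'message'` event), and 0xFA/0xFC are the standard MIDI real-time Start and Stop status bytes:

```javascript
// MIDI real-time status bytes for transport control.
const MIDI_START = 0xfa; // sent when Ableton starts playback
const MIDI_STOP = 0xfc;  // sent when Ableton stops playback

let playing = false;

// node-midi delivers each message as (deltaTime, [status, ...data]).
function handleMessage(deltaTime, message) {
  const status = message[0];
  if (status === MIDI_START) playing = true;
  if (status === MIDI_STOP) playing = false;
  return playing;
}

// Wiring it up with node-midi would look roughly like:
// const midi = require('midi');
// const input = new midi.Input();
// input.ignoreTypes(true, false, true); // keep real-time messages
// input.on('message', handleMessage);
// input.openPort(0);
```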
When you press play in Ableton, the script starts iterating through the segments in the JSON file and maps the number of words in each segment onto a note in the scale:
// Octaves generally go from -2 to 8
var octave = ((line.numberOfWords / scale.length | 0) % 10) - 2;
// scale is an array of note names
// notes is a map of note names to midi codes (on octave 0)
var noteName = scale[line.numberOfWords % scale.length];
// there are 12 semitones between octaves, so the octave transpose
// is 12
var note = notes[noteName] + (octave * 12);
Each note is then sent as a pair of noteOn and noteOff messages, with a variable duration between them. Ableton receives the MIDI messages and uses them to drive an instrument. This produces quite a musical output, ranging from bassy notes to high-pitched accents.
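The triggering step could be sketched like this (a sketch under my own naming, not the exact script; `send` stands in for node-midi's `output.sendMessage`):

```javascript
// MIDI channel-1 status bytes.
const NOTE_ON = 0x90;
const NOTE_OFF = 0x80;

// Build the raw three-byte noteOn/noteOff messages for one note.
function noteMessages(note, velocity) {
  return {
    on: [NOTE_ON, note, velocity],
    off: [NOTE_OFF, note, 0],
  };
}

// Trigger the note, then release it after a variable duration.
function playNote(send, note, durationMs, velocity) {
  const { on, off } = noteMessages(note, velocity);
  send(on);
  setTimeout(() => send(off), durationMs);
}

// e.g. playNote(msg => output.sendMessage(msg), 60, 250, 100);
```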
The video was also edited in Ableton, with some tempo fixes and warping to account for the mechanical timing of the Node script. You can watch the video here. I recommend wearing headphones.
I've also uploaded just the audio in case you want to listen to it that way: