On-the-fly Codepoetry

Added on by Dario Villanueva.

This Sunday is my first show with my algorithmic music box. I'll be playing at the Wharf Chambers in Leeds, at a show that runs from 17:00 to 21:00:

Organised by yèct, the show explores the relationships between code, poetry and their audiovisual expression:

The first output is the ephemeral act of the recital-session (...), but through the relationship between process and practice we also want to establish a transition from the process to the object (*). That is why we are interested in publishing a tape with the sounds resulting from the session, complemented by the visual results of the recital: the algorithmic images and the codepoetry of each set.

Can code be poetry? Can poetry be code? Can code bridge the gap between poetry and audio?

An incredible lineup includes Yaxu (Alex McLean), whom I'm very excited to meet. I can't wait.

The Waves - Generative Music using Node.js

Added on by Dario Villanueva.

My dear friend Himali Singh Soin recently moved to London. She is currently working on a project that highlights the abundant use of semicolons in Virginia Woolf's "The Waves". In her visual work, she explores wave/particle duality as expressed in literature.

In her video, she uses a metronome to navigate through all the semicolons in the work.

She reached out to me to see if I had any ideas on how we could put a soundtrack to her video. I had the idea of mapping the distances between semicolons onto notes in a scale. One of the problems with generative music is that it usually doesn't sound very musical at all. So, I decided to constrain myself to a scale so that the work would sound as musical as possible, while still having a generative, data-driven basis.

The first step was to analyse the base material. Thankfully, a text file of the work is available from Project Gutenberg, which meant that processing the text was very easy. I created a Node.js parser that essentially splits the work into a list of passages delimited by semicolons. Then I took the size of these passages and created a JSON file with some metadata.
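
To give an idea of what that step could look like, here is a minimal sketch of such a parser; the file names and the exact shape of the metadata are my assumptions, not the original script:

// A sketch of the parsing step: split the Gutenberg text on semicolons
// and record each passage's word count in a JSON file.
// 'the-waves.txt' and 'segments.json' are hypothetical file names.
var fs = require('fs');

var text = fs.readFileSync('the-waves.txt', 'utf8');

var segments = text.split(';').map(function (passage, index) {
  return {
    index: index,
    numberOfWords: passage.trim().split(/\s+/).filter(Boolean).length
  };
});

fs.writeFileSync('segments.json',
  JSON.stringify({ source: 'The Waves', segments: segments }, null, 2));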

Once that was created, I moved on to creating a MIDI instrument with Node.js. I have experience creating music with Ableton Live, but I can't afford Max for Live. Furthermore, I'm not that comfortable with GUI programming languages. So, I created a rudimentary MIDI controller in Node.js, using Justin Latimer's node-midi package. It creates a MIDI input and output, and is controlled by Ableton, so you can play, stop, etc. Unfortunately, the MIDI messages don't carry the BPM set in Ableton, so that was a bit hard to work with - I might want to use OSC for that. Anyway.
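
As a rough illustration of that transport handling (not the original controller, and the port number is an assumption), node-midi lets you open an input port and react to the Start and Stop messages Ableton sends when you press play:

var midi = require('midi');

var input = new midi.Input();
// Don't filter out system realtime messages such as MIDI clock
input.ignoreTypes(true, false, true);
// Assumes Ableton's MIDI output shows up as port 0
input.openPort(0);

input.on('message', function (deltaTime, message) {
  if (message[0] === 0xFA) console.log('play pressed'); // MIDI Start
  if (message[0] === 0xFC) console.log('stop pressed'); // MIDI Stop
});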

The C-Dorian scale. It sounds real nice.

When you press play in Ableton, the script starts iterating through the segments in the JSON file and maps each segment's word count onto a note in the scale:

// Octaves generally go from -2 to 8
var octave = ((line.numberOfWords / scale.length | 0) % 10) - 2;
// scale is an array of note names
// notes is a map of note names to midi codes (on scale 0)
var noteName = scale[line.numberOfWords % scale.length];
// there are 12 semitones between octaves, so the octave transpose
// is 12
var note = (notes[noteName] + (octave * 12));
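
For reference, here is one way the scale and notes structures used above could be defined; the exact values are my assumption, using the common convention where MIDI note 0 is C-2:

// C Dorian: C, D, Eb, F, G, A, Bb
var scale = ['C', 'D', 'Eb', 'F', 'G', 'A', 'Bb'];
// MIDI codes at octave 0, assuming note 0 is C-2 (so C0 = 24)
var notes = { 'C': 24, 'D': 26, 'Eb': 27, 'F': 29, 'G': 31, 'A': 33, 'Bb': 34 };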

Each note is then triggered as a pair of noteOn and noteOff messages, with a variable duration between them. Ableton then receives the MIDI messages and uses them to drive an instrument. This produces quite a musical output, ranging from bassy notes to high-pitched accents.
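
Sending the notes themselves is straightforward with node-midi's output side. Here is a minimal sketch of what that could look like; the virtual port name, channel, velocity and duration handling are my assumptions:

var midi = require('midi');

var output = new midi.Output();
// Shows up in Ableton as a MIDI input (virtual ports work on macOS and
// Linux; on Windows you would open an existing port instead)
output.openVirtualPort('The Waves');

function playNote(note, durationMs) {
  output.sendMessage([0x90, note, 100]); // noteOn, channel 1, velocity 100
  setTimeout(function () {
    output.sendMessage([0x80, note, 0]); // noteOff after the duration
  }, durationMs);
}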

The output of the MIDI controller, with the semicolons mapped to notes and MIDI codes. Note the wide variety of pitches.

The video was also edited in Ableton, with some tempo fixes and warping to allow for the mechanical timing of the Node script. You can watch the video here. I recommend you wear headphones.

I've also uploaded just the audio in case you want to listen to it that way:

Queuing Simulator

Added on by Dario Villanueva.

If you got here, you've successfully queued for a long time (or found the link at the bottom of the page). I'm sad to say that the queue was, however, a fabrication - there was no real "queue", just a digital barrier between you and the content of this website.

Since it's #CyberMonday, I decided to create a queueing simulator to reproduce the experience found on the Currys website:

I hope you have enjoyed the rush of emotions that thousands have felt throughout this weekend: not only the bouts of compulsive consumerism and herd behaviour experienced in the temples of capitalism, but also authentic feelings in the digital realm.

You can queue all you want here.

The Landscape Of Things

Added on by Dario Villanueva.

The internet of things is becoming part of our everyday landscape. In this series of collages, I'm exploring a world in which connected devices have conquered our lives and become part of that landscape.