Generative Notation and Score with Tone.js

The concept: Notation System for Generative Music

For a long time, I have been trying to find the bridge between generative music, which usually relies on synthesized sound, and acoustic (or amplified) music playing and composition.

It felt like creating a notation system for generative music could be the right place to start.

MIDI

The effect that the invention of the MIDI protocol had on the evolution of generative (synthesized) music is well known and far beyond the scope of this blog post.

Since the MIDI protocol includes information about the notes that should be played and the way these notes should be played, it served as an inspiration behind my notation system for generative music.

My Generative Music Notation System

The notation system includes two main components:

Legend

The legend is a sheet that defines all possible notes (aka “objects”) that could be played during the generative score. Each note includes the following properties (a small sketch of a single entry follows the list):

  • Color – A unique hex color code that will be used for visualization purposes to represent the note.
  • Frequency – The frequency of the note. Can be presented in Hz (e.g. 440) or letters (e.g. A#3).
  • Amplitude – Volume / velocity, in a range between 0 and 1 (loudest).
  • Duration – The duration of the note, including an envelope if needed.
  • Loops – The number of times the note should be repeated in a row on the score.
  • Connected_notes – This is the main difference from the MIDI protocol. The connected_notes property will hold a list of notes that should be played with or after this note. Each item on the list, which refers to a connected note, should include the following properties:
    • Color/index number of the connected note according to the legend.
    • The time at which the connected note should be initiated, including maximum and minimum values for silence after the initial timestamp (e.g. if the connected note should be played after the original note, the time will be <the_original_note_duration>+<silence_if_any>).
    • A probability value that represents the chance that the connected note will be played. The probability values of all connected notes together should not exceed 1 (== a 100% chance).
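To make this concrete, here is what a single legend entry could look like as a plain JavaScript object (a minimal sketch; the values are invented, and the connected_notes layout mirrors the demo code further down) –

// one legend entry (illustrative values only)
var exampleNote = {
  color: '7D82B8',        // unique hex color, used for visualization
  frequency: 'A#3',       // or a value in Hz, e.g. 466.16
  amplitude: 0.8,         // 0 to 1, where 1 is the loudest
  duration: 1.5,          // seconds (an envelope could be attached as well)
  loops: 0,               // how many times the note repeats in a row
  connected_notes: [
    // [legend index, time offset, max silence, min silence, probability]
    [2, 1.5, 0.2, 0.05, 0.6],
    [4, 0.0, 0.0, 0.0, 0.3]
  ]
};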
Generative Music Notation: Legend
Generative Music Notation: Potential Score

What’s Missing?

Two major properties are missing from the note objects:

  • Instrument (or timbre) – The note object is a set of instructions that could be applied by any instrument. Since I believe that the process of generating music will include the usage of computers (or digital devices), the score can be played with a variety of instruments. The decision about the sound of the piece is left in the hands of the performer.
  • Timing – Again, since the note object is a set of instructions, these instructions can be initiated and applied at any time during the score, by the performer or by the score itself. The decisions about timing also remain in the hands of the performer. The only timed notes are the connected notes, which hold instructions specifying whether the note should be initiated with the original note, after it, during it, etc.

Example

For example, if we use the legend above and start the score with the first two notes (7D82B8 & B7E3CC), we get the following result –

Demo

Using Tone.js, I was able to experiment with generating music based on the legend and score shown above.

The project can be seen here – http://www.projects.drorayalon.com/flickering/.

The current limitations of this demo are:

  • No instrumentation: All notes are played using the same instrument.
  • No dynamics: One of the most likable elements of a musical performance is the dynamics and tension the performer creates while playing the piece. The current implementation doesn’t support any dynamics :\
  • No probability: The current implementation presents a linear and predictable score. Each note has only one connected note, and no code was written to support the probability factor, which would utilize the notation system to its full potential and (in my opinion) make the generative music more interesting. A sketch of how it could work follows this list.
  • Low-tech visualization: The notation system I described above sets up the foundation for a readable visual representation of the score. This visual representation has not been implemented yet.
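Here is that sketch: a rough illustration of how the probability factor could work, assuming each connected note is stored as an [index, time offset, max silence, min silence, probability] entry (the helper below is hypothetical and not part of the demo code) –

// pick at most one connected note, weighted by the probability values.
// the probabilities should sum to at most 1; if the roll lands beyond
// their sum, no connected note is played (silence).
function pickConnectedNote(connectedNotes) {
  var roll = Math.random();
  var cumulative = 0;
  for (var i = 0; i < connectedNotes.length; i++) {
    cumulative += connectedNotes[i][4]; // probability is the 5th element
    if (roll < cumulative) {
      return connectedNotes[i];
    }
  }
  return null; // no connected note this time
}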

Some Code. Why Not

This is the code I’m using to run the demo shown above –

//-----------------------------
// play / stop procedures
//-----------------------------
var playState = false;

$("body").click(function() {
  if (playState === false) {
    play();
  } else {
    stop();
  }
});

function play(){
  playState = true;
  $("#click").html("i told you. it is now flickering really badclick anywhere to stop");
  console.log('playing...');
  Tone.Transport.schedule(function(time){
  	noteArray[0].trigger(time);
  }, 0.1);
  Tone.Transport.schedule(function(time){
  	noteArray[1].trigger(time);
  }, 0.4);

  // Tone.Transport.loopEnd = '1m';
  // Tone.Transport.loop = true;

  Tone.Transport.start('+0.1');
  setTimeout(backColorSignal, 100);
}

function stop(){
  playState = false;
  $("#click").html("it is probably still flicker really bad, but it will stop eventuallyclick anywhere to keep it going");
  console.log('stopping...!');
  console.log(Tone.Transport.seconds);
  Tone.Transport.stop();
  Tone.Transport.cancel(0);
}

//-----------------------------
// creating an array of note objects (noteArray)
//-----------------------------

// array of manually added notes
var noteArray = [];

// note constructor
function noteObject(index, color, frequency, amplitude, duration, loops, connected_notes_arry) {
  this.index = index;
  this.color = color;
  this.frequency = frequency;
  this.amplitude = amplitude;
  this.duration = duration;
  this.loops = loops;
  this.connected_notes = connected_notes_arry;
  this.trigger = function(time, index=this.index, frequency=this.frequency, duration=this.duration, connected=this.connected_notes){
    // console.log('time: ' + time);
    // console.log('index: ' + index);
    console.log('');
    console.log('------------');
    console.log('it is ' + Tone.Transport.seconds);
    console.log('playing: ' + index);
    console.log('frequency: ' + frequency);
    console.log('duration: ' + duration);

    synthArray[index].triggerAttackRelease(frequency, duration, time);

    if (connected !== null) {
      // connected = [next note index, time offset, max silence, min silence, probability]
      var nextIndex = connected[0];
      // next trigger time = now + time offset + a random silence between the min and max values
      var nextTime = 0.01 + Tone.Transport.seconds + connected[1] + parseFloat((Math.random() * (connected[2] - connected[3]) + connected[3]).toFixed(4));
      console.log('generated: ' + nextIndex);
      console.log('at: ' + nextTime);
      Tone.Transport.schedule(function(time){
        noteArray[nextIndex].trigger(time);
      }, nextTime);
    }
  };
}

// starting notes
noteArray.push(new noteObject(0, '7D82B8', 'c3', 1, 1.520*5, 0, [2,1.520*5,0.020*5,0.020*5,0.9]));
noteArray.push(new noteObject(1, 'B7E3CC', 'e2', 1, 6.880*5, 0, null));

// the rest of the notes
noteArray.push(new noteObject(2, 'C4FFB2', 'b2', 1, 1.680*5, 0, [3,1.520*5,0.40,0.80,1]));
noteArray.push(new noteObject(3, 'D6F7A3', 'c#2', 1, 3.640*5, 0, [4,0,0.8,1,1]));
noteArray.push(new noteObject(4, 'ADD66D', 'b2', 1, 0.650*10, 0, [5,0.650*10,0.2,0.2,1]));
noteArray.push(new noteObject(5, 'A4FF7B', 'a2', 1, 1.800*5, 0, [6,0,0,0,1]));
noteArray.push(new noteObject(6, '7BFFD2', 'f#2', 0.2, 1.800*5, 0, [0, 1.800*5, 1, 2, 1]));


//-----------------------------
// creating an array of synth objects (synthArray), based on note objects (noteArray)
//-----------------------------

var synthArray = [];

for (var i=0;i<noteArray.length;i++){
  var options = {
    vibratoAmount:1,
    vibratoRate:5,
    harmonicity:4,
    voice0:{
      volume:-30,
      portamento:0,
      oscillator:{
        type:"sine"
      },
      filterEnvelope:{
        attack:0.01,
        decay:0,
        sustain:0.5,
        release:1,
      },
      envelope:{
        attack:0.1,
        decay:0,
        sustain:0.5,
        release:1,
      },
    },
  voice1:{
    volume:-30,
    portamento:0,
    oscillator:{
      type:"sine"
    },
    filterEnvelope:{
      attack:0.01,
      decay:0,
      sustain:1,
      release:0.5,
    },
    envelope:{
      attack:0.01,
      decay:0,
      sustain:0.5,
      release:1,
    }
  }
  };
  synthArray.push(new Tone.DuoSynth(options).toMaster());
}

//-----------------------------
// low-tech visualization
//-----------------------------
var b = new Tone.Meter("signal");
synthArray[1].connect(b);
// synthArray[2].connect(b);

function backColorSignal(){
  if (b.value === 0){
    setTimeout(backColorBlue, 100);
  } else {
    var color = "rgba(0, 0, 255," + b.value + ")";
    $("html").css("background-color", color);
    setTimeout(backColorSignal, 100);
    // console.log('b.value: ' + b.value + " " + color);
  }
}

function backColorBlue(){
  var color = "rgba(0, 0, 255,1)";
  $("html").css("background-color", color);
  setTimeout(backColorSignal, 100);
}

 

NOMNOM 2: The Video Machine – The Programming Behind the Project

Credit: This project was developed together with Mint. Thank you :))

For my ICM final, I worked on an improved version of my mid-term pcomp project.

This time the computational challenges were even greater.
Here is the outcome after long weeks of intensive coding –

NomNom: The Video Machine

NOMNOM’s github repository can be found here – https://github.com/dodiku/the_video_machine_v2

Synching the videos

As a conclusion from the mid-term project, we wanted to give users the ability to play cohesive music. In order to do that, we knew we had to find a way to make sure that all the videos are played in sync (automatically).

There are many ways to make sure that audio is played synchronously, but none of them deal with videos. To work around that, we repurposed two objects from the p5.js sound library — Phrase and Part.
We used them to handle our playback as a loop made of bars. We can call a callback function at any point in the loop, and therefore we can use them to time our play and stop functions (and many others) based on the user's actions.


/*********************************************
SETUP FUNCTION (P5.JS)
*********************************************/
function setup() {
  noCanvas();

  // setting up serial communication
  serial = new p5.SerialPort();
  serial.on('connected', serverConnected);
  serial.on('open', portOpen);
  serial.on('data', serialEvent);
  serial.on('error', serialError);
  serial.list();
  serial.open(portName);

  // creating a new 'part' object (http://p5js.org/reference/#/p5.Part)
  allVideosPart = new p5.Part();
  allVideosPart.setBPM(56.5);

  // adding general phrase (http://p5js.org/reference/#/p5.Phrase) to the 'part'
  var generalSequence = [1,0,0,0, 0,0,0,0, 1,0,0,0, 0,0,0,0, 1,0,0,0, 0,0,0,0, 1,0,0,0, 0,0,0,0];
  generalPhrase = new p5.Phrase('general', countSteps, generalSequence);
  allVideosPart.addPhrase(generalPhrase);

  for (var i = 0; i<16; i++){
    allVideosPart.addPhrase(new p5.Phrase(i, videoSteps, [0,0,0,0, 0,0,0,0, 0,0,0,0, 0,0,0,0, 0,0,0,0, 0,0,0,0, 0,0,0,0, 0,0,0,0]));
  }

  // console.log(allVideosPart);
  allVideosPart.loop();

}

In the setup function, we initiate the Part, a Phrase per video, and a general Phrase that is used as a clock.

The ‘countSteps’ callback function is used to store the current step in a global variable, and the ‘videoSteps’ callback function is used to play and stop each video at the right time.

First success with the beat-sync feature – 

Improving the UI

We really wanted to make it easier for users to understand what is going on on the screen, and to provide a better sense of control over the videos.

To achieve that, we used the NexusUI JS library and added four graphical elements to every video, each of which indicates a different property of the video (number of repetitions, volume, speed, and trim).

The graphical elements are shown to the user only when the video is being played.

Also, we added a grayscale CSS filter to videos that are not being played. This way, it is easier for the user to focus on the videos that are playing and making sound.
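The grayscale effect itself is just the CSS filter property. A minimal sketch with jQuery (the '#video3' id is only an example of the ids used elsewhere in the project) –

// gray out a video that is not playing, and restore it when it plays again
$('#video3').css('filter', 'grayscale(100%)'); // video 3 stopped
$('#video3').css('filter', 'none');            // video 3 playing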

Built to perform

While designing the technical architecture for the project, I faced many limitations, mostly because of the slow nature of the ASCII serial communication protocol. Therefore, I had to develop a very efficient internal communication protocol to compensate for the delay we had when pressing the buttons on the box. That was the only way to achieve a fast-responding controller that changes the video states on the screen immediately.
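For reference, the parser shown further down implies what a single message looks like: one comma-separated ASCII line with a status value per button, followed by a few shared values such as the number of steps and a blink flag. A hypothetical message could look like this –

// hypothetical example of one incoming serial message (inferred from parseData below)
var exampleMessage = '0,0,3,0,1,0,0,0,0,0,0,0,0,1,0,0,4,0,0,0,2';
var newStatus = exampleMessage.split(',');
// newStatus[0..15] --> button / video statuses (0 = off, 1 = on, 3 = held)
// newStatus[16]    --> number of steps for the held video
// newStatus[20]    --> blink flag used for the border animation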

This was the first time I was required to write efficient code (and not just for the fun of it). After two weeks of rewriting the code, shaving off a few milliseconds each time, I came up with the following lines:

Reading data from controller (Arduino side) –


trellis.readSwitches();
for (uint8_t n = 0; n < numKeys; n++) {

  // the button was just pressed --> mark it as 'held' (status 3)
  if (trellis.justPressed(n)) {
    LEDstatus[n] = 3;
    continue;
  }

  // the button is still held --> count the press length and blink its LED
  if (LEDstatus[n] == 3) {
    buttonPress[n]++;
    if (blinkTime >= 4) {
      if (trellis.isLED(n)) {
        trellis.clrLED(n);
        trellis.writeDisplay();
      } else {
        trellis.setLED(n);
        trellis.writeDisplay();
      }
    }
  }

  // the button was just released --> a long press always ends with the button on,
  // a short press toggles it on/off
  if (trellis.justReleased(n)) {
    if (buttonPress[n] > 8) {
      LEDstatus[n] = 1;
      oldStatus[n] = 1;
      buttonPress[n] = 0;
      trellis.setLED(n);
      trellis.writeDisplay();
    } else {
      buttonPress[n] = 0;
      if (oldStatus[n] == 1) {
        LEDstatus[n] = 0;
        oldStatus[n] = 0;
        trellis.clrLED(n);
        trellis.writeDisplay();
      } else {
        LEDstatus[n] = 1;
        oldStatus[n] = 1;
        trellis.setLED(n);
        trellis.writeDisplay();
      }
    }
  }
}

Parsing the data on the browser (JavaScript side) – 


/*********************************************
PARSER: PARSE DATA THAT ARRIVES FROM
ARDUINO, AND APPLY CHANGES IF NEEDED
*********************************************/
function parseData(data){

  // parsing the data by ','
  var newStatus = data.split(",");

  // turning strings into integers
  for (var x = 0; x < newStatus.length; x++){
    newStatus[x] = Number(newStatus[x]);
  }

  // going over the status of each of the 16 videos;
  // if nothing changed --> CONTINUE
  for (var i = 0; i < 16; i++){
    if ((newStatus[i] !== 3) && (newStatus[i] === videos[i].status)){
      var vidID = i+1;
      vidID = "#video" + vidID;
      $(vidID).css('border-color', "rgba(177,15,46,0)");
      continue;
    }
    else {

      // getting the relevant phrase
      var phraseIndex = i;
      var updatedPhrase = allVideosPart.getPhrase(phraseIndex);

      if (newStatus[i] === 3){

        if (videos[i].originStep === null) {
          videos[i].originStep = currentStep;
        }

        changeColor(i, 1);
        showKnobs(i);

        // vol, cut, and speed hold the latest knob values (set elsewhere in the sketch)
        videos[i].volume = vol;
        videos[i].cut = cut;
        videos[i].speed = speed;
        videos[i].steps = newStatus[16];
        changeKnobs(i);

        // making the video border blink
        var vidID = i+1;
        vidID = "#video" + vidID;
        if (newStatus[20] === 2) {
          if (($(vidID).css('border-color')) === "rgba(177, 15, 46, 0)"){
            $(vidID).css('border-color', "rgba(255,255,255,0.9)");
          }
          else {
            $(vidID).css('border-color', "rgba(177, 15, 46, 0)");
          }
        }


        // clearing the sequence
        for (var n=0; n<32; n++){
          updatedPhrase.sequence[n] = 0;
        }

        // applying steps changes, if any
        // (part of this loop was lost in formatting; reconstructed assuming the
        //  video's triggers are spread evenly across the 32-step sequence)
        var stepNum = videos[i].originStep;
        for (var m = 0; m < videos[i].steps; m++){
          updatedPhrase.sequence[stepNum] = 1;
          stepNum = stepNum + Math.floor(32 / videos[i].steps);
          if (stepNum > 31) {
            stepNum = stepNum - 32;
          }
        }

      }

      else if (newStatus[i] === 1) {
        videos[i].status = 1;
        changeColor(i, videos[i].status);
        var vidID = i+1;
        vidID = "#video" + vidID;
        $(vidID).css('border-color', "rgba(177,15,46,0)");
      }

      else if (newStatus[i] === 0) {
        videos[i].status = 0;
        hideKnobs(i);
        changeColor(i, videos[i].status);
        var vidID = i+1;
        vidID = "#video" + vidID;
        $(vidID).css('border-color', "rgba(177,15,46,0)");

        // clearing the sequence
        for (var n=0; n<32; n++){
          updatedPhrase.sequence[n] = 0;
        }

        videos[i].originStep = null;

      }
    }
  }
  serial.write(1);
}


When I review this code now, it all seems so simple (LOL!), but this is one of the pieces of code I'm most proud of.

After looong hours of coding, we are very happy with what we achieved 🙂

Controlling video playback features

Overview

For the first time in my ITP history, I was able to combine a homework assignment for ICM with a homework assignment for Pcomp.

I created a video manipulation interface that can be controlled by a physical controller.
The entire project was built with Mint for our Physical Computing class mid-term.

The Video Machine – Web Interface

 

The Video Machine – Web Interface from Dror Ayalon on Vimeo.

Functionality

I used the JavaScript video API to apply the following manipulations to the video playback (a short sketch of the relevant properties appears below the list):

  • Loop – Playing the video in an infinite loop.
  • Volume – Changing the volume of the sound.
  • Cut – Trimming the length of the video.
  • Speed – Changing the speed of the video playback.
The Video Machine
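These manipulations map directly onto properties of the HTML5 video element. A minimal sketch (not the project's actual code, and the element id is assumed) –

// minimal sketch of the four manipulations on an HTML5 video element
var video = document.getElementById('video1'); // assumed element id

// Loop: play the video in an infinite loop
video.loop = true;

// Volume: 0.0 (muted) to 1.0 (full volume)
video.volume = 0.5;

// Speed: playback rate (1.0 is normal speed)
video.playbackRate = 1.5;

// Cut: jump back to the start once playback passes a trim point
var cutAt = 2.0; // seconds
video.addEventListener('timeupdate', function () {
  if (video.currentTime >= cutAt) {
    video.currentTime = 0;
  }
});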

Code

Here is the JavaScript code I used for this project –