
Generative Notation and Score with Tone.js

The Concept: A Notation System for Generative Music

For a long time, I have been trying to find a bridge between generative music, which usually relies on synthesized sound, and the playing and composition of acoustic (or amplified) music.

It felt like creating a notation system for generative music could be the right place to start.

MIDI

The effect that the invention of the MIDI protocol had on the evolution of generative (synthesized) music is well known and far beyond the scope of this blog post.

Since the MIDI protocol includes information about the notes that should be played and the way these notes should be played, it served as an inspiration for my notation system for generative music.
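
For reference, a basic MIDI note message is just three bytes: what to play (pitch) and how to play it (velocity), with nothing about what should happen next. A minimal sketch in JavaScript –

// A MIDI "note on" message: [status, pitch, velocity]
// 0x90 = note on / channel 1, 69 = A4, 100 = velocity (out of 127)
var midiNoteOn = [0x90, 69, 100];
// ...and the matching "note off" for the same pitch:
var midiNoteOff = [0x80, 69, 0];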

My Generative Music Notation System

The notation system includes two main components:

Legend

The legend is a sheet that defines all the possible notes (aka “objects”) that can be played during the generative score. Each note includes the following properties:

  • Color – A unique hex color code that will be used for visualization purposes to represent the note.
  • Frequency – The frequency of the note. Can be presented in Hz (e.g. 440) or as a note name (e.g. A#3).
  • Amplitude – Volume / velocity, in a range between 0 and 1 (loudest).
  • Duration – The duration of the note, including an envelope if needed.
  • Loops – The number of times the note should be repeated in a row on the score.
  • Connected_notes – This is the main difference from the MIDI protocol. The connected_notes property holds a list of notes that should be played with or after this note. Each item on the list, which refers to a connected note, should include the following properties (a sketch of a full note object follows the figures below):
    • The color/index number of the connected note according to the legend.
    • The time at which the connected note should be initiated, including maximum and minimum values for silence after the initial timestamp (e.g. if the connected note should be played after the original note, the time will be <the_original_note_duration>+<silence_if_any>).
    • A probability value representing the chance that the connected note will be played. The probability values of all connected notes together should not exceed 1 (== a 100% chance).
[Figure – Generative Music Notation: Legend]
[Figure – Generative Music Notation: Potential Score]
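
To make the properties concrete, here is a minimal sketch of a single legend entry as a plain JavaScript object. The property names mirror the list above; the specific values are made up for illustration –

// One legend entry (a "note object"). All values here are illustrative.
var legendEntry = {
  color: '7D82B8',      // hex color used for visualization
  frequency: 'A#3',     // or a number in Hz, e.g. 440
  amplitude: 0.8,       // 0 to 1, where 1 is loudest
  duration: 1.5,        // seconds (an envelope could be attached here)
  loops: 0,             // how many times to repeat the note in a row
  connected_notes: [
    {
      index: 2,           // index (or color) of the connected note in the legend
      time: 1.5,          // when to initiate it, relative to the original note
      min_silence: 0.1,   // minimum extra silence after `time`
      max_silence: 0.4,   // maximum extra silence after `time`
      probability: 0.9    // chance that this connected note is played
    }
    // the probability values of all items here should sum to 1 at most
  ]
};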

What’s Missing?

Two major properties are missing from the note objects:

  • Instrument (or timbre) – The note object is a set of instructions that could be applied by any instrument. Since I believe that the process of generating music will involve computers (or digital devices), the score can be played with a variety of instruments. The decision about the sound of the piece is left in the hands of the performer (a sketch of this idea follows the list).
  • Timing – Again, since the note object is a set of instructions, these instructions can be initiated and applied at any time during the score, by the performer or by the score itself. The decisions about timing also remain in the hands of the performer. The only timed notes are the connected notes, whose instructions specify whether the note will be initiated with the original note, after the original note, during the original note, etc.
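
To illustrate the instrument point, the same note instructions can be handed to entirely different Tone.js instruments. A minimal sketch, reusing the hypothetical legendEntry object from the sketch above –

// The same instructions, rendered by two different instruments.
// The performer decides which timbre to use; the score doesn't care.
var synthVoice = new Tone.Synth().toMaster();
var fmVoice = new Tone.FMSynth().toMaster();

function perform(instrument, note) {
  instrument.triggerAttackRelease(note.frequency, note.duration);
}

perform(synthVoice, legendEntry); // one possible timbre
perform(fmVoice, legendEntry);    // same note, different sound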

Example

For example, if we use the legend above and start the score with the first two notes (7D82B8 & B7E3CC), we will get the following result –
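
Based on the connected_notes defined in the demo code below, this starting pair unfolds roughly like this:

// note 1 (B7E3CC) plays once; it has no connected notes
// note 0 (7D82B8) → 2 → 3 → 4 → 5 → 6 → back to 0, and so on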

Demo

Using Tone.js, I was able to experiment with generating music based on the legend and score shown above.

The project can be seen here – http://www.projects.drorayalon.com/flickering/.

The current limitations of this demo are:

  • No instrumentation: All notes are played using the same instrument.
  • No dynamics: One of the most enjoyable elements of a musical performance is the dynamics and tension the performer creates while playing the piece. The current implementation doesn’t support any dynamics :\
  • No probability: The current implementation presents a linear and predictable score. Each note has only one connected note, and no code was written yet to support the probability factor, which would utilize the notation system to its full potential and (in my opinion) make this generative music more interesting. A sketch of what this could look like follows this list.
  • Low-tech visualization: The notation system I described above sets up the foundation for a readable visual representation of the score. This visual representation has not been implemented yet.
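
For what it’s worth, here is a minimal sketch of how the probability factor could work, assuming each note keeps a list of connected notes shaped like the legendEntry sketch above –

// Pick at most one connected note, weighted by its probability value.
// Probabilities sum to 1 at most; the remainder means "play nothing".
function pickConnectedNote(connectedNotes) {
  var roll = Math.random();
  var cumulative = 0;
  for (var i = 0; i < connectedNotes.length; i++) {
    cumulative += connectedNotes[i].probability;
    if (roll < cumulative) {
      return connectedNotes[i]; // this connected note wins the roll
    }
  }
  return null; // the roll landed in the silent remainder
}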

Some Code. Why Not

This is the code I’m using to run the demo shown above –

//-----------------------------
// play / stop procedures
//-----------------------------
var playState = false;

$("body").click(function() {
  if (playState === false) {
    play();
  } else {
    stop();
  }
});

function play(){
  playState = true;
  $("#click").html("i told you. it is now flickering really badclick anywhere to stop");
  console.log('playing...');
  Tone.Transport.schedule(function(time){
    noteArray[0].trigger(time);
  }, 0.1);
  Tone.Transport.schedule(function(time){
    noteArray[1].trigger(time);
  }, 0.4);

  // Tone.Transport.loopEnd = '1m';
  // Tone.Transport.loop = true;

  Tone.Transport.start('+0.1');
  setTimeout(backColorSignal, 100);
}

function stop(){
  playState = false;
  $("#click").html("it is probably still flicker really bad, but it will stop eventuallyclick anywhere to keep it going");
  console.log('stopping...!');
  console.log(Tone.Transport.seconds);
  Tone.Transport.stop();
  Tone.Transport.cancel(0);
}

//-----------------------------
// creating an array of note objects (noteArray)
//-----------------------------

// array of manually added notes
var noteArray = [];

// note constructor
function noteObject(index, color, frequency, amplitude, duration, loops, connected_notes_arry) {
  this.index = index;
  this.color = color;
  this.frequency = frequency;
  this.amplitude = amplitude;
  this.duration = duration;
  this.loops = loops;
  this.connected_notes = connected_notes_arry;
  this.trigger = function(time, index=this.index, frequency=this.frequency, duration=this.duration, connected=this.connected_notes){
    // console.log('time: ' + time);
    // console.log('index: ' + index);
    console.log('');
    console.log('------------');
    console.log('it is ' + Tone.Transport.seconds);
    console.log('playing: ' + index);
    console.log('frequency: ' + frequency);
    console.log('duration: ' + duration);

    synthArray[index].triggerAttackRelease(frequency, duration, time);

    // connected = [next_index, time_offset, silence_max, silence_min, probability]
    // (connected[4], the probability value, is not used yet – see the limitations above)
    if (connected !== null) {
      var nextIndex = connected[0];
      var nextTime = 0.01 + Tone.Transport.seconds + connected[1] + parseFloat((Math.random() * (connected[2] - connected[3]) + connected[3]).toFixed(4));
      console.log('generated: ' + nextIndex);
      console.log('at: ' + nextTime);
      Tone.Transport.schedule(function(time){
        noteArray[nextIndex].trigger(time);
      }, nextTime);
    }
  };
}

// starting notes
noteArray.push(new noteObject(0, '7D82B8', 'c3', 1, 1.520*5, 0, [2,1.520*5,0.020*5,0.020*5,0.9]));
noteArray.push(new noteObject(1, 'B7E3CC', 'e2', 1, 6.880*5, 0, null));

// the rest of the notes
noteArray.push(new noteObject(2, 'C4FFB2', 'b2', 1, 1.680*5, 0, [3,1.520*5,0.40,0.80,1]));
noteArray.push(new noteObject(3, 'D6F7A3', 'c#2', 1, 3.640*5, 0, [4,0,0.8,1,1]));
noteArray.push(new noteObject(4, 'ADD66D', 'b2', 1, 0.650*10, 0, [5,0.650*10,0.2,0.2,1]));
noteArray.push(new noteObject(5, 'A4FF7B', 'a2', 1, 1.800*5, 0, [6,0,0,0,1]));
noteArray.push(new noteObject(6, '7BFFD2', 'f#2', 0.2, 1.800*5, 0, [0, 1.800*5, 1, 2, 1]));


//-----------------------------
// creating an array of synth objects (synthArray), based on note objects (noteArray)
//-----------------------------

var synthArray = [];

for (var i=0;i<noteArray.length;i++){
  var options = {
    vibratoAmount:1,
    vibratoRate:5,
    harmonicity:4,
    voice0:{
      volume:-30,
      portamento:0,
      oscillator:{
        type:"sine"
      },
      filterEnvelope:{
        attack:0.01,
        decay:0,
        sustain:0.5,
        release:1,
      },
      envelope:{
        attack:0.1,
        decay:0,
        sustain:0.5,
        release:1,
      },
    },
  voice1:{
    volume:-30,
    portamento:0,
    oscillator:{
      type:"sine"
    },
    filterEnvelope:{
      attack:0.01,
      decay:0,
      sustain:1,
      release:0.5,
    },
    envelope:{
      attack:0.01,
      decay:0,
      sustain:0.5,
      release:1,
    }
  }
  };
  synthArray.push(new Tone.DuoSynth(options).toMaster());
}

//-----------------------------
// low-tech visualization
//-----------------------------
var b = new Tone.Meter("signal");
synthArray[1].connect(b);
// synthArray[2].connect(b);

function backColorSignal(){
  if (b.value === 0){
    setTimeout(backColorBlue, 100);
  } else {
    var color = "rgba(0, 0, 255," + b.value + ")";
    $("html").css("background-color", color);
    setTimeout(backColorSignal, 100);
    // console.log('b.value: ' + b.value + " " + color);
  }
}

function backColorBlue(){
  var color = "rgba(0, 0, 255,1)";
  $("html").css("background-color", color);
  setTimeout(backColorSignal, 100);
}


Published by Dror Ayalon (@drorayalon)
