Video version here.
This is a tiny bacterium. 3.5 billion years later, we have an estimated 8.7 million species on Earth, thanks to evolution!
Today, we’ll witness evolution like never before, in musical form. We grow a musical bacterium from EEG data and evolve it with MuseNet. This is a short tutorial that you can quickly understand and implement with your own hardware.
Manufacturing musical bacteria
Here we build the DNA of our musical bacterium note-by-note.
Step 1: Set up
The brain is a generator of bioelectricity. We can sense the tiny changes in voltage with surface electrodes. After amplifying the signals using the OpenBCI Ganglion board, we get a trace called an EEG (electroencephalogram).
Here is the setup for the OpenBCI Ganglion board using the BrainFlow library.
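A minimal sketch of the acquisition step with BrainFlow’s Python bindings might look like the following; the connection settings, the 10-second recording window and the use of the first EEG channel are assumptions, not the original code.

```python
import time

from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

# Connection settings depend on your BrainFlow version and dongle:
# older versions use a serial port for the BLED112 dongle, newer ones
# can connect over native BLE (optionally via params.mac_address).
params = BrainFlowInputParams()
params.serial_port = "/dev/ttyACM0"  # adjust to your setup, e.g. "COM4" on Windows

board_id = BoardIds.GANGLION_BOARD.value
board = BoardShim(board_id, params)

board.prepare_session()
board.start_stream()
time.sleep(10)                      # record roughly 10 seconds of EEG
data = board.get_board_data()       # 2-D array: rows are channels, columns are samples
board.stop_stream()
board.release_session()

eeg_channels = BoardShim.get_eeg_channels(board_id)
sampling_rate = BoardShim.get_sampling_rate(board_id)  # 200 Hz for the Ganglion
eeg = data[eeg_channels[0]]         # use the first EEG channel for the melody
```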
Step 2: Filter biosignals
As muscles, the heart and the eyeballs are also sources of bioelectricity, there will be noise in the EEG data. Power lines add to this with mains interference at 50 or 60 Hz (60 Hz in North America).
Here, we filter environmental noise, and apply a bandpass filter to only allow frequencies from 7–10 Hz.
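A hedged sketch of the filtering step with BrainFlow’s DataFilter; it assumes the eeg array and sampling_rate from the setup sketch above and 60 Hz mains noise.

```python
import numpy as np

from brainflow.data_filter import DataFilter, FilterTypes, NoiseTypes

eeg = np.ascontiguousarray(eeg)  # DataFilter modifies the array in place

# Notch out powerline interference (use NoiseTypes.FIFTY in 50 Hz regions).
DataFilter.remove_environmental_noise(eeg, sampling_rate, NoiseTypes.SIXTY.value)

# Band-pass 7-10 Hz. Recent BrainFlow versions take start/stop frequencies;
# older ones take (center_freq, band_width), i.e. (8.5, 3.0) here.
DataFilter.perform_bandpass(eeg, sampling_rate, 7.0, 10.0, 4,
                            FilterTypes.BUTTERWORTH.value, 0)
```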
Step 3: Convert into numbers
The Ganglion board samples biosignals at 200 Hz, so a short 10-second recording already contains 2000 samples. We reduce this to 80 samples by keeping every 25th one.
Each of these 80 samples is then converted into a number that codes for a note as a MIDI note number.
The scale used is C Locrian mode, sometimes used in jazz improv.
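One way the downsampling and note mapping might look in code, continuing from the filtered eeg array above; the exact voltage-to-note mapping is an assumption (min-max normalisation onto two octaves of C Locrian starting at middle C).

```python
import numpy as np

# Keep every 25th sample from the first 2000 -> 80 values
# (assumes at least 2000 samples were recorded).
samples = eeg[:2000:25]

# C Locrian: C, Db, Eb, F, Gb, Ab, Bb (semitone offsets from C)
locrian = [0, 1, 3, 5, 6, 8, 10]
scale = [60 + 12 * octave + step for octave in range(2) for step in locrian]

# Normalise the 80 voltages to [0, 1], then pick the nearest scale degree.
lo, hi = samples.min(), samples.max()
norm = (samples - lo) / (hi - lo) if hi > lo else np.zeros_like(samples)
indices = np.round(norm * (len(scale) - 1)).astype(int)
notes = [scale[i] for i in indices]   # 80 MIDI note numbers
```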
Step 4: Write a MIDI file
Finally, we convert all 80 numbers into notes and write them to a MIDI file, which comes out to roughly 19 seconds of music.
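A minimal sketch of the MIDI-writing step; the midiutil package and the 250 BPM tempo (which makes 80 quarter notes last about 19 seconds) are assumptions, not necessarily the original choices.

```python
from midiutil import MIDIFile

midi = MIDIFile(1)                   # one track
track, channel, volume = 0, 0, 100
midi.addTempo(track, 0, 250)         # 250 BPM -> 0.24 s per beat, 80 beats ~ 19 s

for beat, pitch in enumerate(notes): # `notes` from the previous snippet
    midi.addNote(track, channel, pitch, beat, 1, volume)  # one beat per note

with open("bacterium.mid", "wb") as f:
    midi.writeFile(f)
```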
Selective Breeding
10,000 years ago, “corn” was a weedy grass. After each harvest, farmers would choose the largest kernels as seeds for the next generation. Repeated over many generations, this selection produced the corn we eat today.
Likewise, we pass our musical bacterium into MuseNet to produce 4 offspring, each elongated by about 10 seconds. The most pleasant-sounding offspring is selected and passed into the next generation.
This is repeated for 10 generations, growing the bacterium into a song.
MuseNet is an AI that composes and extends songs. It wasn’t programmed with any knowledge of music theory; instead, it learned from millions of MIDI samples. The approach is based on GPT-2, a language model released by OpenAI in 2019 that predicts the next word in a piece of text.
In MuseNet’s public interface, we can choose from 600+ genres, from Bach to video games to Radiohead, as well as change the instrumentation. We can also tune the temperature, truncation and generation length.
Creature showcase
Creature 1: “Queen” taberna
(starting at 2:03 of the video)
The starting genre was Queen, a British rock band formed in London in 1970. The genre changed to video games near the 8th generation and the creature evolved rapidly.
In the video, we can witness the emergence of a clear melody and accompaniment in just the first generation, followed soon by repetition and the addition of strings. However, changing the genre to video games completely erased the memory of the previous melody.
Creature 2: Verruca rameau
(starting at 5:04 of the video)
Rameau was the starting genre. Octaves and note durations mutated in the first generation. In the second, chords were introduced, but that variant was not selected to breed. Near the middle of the piece, we can hear trills and scales in C Locrian.