Transpose MIDI with Python for computational creativity in the domain of music

I have been spending some time putting into practice the ideas that I’ve been developing with Prof John Gero around the role of interpretation in computational creativity, in the domain of Python-generated MIDI music. This post shows how to transpose MIDI with Python and sets the scene for more research to follow.

There is a great tutorial from the Deep Learning team on how to train a system to produce music from examples – however, the pieces produced by the tutorial code are fairly awful musically, as the system is learning from songs written in many different keys (the examples in the zip file at the bottom of that page, which include improvements, are far more interesting musically).

It is beneficial to train the system with all files in the same key so that they create harmonies rather than discordance (as they suggest on that linked page – check the difference between the files with and without this).

Below is a script that uses Python to transpose MIDI files into C major. Compare the two samples on the tutorial page with these two after transposition into a standard key: sample1 and sample2

The piano rolls for these are respectively:

[Figure 1: piano roll of sample1]

[Figure 2: piano roll of sample2]

The point of this post is to share this script with anyone else wanting to transpose MIDI files using Python, and to set the scene for more research to follow in this domain.

Having replicated what the Deep Learning people have done, the challenge is to:

1) set up a limited conceptual space within which the system generates; and
2) have it go through a phase of interpretation (currently absent from the algorithm) in which it can change this conceptual space.

The reason is that my work has suggested that generation occurs within a limited conceptual space, whereas interpretation uses the breadth of experience. The domain of music is useful for demonstrating this – I will post more examples of Python MIDI music as the work develops.

Transpose MIDI with Python
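The script itself did not survive in this copy of the post, so here is a minimal sketch of the approach it describes, built on the third-party music21 library (`converter.parse`, `analyze('key')`, `transpose`, `write`) and using the key-to-semitone dictionaries from Hannu’s comment below, which include the sharp-key spellings. The `_C` output-file suffix is my own illustration, not necessarily the original naming scheme.

```python
import glob
import os

# Semitone shift that moves each tonic to C (for major keys) or A (for
# minor keys); music21 spells flats with "-", e.g. "E-" for E-flat.
MAJORS = {"A-": 4, "G#": 4, "A": 3, "A#": 2, "B-": 2, "B": 1, "C": 0,
          "C#": -1, "D-": -1, "D": -2, "D#": -3, "E-": -3, "E": -4,
          "F": -5, "F#": 6, "G-": 6, "G": 5}
MINORS = {"G#": 1, "A-": 1, "A": 0, "A#": -1, "B-": -1, "B": -2, "C": -3,
          "C#": -4, "D-": -4, "D": -5, "D#": 6, "E-": 6, "E": 5,
          "F": 4, "F#": 3, "G-": 3, "G": 2}

def shift_to_c(tonic_name, mode):
    """Semitones to transpose a piece in the given key into C major
    (or its relative A minor for minor-mode pieces)."""
    table = MAJORS if mode == "major" else MINORS
    return table[tonic_name]

def transpose_folder(pattern="*.mid"):
    """Transpose every MIDI file matching `pattern` into a standard key."""
    import music21  # third-party: pip install music21
    for path in glob.glob(pattern):
        score = music21.converter.parse(path)
        key = score.analyze("key")  # estimate the key of the piece
        new_score = score.transpose(shift_to_c(key.tonic.name, key.mode))
        base, ext = os.path.splitext(path)
        new_score.write("midi", base + "_C" + ext)

if __name__ == "__main__":
    transpose_folder()
```

Transposing into the relative minor (A minor) rather than forcing everything into C major keeps minor-mode pieces sounding natural while still putting all the training data on the same pitch classes.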

4 thoughts on “Transpose MIDI with Python for computational creativity in the domain of music”

  1. Just what I needed and didn’t have to code myself. Thanks.
    Your code seems to lack sharp keys; I fixed it on my machine with:

    majors = dict([("A-", 4),("G#", 4),("A", 3),("A#", 2),("B-", 2),("B", 1),("C", 0),("C#", -1),("D-", -1),("D", -2),("D#", -3),("E-", -3),("E", -4),("F", -5),("F#", 6),("G-", 6),("G", 5)])
    minors = dict([("G#", 1),("A-", 1),("A", 0),("A#", -1),("B-", -1),("B", -2),("C", -3),("C#", -4),("D-", -4),("D", -5),("D#", 6),("E-", 6),("E", 5),("F", 4),("F#", 3),("G-", 3),("G", 2)])

    1. Thanks Hannu, great to see improvements. If your code is online somewhere, feel free to post a link here in the comments for others that are following.
      Nick

  2. Hey there,

    I wonder, did you alter the program or keep it the same as the original (i.e. 200 loops for both training & testing, 150 RBM nodes and 100 RNN nodes)?

    Cheers mate!

    1. Hi Elliot, yes, I did just leave it with the default settings for training and testing – I didn’t get too deep into tweaking the network, it was just as a proof of concept.
      Best,
      Nick
