I have been spending some time putting into practice the ideas I've been developing with Prof John Gero around the role of interpretation in computational creativity, in the domain of Python-generated MIDI music. This post shows how to transpose MIDI with Python, and sets the scene for more research to follow.
There is a great tutorial from the Deep Learning team on how to train a system to produce music from examples. However, the output of the tutorial code as given is fairly awful musically, because the system is learning from songs written in many different keys (the examples in the zip file at the bottom of that page, which incorporate improvements, are far more interesting musically).
It is beneficial to train the system on files that are all in the same key, so that the output tends towards harmony rather than discord (as that linked page suggests – compare the files generated with and without this step).
Below is a Python script that transposes MIDI files into C major. Compare the two samples on the tutorial page with these two, generated after transposition into a standard key: sample1 and sample2
The piano rolls for these are, respectively: [piano-roll images for sample1 and sample2]
I am sharing the script below for anyone who wants to transpose MIDI files using Python, as a starting point for the research to follow in this domain.
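Here is a minimal sketch of such a script using the music21 library; the directory names midi/ and transposed/ are placeholders, and the details (key estimation, choice of shift) may differ from the script as originally posted. It estimates each file's key, then transposes major pieces to C major and minor pieces to A minor by the smallest number of semitones:

```python
import glob
import os

from music21 import converter, pitch

os.makedirs("transposed", exist_ok=True)  # output directory (placeholder name)

for path in glob.glob("midi/*.mid"):  # input directory is a placeholder too
    score = converter.parse(path)
    key = score.analyze("key")  # music21's built-in key estimation

    # Major pieces go to C major, minor pieces to A minor (the relative
    # minor), so everything ends up using the same pitch set.
    target = pitch.Pitch("C") if key.mode == "major" else pitch.Pitch("A")

    semitones = (target.midi - key.tonic.midi) % 12
    if semitones > 6:
        semitones -= 12  # shift down rather than up when that path is shorter

    transposed = score.transpose(semitones)
    transposed.write("midi", fp=os.path.join("transposed", os.path.basename(path)))
    print(os.path.basename(path), ":", key.tonic.name, key.mode, "->",
          transposed.analyze("key"))
```

The final print re-analyses each transposed file, which is a quick sanity check that everything really has landed in C major or A minor.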
Having replicated what the Deep Learning team have done, the challenge is to:
1) set up a limited conceptual space within which the system generates; and
2) have it go through a phase of interpretation (currently absent from the algorithm) in which it can change this conceptual space.
The reason is that my work suggests that generation occurs within a limited conceptual space, whereas interpretation draws on the breadth of experience. Music is a useful domain for demonstrating this – I will post more examples of Python-generated MIDI music as the work develops.
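To make the plan concrete, here is a toy sketch of how the two phases might fit together – the names, the pitch-class representation of the conceptual space, and the frequency-threshold rule for interpretation are all placeholders of mine, not the actual algorithm:

```python
import random

# 1) A limited conceptual space: generation samples only from this pitch
#    set (here, the pitch classes of the C major scale).
conceptual_space = {0, 2, 4, 5, 7, 9, 11}  # C D E F G A B

def generate(length=16, space=frozenset(conceptual_space)):
    """Generate a melody (MIDI note numbers) entirely within the space."""
    return [60 + random.choice(sorted(space)) for _ in range(length)]

# 2) An interpretation phase: revise the space using the breadth of prior
#    experience (here, pitch classes observed across a corpus of pieces).
def interpret(space, experienced_pieces, threshold=0.05):
    """Admit into the space any pitch class that is common in experience."""
    counts, total = {}, 0
    for piece in experienced_pieces:
        for note in piece:
            counts[note % 12] = counts.get(note % 12, 0) + 1
            total += 1
    if total == 0:
        return set(space)
    return set(space) | {pc for pc, n in counts.items() if n / total >= threshold}

# Experience containing out-of-space notes can widen the space,
# which then changes what generation is able to produce.
corpus = [generate() for _ in range(10)] + [[61, 63, 66] * 16]
melody = generate(space=interpret(conceptual_space, corpus))
```

The key property is that generate() never looks outside the current space, while interpret() is the only step allowed to change it.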