I have recently published an artwork that changes the meaning of any text that you put into it. Please read the associated article in The Conversation for further information.
The artwork was featured in the Kyogle Writers’ Festival, in the Roxy Gallery.
The Python script below provides a demonstration of ‘generate and interpret’.
This page is linked from a to-be-published journal paper explaining what it’s about (the paper will be linked here following publication).
For the time being, it is described as:
Generating
Consider a simple system that generates numbers (after Kelly and Gero 2015). The system has two variables: n, a number drawn from a constrained range; and p, an operator, one of multiplication, division, addition or subtraction. The system has an initial state, which for the sake of example we can take to be constrained by 5 < n < 95, with p = +, the operator for addition.
The system generates by choosing two values for the variable within the constraints and performing the operation on them to produce an artefact (the resulting number). For example, the system chooses first 6 and then 72 and adds them to produce an artefact of 78. With this setup the system clearly has a bounded space that it is ‘searching’ through generation, and given enough time it would eventually ‘discover’ all possible artefacts in the space, i.e. {12, 13, …, 187, 188}.
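A minimal sketch of this generator in Python (the function and variable names are mine, not from the paper):

```python
import random

# Initial state after Kelly and Gero (2015): 5 < n < 95, p = + (addition).
LO, HI = 6, 94

def generate():
    """Choose two values for n within the constraints and apply the operator p."""
    a, b = random.randint(LO, HI), random.randint(LO, HI)
    return a + b

# The bounded space the system is 'searching': every reachable artefact.
space = {a + b for a in range(LO, HI + 1) for b in range(LO, HI + 1)}
print(min(space), max(space), len(space))  # -> 12 188 177
```

Enumerating `space` makes the point in the text concrete: repeated generation can only ever rediscover members of this fixed set.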
This extremely simple proposal is representative of systems that have a clearly defined grammar and perform search through application of that grammar, i.e. an example of routine design.
Interpreting
Through additional rules, the system is able to explore. After producing an artefact, the system interprets what it has produced such that:
The significance of this will be in the paper, but essentially it’s representative of the way that people undertake creative tasks. We have vast amounts of experience, but only access some of it at a time. This kind of generate-and-interpret movement is a suggestion for how we move around within our own experiences, stimulated by interpreting what it is that we’ve created.
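The interpretation rules themselves are left to the paper, but the shape of the generate-and-interpret loop can be sketched. Purely as an illustration (the re-centring rule in `interpret` is invented for this sketch, not taken from the paper), interpretation here shifts the system’s constraints toward the artefact it just produced:

```python
import random

def generate(lo, hi, op):
    """Choose two values within the current constraints and apply the operator."""
    a, b = random.randint(lo, hi), random.randint(lo, hi)
    return op(a, b)

def interpret(artefact, width=88):
    """Invented illustrative rule: re-centre the range on the artefact,
    keeping the same width as the initial state (94 - 6 = 88)."""
    lo = max(0, artefact - width // 2)
    return lo, lo + width

lo, hi = 6, 94           # initial state: 5 < n < 95
op = lambda a, b: a + b  # p = +, the addition operator
for step in range(5):
    artefact = generate(lo, hi, op)
    lo, hi = interpret(artefact)
    print(step, artefact, (lo, hi))
```

The effect is that each artefact moves the conceptual space the next generation searches within, so the system can reach artefacts outside its original bounded space.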
I have been spending some time putting into practice the ideas that I’ve been developing with Prof John Gero around the role of interpretation in computational creativity, in the domain of Python-generated MIDI music. This post shows how to transpose MIDI with Python, as well as setting the scene for more research to follow.
There is a great tutorial from the Deep Learning team on how to train a system to produce music from examples. However, the examples from the code in that tutorial are fairly awful musically, as the system learns from songs written in many different keys (the examples in the zip file at the bottom of the page with improvements are far more interesting musically).
It is beneficial to train the system with all files in the same key, so that it learns harmonies rather than discordance (as they suggest on that linked page – compare the files with and without this).
Below is a Python script to transpose MIDI files into C major. Compare the two samples on the tutorial page with these two following transposition into a standard key: sample1 and sample2
The piano rolls for these are respectively:
The point of this post is to share this script with anyone who wants to transpose MIDI files using Python (script below), as well as to set the scene for more research to follow in this domain.
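The full script depends on the original post’s files, but the core arithmetic it performs can be sketched independently. In this hedged sketch, `transpose_notes` and the pre-computed key offset are illustrative (a real script would also detect the key and parse/write the MIDI file, e.g. with a library such as music21):

```python
# Illustrative core of transposition to C: shift raw MIDI note numbers
# (0-127) down by the key's offset from C, clamping to the valid range.
def transpose_notes(notes, offset_from_c):
    return [min(127, max(0, n - offset_from_c)) for n in notes]

# E major sits 4 semitones above C, so shift down by 4:
print(transpose_notes([64, 68, 71], 4))  # E major triad -> [60, 64, 67], a C major triad
```

Applying the same offset to every note in a file preserves all intervals, so the piece is unchanged apart from landing in the standard key.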
Having replicated what the DeepLearning people have done, the challenge is to:
1) set up a limited conceptual space within which the system generates; and
2) have it go through a phase of interpretation (currently not in the algorithm at all) in which it can change this conceptual space.
The reason is that my work has suggested that generation occurs within a limited conceptual space, whereas interpretation draws on the breadth of experience. The domain of music is useful for demonstrating this – I will post more examples of Python MIDI music as they are developed.