Project Timeline

2020

Project Conception

This marks the beginning of the project. Initially, the goal was to use electroencephalography and computer programming techniques to create generative art that reflected the emotions a listener felt while listening to music.

First EEG Connection

The first successful retrieval of data from the EEG headset. Similar integrations had been attempted by the investigator before, but all were in visual coding environments. This was the first time the data had been recorded using code alone: Java in the Processing environment, using the oscP5 library.
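
The original receiver was a Processing (Java) sketch built on the oscP5 library; as a rough JavaScript analogue of listening to the same kind of OSC stream (a minimal sketch assuming the node-osc package, with an illustrative port and address):

    // Listen for incoming EEG values sent over OSC (illustrative sketch only).
    const { Server } = require('node-osc');

    const server = new Server(7000, '0.0.0.0');   // the port is an assumption

    server.on('message', (msg) => {
      // Each message arrives as [address, ...arguments], e.g. ['/eeg/alpha', 0.42]
      const [address, ...values] = msg;
      console.log(address, values);
    });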

Programming Research Begins

Once the EEG data was accessible, the use of coding for generative art production was explored. Many languages and environments were investigated; the most notable were Python, Java, and JavaScript. Ultimately, JavaScript was judged the most viable, owing to its powerful drawing capabilities through the P5 library.

Nature of Code

At this point, the book “Nature of Code” was explored. This book was key to the development of the project; it taught the investigator how to replicate natural systems using code, and provoked the idea of using neural networks for emotion classification. This led to the creation of the first particle system prototype artworks. See attached.
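
As a rough illustration of the kind of particle system prototyped at this stage, below is a minimal p5.js sketch in the style of the Nature of Code examples (the forces, counts, and colours are illustrative, not taken from the project code):

    // Minimal particle system: emit, apply a force, integrate, fade, and cull.
    let particles = [];

    function setup() {
      createCanvas(640, 360);
    }

    function draw() {
      background(20);
      // Emit one new particle per frame from the centre of the canvas.
      particles.push({
        pos: createVector(width / 2, height / 2),
        vel: p5.Vector.random2D().mult(random(0.5, 2)),
        life: 255
      });
      for (let p of particles) {
        p.vel.add(0, 0.02);   // a gentle downward force (gravity)
        p.pos.add(p.vel);     // integrate velocity into position
        p.life -= 2;          // fade the particle out over time
        noStroke();
        fill(255, p.life);
        circle(p.pos.x, p.pos.y, 6);
      }
      // Remove particles that have fully faded.
      particles = particles.filter(p => p.life > 0);
    }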

Mind Charity Fundraising

The investigator liaised with organisers of the Mind mental health charity to negotiate a fundraiser event to take place during the project exhibition in July. This would be used as an opportunity to receive charitable donations in exchange for material goods produced during the project (e.g., artwork prints and artwork booklets).

Project Concept Presentation

On this day, presentations of work-in-progress and project concepts were given. A PowerPoint presentation was prepared for this event, detailing all the technical points that had been investigated so far, along with future plans, financial planning, and charity goals. See slides attached.

Key Literature Reviewed

This month, many key documents and papers were read and reviewed. These included practical and theory-based research papers, explanatory books, and other literature reviews. These documents provided a solid foundation from which to plan the project, and confirmed the investigator’s interest in using machine learning for emotion classification. See attached bibliography.

Non-Neural Network Alternative

As a fallback option, in case the investigator was unable to actualise the envisioned machine learning based system, an alternative setup was created. This alternative system was based on the emotion detection methods used in the “Art of Feeling” project by the studio Random Quark (2017). In principle, a reasonably simple algorithm is applied.

First Neural Network Created

This marks the first successful creation and training of a simple neural network for classifying points on a graph. The user places dots with a letter value on a canvas, and the network learns the rules of the placement pattern. The network can then predict a letter for any position the user subsequently selects. See attached image.
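
The library behind this prototype is not stated above; the sketch below assumes ml5.js's neuralNetwork API as one plausible reconstruction (the canvas size, training settings, and key bindings are illustrative):

    // Hypothetical reconstruction: classify canvas positions by letter.
    let nn;
    let currentLetter = 'A';   // set by typing a letter key
    let trained = false;

    function setup() {
      createCanvas(400, 400);
      nn = ml5.neuralNetwork({ task: 'classification', debug: true });
    }

    function keyTyped() {
      if (key === '!') {
        // Arbitrary binding: '!' trains the network on the collected points.
        nn.normalizeData();
        nn.train({ epochs: 50 }, () => { trained = true; });
      } else {
        currentLetter = key.toUpperCase();
      }
    }

    function mousePressed() {
      if (!trained) {
        // Data collection: store the clicked position with its letter label.
        nn.addData({ x: mouseX, y: mouseY }, { label: currentLetter });
        text(currentLetter, mouseX, mouseY);
      } else {
        // Prediction: ask the network which letter belongs at this position.
        nn.classify({ x: mouseX, y: mouseY }, (error, results) => {
          if (!error) console.log(results[0].label, results[0].confidence);
        });
      }
    }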

EEG ‘Spectrogram’ Created

Using the Processing environment for Java, a program was created that converts the live EEG data into spectrogram-like images. These images represent a three-second sliding window of the amplitude of each frequency band at each electrode. This method both visually represents the data and converts it into a format that a convolutional neural network can understand. See attached image.
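
The actual program was written in Processing (Java) and later rewritten in JavaScript; a minimal p5.js sketch of the same idea might look like the following (the band count, electrode count, sample rate, and image size are all assumptions):

    // Each new EEG sample adds one pixel column per electrode/band amplitude;
    // the image scrolls left so it always covers roughly the last three seconds.
    const BANDS = 5;        // e.g. delta, theta, alpha, beta, gamma (assumed)
    const ELECTRODES = 4;   // assumed electrode count
    let img;

    function setup() {
      createCanvas(180, ELECTRODES * BANDS * 8);   // 180 columns ≈ 3 s at 60 samples/s
      img = createGraphics(width, height);
    }

    // sample[e][b] is the amplitude (0..1) of band b at electrode e, taken from
    // the live EEG stream; call this whenever a new sample arrives.
    function addColumn(sample) {
      // Shift the existing image one pixel to the left (the sliding window).
      img.copy(img, 1, 0, img.width - 1, img.height, 0, 0, img.width - 1, img.height);
      const rowHeight = img.height / (ELECTRODES * BANDS);
      for (let e = 0; e < ELECTRODES; e++) {
        for (let b = 0; b < BANDS; b++) {
          img.noStroke();
          img.fill(sample[e][b] * 255);   // amplitude mapped to brightness
          img.rect(img.width - 1, (e * BANDS + b) * rowHeight, 1, rowHeight);
        }
      }
    }

    function draw() {
      image(img, 0, 0);   // this image doubles as the input to the CNN
    }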

Blink Detection Network Created

A model was created that could predict when a participant’s eyes were closed. This was done using a convolutional neural network and the EEG ‘spectrogram’ system created earlier. The EEG spectrogram program was also rewritten to run in JavaScript rather than Java. The model’s accuracy confirmed that the spectrogram approach was viable. See attached.
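
The deep learning framework used is not given above; assuming TensorFlow.js, a binary eyes-open/eyes-closed classifier over the spectrogram images could be sketched as follows (the input size and layer sizes are assumptions):

    // Hypothetical CNN for predicting eyes-closed from EEG 'spectrogram' images.
    const tf = require('@tensorflow/tfjs');

    function buildBlinkModel() {
      const model = tf.sequential();
      model.add(tf.layers.conv2d({
        inputShape: [64, 64, 1],   // assumed greyscale spectrogram size
        filters: 16,
        kernelSize: 3,
        activation: 'relu'
      }));
      model.add(tf.layers.maxPooling2d({ poolSize: 2 }));
      model.add(tf.layers.conv2d({ filters: 32, kernelSize: 3, activation: 'relu' }));
      model.add(tf.layers.maxPooling2d({ poolSize: 2 }));
      model.add(tf.layers.flatten());
      model.add(tf.layers.dense({ units: 64, activation: 'relu' }));
      model.add(tf.layers.dense({ units: 1, activation: 'sigmoid' }));   // P(eyes closed)
      model.compile({ optimizer: 'adam', loss: 'binaryCrossentropy', metrics: ['accuracy'] });
      return model;
    }

    // Training: xs is an [n, 64, 64, 1] tensor of spectrograms, ys an [n, 1] tensor of labels.
    // await buildBlinkModel().fit(xs, ys, { epochs: 20, validationSplit: 0.2 });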

2021

New Supervisor

At this point, the initial project supervisor announced their plans to go on research leave for their post-doc studies. A new supervisor was assigned to the project, and presentations were given to bring them up to date. See slides attached.

Emotion Neural Network Created

A monumental milestone in the project. The first emotion detection network was created, using the same technique as the blink detection network. To train the model, EEG data was recorded while the investigator listened to music thought to provoke states of high activation, low activation, high valence, and low valence. See attached.

Emotion Neural Network Progressed

The network UI was redesigned to include a very simple graph that exhibited the current emotion of the participant. This was based on the circumplex model of affect, as suggested by Posner et al. (2005). The model was also made to broadcast the values it generated over OSC, for use in other applications. See attached video.
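
The OSC library used by the application is not stated; as a minimal sketch of broadcasting the classified values, assuming the node-osc package (the address and port are illustrative):

    // Send the latest valence/arousal prediction to any listening application.
    const { Client } = require('node-osc');

    const osc = new Client('127.0.0.1', 9000);   // host and port are assumptions

    // Called whenever the network produces a new prediction.
    function broadcastEmotion(valence, arousal) {
      osc.send('/emotion', valence, arousal, (err) => {
        if (err) console.error(err);
      });
    }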

Electron.js Researched

In coding research, the investigator discovered the Electron.js framework for Node.js. This framework allows JavaScript code that would typically only run in web browsers to run as a standalone desktop application through Node.js. Upon learning this, the investigator began building the emotion detection system into this framework.
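
A minimal sketch of how Electron wraps such a browser-based app as a desktop application (the file name and window size are illustrative):

    // Electron main process: open the browser-based UI in a standalone window.
    const { app, BrowserWindow } = require('electron');

    function createWindow() {
      const win = new BrowserWindow({ width: 1280, height: 720 });
      win.loadFile('index.html');   // the existing JavaScript UI runs inside this window
    }

    app.whenReady().then(createWindow);

    app.on('window-all-closed', () => {
      if (process.platform !== 'darwin') app.quit();
    });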

Neural Scores Application Created

Using the Electron.js framework, the investigator rebuilt the emotion classification network as a standalone desktop application that could be installed easily. At this point, the emotion graph featured previously was also redesigned as a circumplex graph superimposed on an emotion colour wheel, a further exploration of the circumplex model of affect. See attached images.
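
One plausible way such a colour wheel could map an emotion to a colour is by converting the circumplex coordinates to an angle and radius (this mapping is illustrative only, not taken from the application's code):

    // Map a (valence, arousal) pair, each in [-1, 1], to an HSB colour.
    function emotionToColour(valence, arousal) {
      // Angle around the circumplex: 0 rad = high valence, PI/2 rad = high arousal.
      const angle = Math.atan2(arousal, valence);
      const hue = ((angle / (2 * Math.PI)) + 1) % 1;                 // angle mapped to [0, 1)
      const saturation = Math.min(1, Math.hypot(valence, arousal));  // distance = intensity
      return { h: hue * 360, s: saturation * 100, b: 100 };
    }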

Submitted Work to a Call for Case Studies

As part of the ongoing research conducted by the Creative AI Lab, founded by Bunz and Jager, a call for case studies was put out by Serpentine Galleries. In this call, the lab requested that artists who work with artificial intelligence provide images and explanations of the internal tooling of their systems. See attached images.

Frontend Research Begins

The investigator began researching ways of artistically representing the data generated by the Neural Scores application. During this time, many programming environments were explored; ultimately, the investigator decided to build the frontend system in TouchDesigner with Python. This framework was chosen for its powerful shader-based visual rendering capabilities.

Ethics Application and Risk Assessment

In accordance with the university’s policies surrounding participant-based research, an ethics application form was completed. This document also included the participant information and consent forms, and the risk assessment. The form was submitted to the project supervisor and approved after review. See attached document.

yop3rro Inspiration

The investigator discovered an independent digital artist, named yop3rro, who creates animated posters using TouchDesigner. These posters are displayed on an Instagram page, where they remain still until interacted with – causing them to animate. This method of digital presentation heavily inspired the final output of the project.

Frontend Prototype v1

Using the information and experience gained through research, the investigator created the first prototype of the frontend emotion rendering system. The system, made in TouchDesigner, displayed the current emotion of the user as a series of undulating coloured lines. The colour of the lines reflected the colour currently selected on the emotion graph of the Neural Scores application.

Interim Report Submitted

During this period, an interim project report was created and submitted for marking. The report detailed many aspects of the project, such as research, methodology, and current and planned work. The report received a mark of 78, losing marks only on the evaluation. It went on to provide a comprehensive description of the project for others who held an interest in it.

EVA 2021 Conference Paper

On the recommendation of his tutor, the investigator wrote a short paper to submit to the 2021 Electronic Visualisation and the Arts (EVA) conference. On this day, the paper was accepted. In addition to publishing the paper, the investigator would also give a 15-minute presentation at the conference in June. See paper attached.

Frontend Prototype v2

The frontend prototype was further developed to produce this version. In short, the lines seen previously were blurred and put into a feedback loop, and the resultant image was then heavily distorted using parallel displacement. This resulted in an attractive marbling visual effect, which displayed a short history of the emotions experienced, approximately 3–7 seconds. See attached images.

Frontend System Complete

This was perhaps the most anticipated milestone of the project: the artistic realisation. This system was the product of further development upon the v2 prototype. In this version, an emotion-EEG graph was added to the centre of the image. This graph, drawn as a circle, displayed both the emotion experienced (as colour) and the underlying EEG signal. See attached.

All Participants Recorded

Using an EEG device and the Neural Scores application, brainwave recordings were taken of two volunteer participants. These recordings were used to train the emotion recognition systems and to generate the final artistic output of the project.

All Renders Completed

Using the trained emotion recognition system, artistic renders were produced from the live EEG data of the participants. The renders were generated at 4K resolution and 30 frames per second. A still image was also generated for each song recording, as well as a standalone neural score graph.

Virtual Gallery Completed

Due to the global pandemic, the in-person exhibition was cancelled, so a new approach to presentation was necessary. After all the recordings had been rendered, an online virtual gallery was built by the investigator using Unity’s WebGL functionality. In the gallery, the neural score images were displayed on large screens, which also showed the animated renders.

Website Completed

On this day, the investigator completed the development of both the website and the portfolio.