File(s) under permanent embargo

Can you feel it? Evaluation of affective expression in music generated by MetaCompose

conference contribution
posted on 2017-01-01, 00:00 authored by M Scirea, Peter Eklund, J Togelius, S Risi
This paper describes an evaluation of the MetaCompose music generator, which is based on evolutionary computation and uses a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. The main objective of MetaCompose is to create music in real time that can express different mood-states. The experiment presented here aims to evaluate: (i) whether the mood participants perceive in a music score matches the intended mood the system is trying to express, and (ii) whether participants can identify transitions in mood expression that occur mid-piece. Music clips with static affective states and clips including transitions were produced by MetaCompose, and a quantitative user study was performed. Participants were tasked with annotating the perceived mood and were also asked to annotate changes in valence in real time. The data collected confirms the hypothesis that people can recognize changes in music mood and that MetaCompose can express perceptibly different levels of arousal. Regarding valence, we observe that, while it is mainly perceived as expected, changes in arousal seem to also influence perceived valence, suggesting that one or more of the music features MetaCompose associates with arousal also affect valence.
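The abstract mentions that MetaCompose combines FI-2POP with multi-objective optimization. As a rough illustration of the FI-2POP idea alone (two populations evolved side by side: infeasible individuals are selected for closeness to constraint satisfaction, while feasible individuals are selected on the objective), a minimal sketch follows. The genome, constraint, and objective here are toy placeholders, not the paper's actual music representation or fitness functions:

```python
import random

random.seed(0)

GENOME_LEN = 16
POP_SIZE = 20
GENERATIONS = 40

# Toy constraint: a genome is "feasible" if it contains at least 8 ones.
def constraint_violation(genome):
    return max(0, 8 - sum(genome))

# Toy objective (applied to feasible individuals only): ones in the first half.
def objective(genome):
    return sum(genome[:GENOME_LEN // 2])

def mutate(genome, rate=0.1):
    # Flip each bit independently with the given probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve():
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    best = None
    for _ in range(GENERATIONS):
        # Split into the two FI-2POP populations.
        feasible = [g for g in population if constraint_violation(g) == 0]
        infeasible = [g for g in population if constraint_violation(g) > 0]
        # Feasible population: rank on the objective (higher is better).
        feasible.sort(key=objective, reverse=True)
        if feasible and (best is None or objective(feasible[0]) > objective(best)):
            best = feasible[0]
        # Infeasible population: rank on distance to feasibility (lower is better);
        # individuals that become feasible migrate at the next split.
        infeasible.sort(key=constraint_violation)
        parents = (feasible[:POP_SIZE // 2] + infeasible[:POP_SIZE // 2]) or population
        population = [mutate(random.choice(parents)) for _ in range(POP_SIZE)]
    return best

best = evolve()
```

In MetaCompose itself the feasible population is further evaluated with multiple objectives rather than the single scalar used in this sketch.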



Genetic and evolutionary computation. Conference (2017 : Berlin, Germany)


Pagination
211 - 218


Publisher
Association for Computing Machinery


Berlin, Germany

Place of publication

New York, N.Y.

Start date


End date






Publication classification

E Conference publication; E1.1 Full written paper - refereed

Copyright notice

2017, ACM.



Title of proceedings

GECCO 2017 - Proceedings of the 2017 Genetic and Evolutionary Computation Conference