Evolving in-game mood-expressive music with MetaCompose
conference contribution
posted on 2018-09-12, 00:00, authored by Marco Scirea, Peter Eklund, Julian Togelius, Sebastian Risi
MetaCompose is a music generator based on a hybrid evolutionary technique that combines FI-2POP with multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real time that expresses different mood-states in a game-playing environment (Checkers). In particular, this paper focuses on determining whether differences in player experience can be observed when: (i) affective-dynamic music is used instead of static music, and (ii) the music supports the game's internal narrative/state. Participants were tasked with playing two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression. During game-play players wore an E4 wristband, allowing various physiological measures to be recorded, such as blood volume pulse (BVP) and electrodermal activity (EDA). The data collected confirms, on three out of four criteria (engagement, music quality, coherency with game excitement, and coherency with performance), the hypothesis that players prefer dynamic affective music when asked to reflect on the current game-state. In the future this system could allow designers/composers to easily create affective and dynamic soundtracks for interactive applications.
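For readers unfamiliar with the underlying technique, the sketch below illustrates the general shape of a hybrid FI-2POP / multi-objective evolutionary loop of the kind the abstract refers to: two populations (feasible and infeasible) evolve in parallel, infeasible individuals are selected for lower constraint violation, feasible individuals are selected by Pareto nondomination over several objectives, and offspring migrate between the populations according to feasibility. The genome, constraint function, and objective functions here are hypothetical placeholders, not MetaCompose's actual music representation or fitness criteria.

```python
# Minimal sketch of a hybrid FI-2POP / multi-objective loop.
# Genome, constraint, and objectives are illustrative placeholders only.
import random

GENOME_LEN = 8

def random_genome():
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]

def violation(g):
    # Hypothetical constraint: 0.0 means the genome is feasible.
    return sum(max(0.0, abs(x) - 0.8) for x in g)

def objectives(g):
    # Two placeholder objectives to maximize (stand-ins for musical criteria).
    return (sum(g), -sum(abs(a - b) for a, b in zip(g, g[1:])))

def dominates(a, b):
    # Pareto dominance: a is at least as good everywhere, strictly better somewhere.
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def mutate(g, rate=0.2):
    return [x + random.gauss(0, 0.1) if random.random() < rate else x for x in g]

def evolve(pop_size=40, generations=100):
    feasible, infeasible = [], []
    for g in (random_genome() for _ in range(pop_size)):
        (feasible if violation(g) == 0 else infeasible).append(g)
    for _ in range(generations):
        children = []
        # Infeasible population: selection pressure toward lower violation.
        if infeasible:
            infeasible.sort(key=violation)
            parents = infeasible[: pop_size // 2]
            children += [mutate(random.choice(parents)) for _ in range(pop_size // 2)]
        # Feasible population: select from the Pareto nondominated front.
        if feasible:
            front = [g for g in feasible
                     if not any(dominates(objectives(h), objectives(g)) for h in feasible)]
            children += [mutate(random.choice(front)) for _ in range(pop_size // 2)]
        # Offspring migrate between populations according to feasibility (FI-2POP).
        feasible, infeasible = [], []
        for g in children:
            (feasible if violation(g) == 0 else infeasible).append(g)
        feasible, infeasible = feasible[:pop_size], infeasible[:pop_size]
    return feasible

if __name__ == "__main__":
    survivors = evolve()
    print(f"{len(survivors)} feasible genomes after evolution")
```

In the actual system the genome encodes musical material and the objectives are music-related criteria; this sketch only mirrors the control flow of the hybrid search.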