Interactivity, Game Creation, Design, Learning, and Innovation. 5th International Conference, ArtsIT 2016, and First International Conference, DLI 2016, Esbjerg, Denmark, May 2–3, 2016, Proceedings

Research Article

Multimodal Detection of Music Performances for Intelligent Emotion Based Lighting

  • @INPROCEEDINGS{10.1007/978-3-319-55834-9_25,
        author={Esben Oxholm and Ellen Hansen and Georgios Triantafyllidis},
        title={Multimodal Detection of Music Performances for Intelligent Emotion Based Lighting},
        proceedings={Interactivity, Game Creation, Design, Learning, and Innovation. 5th International Conference, ArtsIT 2016, and First International Conference, DLI 2016, Esbjerg, Denmark, May 2--3, 2016, Proceedings},
        proceedings_a={ARTSIT \& DLI},
        year={2017},
        month={3},
        keywords={Multimodal detection, Emotion-based lighting},
        doi={10.1007/978-3-319-55834-9_25}
    }
    
  • Esben Oxholm, Ellen Hansen, Georgios Triantafyllidis (2017). Multimodal Detection of Music Performances for Intelligent Emotion Based Lighting. ARTSIT & DLI. Springer. DOI: 10.1007/978-3-319-55834-9_25
Esben Oxholm1,*, Ellen Hansen1,*, Georgios Triantafyllidis1,*
  • 1: Aalborg University Copenhagen
*Contact email: ebonde14@student.aau.dk, ekh@create.aau.dk, gt@create.aau.dk

Abstract

Playing music is about conveying emotions, and the lighting at a concert can help do that. However, new and unknown bands playing at smaller venues, and bands without the budget to hire a dedicated light technician, miss out on lighting that would help them convey the emotions of what they play. This paper investigates whether it is possible to develop an intelligent system that, through multimodal input, detects the intended emotion of the music being played and adjusts the lighting accordingly in real time. A concept for such an intelligent lighting system is developed and described. Drawing on existing research on music and emotion, as well as on musicians' body movements in relation to the emotions they want to convey, a set of cues is defined: amount, speed, fluency, and regularity for the visual modality, and level, tempo, articulation, and timbre for the auditory modality. Using a microphone and a Kinect camera to detect these cues, the system infers the intended emotion of what is being played. Specific lighting designs are then developed to support specific emotions, and the system can switch between and alter these designs based on the incoming cues. The results suggest that the intelligent emotion-based lighting system has an advantage over purely beat-synced lighting, and it is concluded that there is reason to explore the idea further.
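The pipeline the abstract describes (multimodal cues → intended emotion → lighting design) could be sketched roughly as follows. The eight cue names come from the abstract, but everything else here is an illustrative assumption: the normalization of cues to 0..1, the threshold-based classifier, the four emotion labels, and the lighting presets are placeholders, not the authors' actual detection or mapping logic.

```python
from dataclasses import dataclass

@dataclass
class Cues:
    # Auditory cues (assumed normalized to 0..1), per the abstract.
    level: float
    tempo: float
    articulation: float  # assumption: 0 = legato, 1 = staccato
    timbre: float        # assumption: 0 = soft/dark, 1 = bright/sharp
    # Visual (body-movement) cues from the Kinect, per the abstract.
    amount: float
    speed: float
    fluency: float
    regularity: float

def classify_emotion(c: Cues) -> str:
    """Map a cue vector to one emotion label via simple thresholds.

    Hypothetical rule: high-energy cues split into happy/angry depending
    on smoothness; low-energy cues split into tender/sad. The real system
    may weight cues very differently.
    """
    energy = (c.level + c.tempo + c.amount + c.speed) / 4
    smoothness = (c.fluency + c.regularity + (1 - c.articulation)) / 3
    if energy > 0.6:
        return "happy" if smoothness > 0.5 else "angry"
    return "tender" if smoothness > 0.5 else "sad"

# Hypothetical lighting presets: (hue in degrees, brightness 0..1, strobe Hz).
PRESETS = {
    "happy":  (50, 0.9, 0.0),
    "angry":  (0, 0.8, 4.0),
    "tender": (30, 0.4, 0.0),
    "sad":    (240, 0.3, 0.0),
}

cues = Cues(level=0.8, tempo=0.7, articulation=0.2, timbre=0.6,
            amount=0.7, speed=0.6, fluency=0.8, regularity=0.7)
emotion = classify_emotion(cues)
print(emotion, PRESETS[emotion])  # → happy (50, 0.9, 0.0)
```

In a real-time setting, this classification would run continuously on a sliding window of microphone and Kinect features, with the lighting controller interpolating between presets rather than switching abruptly.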