Now A Machine Learns to Map Music-induced Movement To Your Traits

Just as physical gestures quickly give away your personality and current emotional state, the way you groove to music also says a lot about you. In a new study with significant implications for music cognition research, scientists at the International Institute of Information Technology, Hyderabad (IIITH) have developed a machine learning model that can look at listeners’ natural movement to music and predict their personalities and cognitive styles. 

We live in a “streaming era” where listening to music largely means listening to it online. Streaming services not only give you access to endless hours of your choice of music, but are also designed to help you find other (similar) music you may like, “recommending” choices and auto-generating playlists with the help of sophisticated machine learning algorithms. Existing music retrieval systems rely on content- and context-based information related to the music alone, such as acoustic features, lyrics, artists, their cultural background and so on. However, it is now widely understood that other user-related factors also influence the manner in which a listener perceives and responds to music. “Natural swaying of the body and movement is a common response to music. And based on individuals’ movements to music, we can enhance their listening experience and make recommendations on what kind of music they might like in the future,” say Prof. Petri Toiviainen, University of Jyväskylä, Finland and Dr. Vinoo Alluri, who leads music research at IIITH.

What They Did

Previous music research has found associations between dance movements and personality traits, especially along the Extraversion and Neuroticism dimensions. For instance, those who scored high on Extraversion tended to be (no surprise here) more energetic and expressive, dominating the dance floor, while those who scored high on Neuroticism exhibited jerky movements while remaining confined to a small area. In the current study, which has been accepted at the International Society for Music Information Retrieval (ISMIR) conference, the IIITH team built a machine learning model to investigate these associations automatically. The idea was to identify music-induced movement patterns that could predict individual traits, which could in turn be linked to music preferences and recommendations.

For this, in a collaborative effort with the Department of Music, Arts and Culture, University of Jyväskylä, Finland, participants’ personality and cognitive styles were assessed. Personality was assessed with the Big Five model, which describes individuals in terms of Openness to Experience, Conscientiousness, Extraversion, Agreeableness, and Neuroticism. Cognitive styles were measured with the EQ-short and SQ-short questionnaires. (EQ, the Empathizing Quotient, measures an individual’s drive to identify another person’s emotions and thoughts and to respond to these with an appropriate emotion; SQ, the Systemizing Quotient, measures an individual’s drive to analyze or construct systems in a bid to understand the rules that govern them.) The participants were then asked to move naturally to music selected across 8 different genres, ranging from blues and jazz to hip hop and pop. With the help of markers placed at various joints of the body, these movements were recorded via motion capture cameras.

What They Found

From the motion capture data, the researchers used computational methods to obtain the position and velocity of each marked joint. They found that features extracted from dynamic variations in the positions of the joints relative to each other predicted individual-specific traits more accurately than the velocities at which those joints were moving. “The novel part of the study is the end-to-end architecture which uses natural movement to music to predict individual traits accurately and further explore particular joints which are relatively important in characterising those traits,” says Yudhik Agrawal, the first author of the study. For instance, relative movements of the head and toe were found to be highly associated with Extraversion, while negligible movements were associated with Neuroticism, reinforcing earlier findings that dance movements of such individuals are very limited. For the EQ-SQ scores, the team discovered that relative movements of the lower body and extremities play a greater role in predicting EQ. The elbow, in contrast, was found to be significantly more important than other joints for predicting SQ than it is for EQ.
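The kind of feature extraction described above can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual pipeline: here, "dynamic variation of joint positions relative to each other" is approximated as the standard deviation of each pairwise inter-joint distance over time, and "velocity features" as the mean speed of each joint.

```python
import numpy as np

def relative_position_features(positions):
    """Variation of pairwise joint distances over time.

    positions: array of shape (frames, joints, 3) with 3-D joint
    coordinates per motion-capture frame. For each pair of joints,
    compute the distance between them in every frame and return the
    standard deviation of that distance across frames -- a simple
    proxy for how much the joints move relative to each other.
    """
    _, joints, _ = positions.shape
    feats = []
    for i in range(joints):
        for j in range(i + 1, joints):
            dist = np.linalg.norm(positions[:, i] - positions[:, j], axis=1)
            feats.append(dist.std())
    return np.array(feats)

def velocity_features(positions, fps=120.0):
    """Mean speed of each joint (frame-to-frame displacement * fps)."""
    disp = np.diff(positions, axis=0)              # (frames-1, joints, 3)
    speed = np.linalg.norm(disp, axis=2) * fps     # (frames-1, joints)
    return speed.mean(axis=0)                      # one value per joint
```

In a study like this, such feature vectors would then be fed to a predictive model mapping movement features to trait scores; the study's reported finding is that relative-position features of this general kind outperform velocity features for that task.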

Therapeutic and Other Implications

Apart from enabling a more personalized music recommendation system with these traits mapped to movement patterns, this study serves as an initial step in the direction of autism research. Speaking of how persons with high-functioning autism have been found to be extreme systemizers compared to the general population (Baron-Cohen et al, 2003), Dr. Alluri says, “All participants in this study were healthy individuals. However, if we know that certain movements might be more typical to extreme systemizers, there are broader implications of this research in the domain of Autism Spectrum Disorder (ASD).” Yudhik adds that the motion capture setup holds much promise. “With the progress in the area of 3D human pose estimation in predicting the body joint coordinates, it can be extended to monocular video captured by accessible devices such as a mobile phone camera to make this approach applicable to personalized gesture-based retrieval systems,” he says.