Hyderabad: When you listen to music, are you swayed? Does it soothe your feelings, excite you or make you get up and dance or move a few steps?
The way you react to music, say Bollywood songs, a Beethoven symphony or classical pieces streamed to your device, can now give a better idea of your personality and help predict the kind of music you are inclined towards.
Researchers at the International Institute of Information Technology, Hyderabad (IIIT-H) have developed a machine learning model that observes a listener's natural movements in response to music and predicts their personality and cognitive style. They feel it could, in an extended role, be used in autism research too.
According to the study, “Natural swaying of the body and movement is a common response to music. And based on individuals’ movements to music, we can enhance their listening experience and make recommendations on what kind of music they might like in the future.”
A familiar analogy is how physical gestures give away one's personality and emotional state of mind. Similarly, the way one grooves to music can also reveal a few aspects of personality.
The study was conducted by Prof. Petri Toiviainen of the Department of Music, Arts & Culture, University of Jyväskylä, Finland; Dr. Vinoo Alluri, who leads music research at IIIT-H; and Yudhik Agrawal, the lead author.
The participants in the study were asked to move naturally to music selected from eight different genres, ranging from blues to hip-hop to jazz and pop. These movements were then recorded via motion-capture cameras, with the help of markers placed at various joints of the body.
The idea was to study music-induced movement patterns that could predict individual traits, which could then be linked to music preferences and recommendations.
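The pipeline described above can be pictured as a simple sketch: summarise each participant's motion-capture recording into kinematic features, then fit a model mapping those features to a trait score. Everything below is hypothetical and for illustration only — the array sizes, the random stand-in data, and the closed-form ridge regression are assumptions, not the study's actual features or model.

```python
import numpy as np

# Hypothetical data: 20 participants, 300 frames, 21 body-joint markers,
# each joint tracked as an (x, y, z) coordinate. Random numbers stand in
# for real motion-capture recordings.
rng = np.random.default_rng(0)
n_participants, n_frames, n_joints = 20, 300, 21
mocap = rng.normal(size=(n_participants, n_frames, n_joints, 3))

def movement_features(clip):
    """Summarise one recording into simple kinematic statistics per joint:
    mean speed and mean acceleration magnitude."""
    vel = np.diff(clip, axis=0)                          # frame-to-frame displacement
    acc = np.diff(vel, axis=0)                           # change in velocity
    mean_speed = np.linalg.norm(vel, axis=-1).mean(axis=0)   # (joints,)
    mean_accel = np.linalg.norm(acc, axis=-1).mean(axis=0)   # (joints,)
    return np.concatenate([mean_speed, mean_accel])      # (2 * joints,)

X = np.stack([movement_features(clip) for clip in mocap])  # (participants, 42)
y = rng.normal(size=n_participants)   # stand-in trait scores (e.g. "Openness")

# Closed-form ridge regression as a stand-in predictive model.
lam = 1.0
Xc, yc = X - X.mean(axis=0), y - y.mean()
w = np.linalg.solve(Xc.T @ Xc + lam * np.eye(Xc.shape[1]), Xc.T @ yc)
pred = Xc @ w + y.mean()              # predicted trait score per participant
```

With real recordings in place of the random arrays, the learned weights would indicate which joints' movement statistics carry the most trait-related signal.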
Yudhik adds that the motion capture system setup holds much promise. “With the progress in the area of 3D human pose estimation in predicting the body joint coordinates, it can be extended to monocular video captured by accessible devices such as a mobile phone camera to make this approach applicable to personalised gesture-based retrieval systems,” he says.
Existing music retrieval systems rely on content and context-based information related to music alone, such as acoustic features, lyrics, artists, their cultural background and so on.
However, it is becoming clear that there are other external user-related factors which influence the manner in which a listener perceives and responds to music.
We live in a “streaming era” where listening to music means listening to it online. Streaming services not only give you access to endless hours of your choice of music, but they are also designed to help you find other (similar) music you may like, “recommending” choices and auto-generating playlists with the help of sophisticated machine learning algorithms.
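At their core, such recommenders rank candidate tracks by how similar their feature representations are to music the listener already enjoys. A minimal sketch, assuming made-up track names and three-dimensional toy feature vectors (real systems use far richer acoustic and behavioural features):

```python
import numpy as np

# Toy catalogue: hypothetical tracks with made-up audio-feature vectors.
tracks = {
    "blues_01":  np.array([0.9, 0.1, 0.3]),
    "jazz_07":   np.array([0.8, 0.2, 0.4]),
    "pop_12":    np.array([0.1, 0.9, 0.7]),
    "hiphop_03": np.array([0.2, 0.8, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(liked, catalogue, k=1):
    """Rank unheard tracks by similarity to a track the listener liked."""
    scores = {name: cosine(catalogue[liked], vec)
              for name, vec in catalogue.items() if name != liked}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("blues_01", tracks))  # → ['jazz_07']
```

The IIIT-H work suggests adding listener-side signals — movement-derived personality traits — alongside such content features when computing these rankings.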
Apart from building a more personalized music recommendation system with these traits mapped to movement patterns, this study serves as an initial step in the direction of autism spectrum disorder research.
There is literature available to show that some individuals on the autism spectrum tend to excel at music and the arts compared to the general public.
Somasekhar Mulugu, former Associate Editor & Chief of Bureau of The Hindu BusinessLine, is a well-known political, business and science writer and analyst based in Hyderabad.