Details

| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009 |
| Pages | 657-662 |
| Number of pages | 6 |
| Publication status | Published - 2009 |
| Event | 10th International Society for Music Information Retrieval Conference, ISMIR 2009, Kobe, Japan. Duration: 26 Oct 2009 → 30 Oct 2009 |
Publication series

| Name | Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009 |
|---|---|
Abstract
Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information-seeking actions aim at retrieving songs based on these perceptual dimensions - moods and themes - expressing how people feel about music or which situations they associate it with. To successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches for inferring songs' latent characteristics focus on identifying musical genres. In this paper we aim to bridge this gap between users' information needs and indexed music features by developing algorithms for classifying songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation against the AllMusic.com ground truth shows that the two kinds of information are complementary and should be merged for enhanced classification accuracy.
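The fused audio/social-tag approach described in the abstract can be illustrated with a short example. The following is a minimal late-fusion sketch, not the authors' implementation: it assumes pre-extracted audio descriptors and binary Last.fm tag-occurrence vectors per song (both stand-ins, randomly generated here) and averages the class probabilities of two independently trained classifiers, one common way to merge heterogeneous feature sources.

```python
# Minimal late-fusion sketch (not the paper's exact method): train one
# classifier on audio features and one on social-tag vectors, then
# average their predicted mood-class probabilities.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 200 songs, 20 audio descriptors
# (e.g. timbre/rhythm statistics), 50 binary tag features, and one
# of 4 mood labels per song.
n_songs, n_audio, n_tags, n_moods = 200, 20, 50, 4
X_audio = rng.normal(size=(n_songs, n_audio))
X_tags = rng.integers(0, 2, size=(n_songs, n_tags)).astype(float)
y = rng.integers(0, n_moods, size=n_songs)

audio_clf = LogisticRegression(max_iter=1000).fit(X_audio, y)
tag_clf = LogisticRegression(max_iter=1000).fit(X_tags, y)

# Late fusion: weighted average of the two probability estimates.
w = 0.5  # weight on the audio model; tuned on held-out data in practice
proba = w * audio_clf.predict_proba(X_audio) + (1 - w) * tag_clf.predict_proba(X_tags)
predicted_mood = proba.argmax(axis=1)
print("fused training accuracy:", (predicted_mood == y).mean())
```

In practice the fusion weight `w` would be selected on held-out data, and the two per-source models need not be of the same type (e.g., an SVM over audio descriptors alongside a tag-based model).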
ASJC Scopus subject areas
- Arts and Humanities (all)
- Music
- Computer Science (all)
- Information Systems
Cite this
Bischoff, K., Firan, C. S., Paiu, R., Nejdl, W., Laurier, C., & Sordo, M. (2009). Music mood and theme classification. In Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009 (pp. 657-662). (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Music mood and theme classification
T2 - 10th International Society for Music Information Retrieval Conference, ISMIR 2009
AU - Bischoff, Kerstin
AU - Firan, Claudiu S.
AU - Paiu, Raluca
AU - Nejdl, Wolfgang
AU - Laurier, Cyril
AU - Sordo, Mohamed
PY - 2009
Y1 - 2009
N2 - Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information-seeking actions aim at retrieving songs based on these perceptual dimensions - moods and themes - expressing how people feel about music or which situations they associate it with. To successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches for inferring songs' latent characteristics focus on identifying musical genres. In this paper we aim to bridge this gap between users' information needs and indexed music features by developing algorithms for classifying songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation against the AllMusic.com ground truth shows that the two kinds of information are complementary and should be merged for enhanced classification accuracy.
AB - Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information-seeking actions aim at retrieving songs based on these perceptual dimensions - moods and themes - expressing how people feel about music or which situations they associate it with. To successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches for inferring songs' latent characteristics focus on identifying musical genres. In this paper we aim to bridge this gap between users' information needs and indexed music features by developing algorithms for classifying songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation against the AllMusic.com ground truth shows that the two kinds of information are complementary and should be merged for enhanced classification accuracy.
UR - http://www.scopus.com/inward/record.url?scp=84862927717&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84862927717
SN - 9780981353708
T3 - Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009
SP - 657
EP - 662
BT - Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009
Y2 - 26 October 2009 through 30 October 2009
ER -