Music mood and theme classification: A hybrid approach

Publication: Chapter in book/report/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

  • Kerstin Bischoff
  • Claudiu S. Firan
  • Raluca Paiu
  • Wolfgang Nejdl
  • Cyril Laurier
  • Mohamed Sordo

Organisational units

External organisations

  • Universität Pompeu Fabra (UPF)

Details

Original language: English
Title of host publication: Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009
Pages: 657-662
Number of pages: 6
Publication status: Published - 2009
Event: 10th International Society for Music Information Retrieval Conference, ISMIR 2009 - Kobe, Japan
Duration: 26 Oct 2009 - 30 Oct 2009

Publication series

Name: Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009

Abstract

Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.
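
The paper itself publishes no code, but the hybrid idea the abstract describes (one model over audio features, one over Last.fm tag annotations, with their outputs fused) can be sketched concisely. The sketch below is a minimal late-fusion baseline under stated assumptions: scikit-learn as the library, logistic regression as each per-modality classifier, and a single blending weight alpha; none of these choices, nor the names train_hybrid and predict_fused, come from the paper.

from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def train_hybrid(X_audio, X_tags, y):
    # One classifier per modality: audio descriptors vs. social-tag vectors.
    audio_clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    tag_clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    audio_clf.fit(X_audio, y)
    tag_clf.fit(X_tags, y)
    return audio_clf, tag_clf

def predict_fused(audio_clf, tag_clf, X_audio, X_tags, alpha=0.5):
    # Late fusion: weighted average of per-modality class probabilities.
    # alpha weights the audio model, (1 - alpha) the tag model; alpha would
    # be tuned on held-out data labeled with the target moods or themes.
    p_audio = audio_clf.predict_proba(X_audio)
    p_tags = tag_clf.predict_proba(X_tags)
    fused = alpha * p_audio + (1.0 - alpha) * p_tags
    return fused.argmax(axis=1)

Given mood (or theme) labels y and two feature matrices for the same songs, train_hybrid fits both models and predict_fused returns fused label indices. Sweeping alpha from 0 to 1 recovers the tag-only and audio-only baselines at the endpoints, which is one simple way to probe the abstract's claim that the two signals are complementary.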

ASJC Scopus subject areas

Cite this

Music mood and theme classification: A hybrid approach. / Bischoff, Kerstin; Firan, Claudiu S.; Paiu, Raluca et al.
Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. 2009. pp. 657-662 (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).


Bischoff, K, Firan, CS, Paiu, R, Nejdl, W, Laurier, C & Sordo, M 2009, Music mood and theme classification: A hybrid approach. in Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009, pp. 657-662, 10th International Society for Music Information Retrieval Conference, ISMIR 2009, Kobe, Japan, 26 Oct 2009.
Bischoff, K., Firan, C. S., Paiu, R., Nejdl, W., Laurier, C., & Sordo, M. (2009). Music mood and theme classification: A hybrid approach. In Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009 (pp. 657-662). (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).
Bischoff K, Firan CS, Paiu R, Nejdl W, Laurier C, Sordo M. Music mood and theme classification: A hybrid approach. In Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. 2009. p. 657-662. (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).
Bischoff, Kerstin; Firan, Claudiu S.; Paiu, Raluca et al. / Music mood and theme classification: A hybrid approach. Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. 2009. pp. 657-662 (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).
BibTeX
@inproceedings{54b4d0b158554a80b4641bea12244d9a,
title = "Music mood and theme classification: A hybrid approach",
abstract = "Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.",
author = "Kerstin Bischoff and Firan, {Claudiu S.} and Raluca Paiu and Wolfgang Nejdl and Cyril Laurier and Mohamed Sordo",
year = "2009",
language = "English",
isbn = "9780981353708",
series = "Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009",
pages = "657--662",
booktitle = "Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009",
note = "10th International Society for Music Information Retrieval Conference, ISMIR 2009 ; Conference date: 26-10-2009 Through 30-10-2009",

}

RIS

TY  - GEN
T1  - Music mood and theme classification: A hybrid approach
T2  - 10th International Society for Music Information Retrieval Conference, ISMIR 2009
AU  - Bischoff, Kerstin
AU  - Firan, Claudiu S.
AU  - Paiu, Raluca
AU  - Nejdl, Wolfgang
AU  - Laurier, Cyril
AU  - Sordo, Mohamed
PY  - 2009
Y1  - 2009
N2  - Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.
AB  - Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.
UR  - http://www.scopus.com/inward/record.url?scp=84862927717&partnerID=8YFLogxK
M3  - Conference contribution
AN  - SCOPUS:84862927717
SN  - 9780981353708
T3  - Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009
SP  - 657
EP  - 662
BT  - Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009
Y2  - 26 October 2009 through 30 October 2009
ER  -
