Music mood and theme classification: A hybrid approach

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Kerstin Bischoff
  • Claudiu S. Firan
  • Raluca Paiu
  • Wolfgang Nejdl
  • Cyril Laurier
  • Mohamed Sordo

Research Organisations

External Research Organisations

  • Universitat Pompeu Fabra (UPF)

Details

Original language: English
Title of host publication: Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009
Pages: 657-662
Number of pages: 6
Publication status: Published - 2009
Event: 10th International Society for Music Information Retrieval Conference, ISMIR 2009 - Kobe, Japan
Duration: 26 Oct 2009 - 30 Oct 2009

Publication series

Name: Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009

Abstract

Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.
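The fusion idea the abstract describes - scoring a song independently from audio features and from collaborative tags, then merging the two confidence scores - can be illustrated with a minimal late-fusion sketch. Everything below (the toy centroids, feature vectors, fusion weight, and nearest-centroid scoring) is an illustrative assumption, not the paper's actual models or data:

```python
import math

# Hedged sketch of linear late fusion over two modalities.
# One scorer rates a song from (made-up) audio features, another from
# (made-up) tag features; per-class probabilities are averaged.

def centroid_scores(x, centroids):
    """Score each mood by negative squared distance to its centroid."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return {mood: -dist2(x, c) for mood, c in centroids.items()}

def softmax(scores):
    """Turn raw scores into a probability distribution over moods."""
    m = max(scores.values())
    exps = {k: math.exp(v - m) for k, v in scores.items()}
    z = sum(exps.values())
    return {k: v / z for k, v in exps.items()}

def fuse(audio_probs, tag_probs, w=0.5):
    """Linear late fusion: weighted average of per-class probabilities."""
    return {k: w * audio_probs[k] + (1 - w) * tag_probs[k] for k in audio_probs}

# Toy per-mood centroids in a 2-D audio space and a 2-D tag space.
audio_centroids = {"happy": (0.8, 0.7), "sad": (0.2, 0.1)}
tag_centroids = {"happy": (0.9, 0.2), "sad": (0.1, 0.8)}

song_audio = (0.7, 0.6)  # hypothetical audio feature vector
song_tags = (0.8, 0.3)   # hypothetical tag feature vector

audio_p = softmax(centroid_scores(song_audio, audio_centroids))
tag_p = softmax(centroid_scores(song_tags, tag_centroids))
fused = fuse(audio_p, tag_p, w=0.6)

best = max(fused, key=fused.get)
print(best)  # → happy
```

Keeping the two modality-specific classifiers separate and combining only their outputs (rather than concatenating features) matches the general shape of the evaluation in the abstract, where each information source also stands on its own.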

Cite this

Music mood and theme classification: A hybrid approach. / Bischoff, Kerstin; Firan, Claudiu S.; Paiu, Raluca et al.
Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. 2009. p. 657-662 (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).


Bischoff, K, Firan, CS, Paiu, R, Nejdl, W, Laurier, C & Sordo, M 2009, Music mood and theme classification: A hybrid approach. in Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009, pp. 657-662, 10th International Society for Music Information Retrieval Conference, ISMIR 2009, Kobe, Japan, 26 Oct 2009.
Bischoff, K., Firan, C. S., Paiu, R., Nejdl, W., Laurier, C., & Sordo, M. (2009). Music mood and theme classification: A hybrid approach. In Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009 (pp. 657-662). (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).
Bischoff K, Firan CS, Paiu R, Nejdl W, Laurier C, Sordo M. Music mood and theme classification: A hybrid approach. In Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. 2009. p. 657-662. (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).
Bischoff, Kerstin; Firan, Claudiu S.; Paiu, Raluca et al. / Music mood and theme classification: A hybrid approach. Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009. 2009. pp. 657-662 (Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009).
BibTeX
@inproceedings{54b4d0b158554a80b4641bea12244d9a,
title = "Music mood and theme classification: A hybrid approach",
abstract = "Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.",
author = "Kerstin Bischoff and Firan, {Claudiu S.} and Raluca Paiu and Wolfgang Nejdl and Cyril Laurier and Mohamed Sordo",
year = "2009",
language = "English",
isbn = "9780981353708",
series = "Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009",
pages = "657--662",
booktitle = "Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009",
note = "10th International Society for Music Information Retrieval Conference, ISMIR 2009 ; Conference date: 26-10-2009 Through 30-10-2009",
}

RIS

TY - GEN

T1 - Music mood and theme classification

T2 - 10th International Society for Music Information Retrieval Conference, ISMIR 2009

AU - Bischoff, Kerstin

AU - Firan, Claudiu S.

AU - Paiu, Raluca

AU - Nejdl, Wolfgang

AU - Laurier, Cyril

AU - Sordo, Mohamed

PY - 2009

Y1 - 2009

N2 - Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.

AB - Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information seeking actions aim at retrieving music songs based on these perceptual dimensions - moods and themes, expressing how people feel about music or which situations they associate it with. In order to successfully support music retrieval along these dimensions, powerful methods are needed. Still, most existing approaches aiming at inferring some of the songs' latent characteristics focus on identifying musical genres. In this paper we aim at bridging this gap between users' information needs and indexed music features by developing algorithms for classifying music songs by moods and themes. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal, as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation performed against the AllMusic.com ground truth shows that both kinds of information are complementary and should be merged for enhanced classification accuracy.

UR - http://www.scopus.com/inward/record.url?scp=84862927717&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84862927717

SN - 9780981353708

T3 - Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009

SP - 657

EP - 662

BT - Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009

Y2 - 26 October 2009 through 30 October 2009

ER -
