How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

Kerstin Bischoff, Claudiu S. Firan, Wolfgang Nejdl, Raluca Paiu

Research Organisations

Details

Original language: English
Title of host publication: JCDL'09
Subtitle of host publication: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries
Publisher: Association for Computing Machinery (ACM)
Pages: 285-294
Number of pages: 10
ISBN (print): 9781605586977
Publication status: Published - 15 Jun 2009
Event: 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09 - Austin, TX, United States
Duration: 15 Jun 2009 - 19 Jun 2009

Publication series

Name: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
ISSN (Print): 1552-5996

Abstract

Web 2.0 enables information sharing and collaboration among users and, most notably, supports their active participation and creativity. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user-generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where the metadata enable retrieval based not only on content-based (low-level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user-provided annotations for music tracks showed that the tag types most beneficial for retrieval - usage (theme) and opinion (mood) tags - are often neglected by users during annotation. In this paper we address exactly this problem: to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, and combinations of both. We also compare the results for our recommended mood/theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we assess the quality of our recommended tags through a Facebook-based user study. Our results are promising in comparison to both experts and users, and they provide interesting insights into possible extensions of music tagging systems to support music search.
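
The record does not spell out the recommendation algorithms themselves, so the sketch below is a purely illustrative, minimal example of the general idea: scoring candidate mood/theme tags for a track by how often they co-occur with the track's existing user tags across a corpus. The toy corpus, the vocabulary, and all names (corpus, mood_theme_vocab, recommend_tags) are assumptions for illustration only, not the authors' actual method or data.

# Minimal sketch of tag-based mood/theme recommendation (illustrative only,
# NOT the method from the paper): rank candidate mood/theme tags for a track
# by how often they co-occur with the track's existing tags in a tag corpus.
from collections import Counter
from itertools import combinations

# Hypothetical corpus: track -> set of user tags (toy data).
corpus = {
    "Dancing Queen": {"pop", "disco", "happy", "party"},
    "Waterloo": {"pop", "disco", "party"},
    "The Winner Takes It All": {"pop", "ballad", "sad", "breakup"},
    "Someone Like You": {"ballad", "sad", "breakup", "piano"},
}
# Assumed controlled vocabulary of mood/theme tags.
mood_theme_vocab = {"happy", "sad", "party", "breakup"}

# Count how often each pair of tags appears on the same track.
co_occurrence = Counter()
for tags in corpus.values():
    for a, b in combinations(sorted(tags), 2):
        co_occurrence[(a, b)] += 1
        co_occurrence[(b, a)] += 1

def recommend_tags(track_tags, k=2):
    """Rank mood/theme tags by total co-occurrence with the track's tags."""
    scores = Counter()
    for tag in track_tags:
        for candidate in mood_theme_vocab - set(track_tags):
            scores[candidate] += co_occurrence[(tag, candidate)]
    return [tag for tag, score in scores.most_common(k) if score > 0]

print(recommend_tags({"pop", "disco"}))  # e.g. ['party', 'happy']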

Keywords

    Algorithms, Experimentation, Human factors, Measurement, Reliability

Cite this

How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. / Bischoff, Kerstin; Firan, Claudiu S.; Nejdl, Wolfgang et al.
JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Association for Computing Machinery (ACM), 2009. p. 285-294 (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries).

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Bischoff, K, Firan, CS, Nejdl, W & Paiu, R 2009, How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. in JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Proceedings of the ACM/IEEE Joint Conference on Digital Libraries, Association for Computing Machinery (ACM), pp. 285-294, 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09, Austin, TX, United States, 15 Jun 2009. https://doi.org/10.1145/1555400.1555448
Bischoff, K., Firan, C. S., Nejdl, W., & Paiu, R. (2009). How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. In JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries (pp. 285-294). (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries). Association for Computing Machinery (ACM). https://doi.org/10.1145/1555400.1555448
Bischoff K, Firan CS, Nejdl W, Paiu R. How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. In JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Association for Computing Machinery (ACM). 2009. p. 285-294. (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries). doi: 10.1145/1555400.1555448
Bischoff, Kerstin ; Firan, Claudiu S. ; Nejdl, Wolfgang et al. / How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Association for Computing Machinery (ACM), 2009. pp. 285-294 (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries).
BibTeX
@inproceedings{a3562b29a03246bdb7279d1fcc4e80e0,
title = "How do you feel about {"}Dancing Queen{"}?: Deriving mood \& theme annotations from user tags",
abstract = "Web 2.0 enables information sharing, collaboration among users and most notably supports active participation and creativity of the users. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where these metadata enable retrieval relying not only on content-based (low level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user provided annotations for music tracks showed that the types of tags which would be really beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood / theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we evaluate the quality of our recommended tags through a Facebook-based user study. Our results are very promising both in comparison to experts as well as users and provide interesting insights into possible extensions for music tagging systems to support music search.",
keywords = "Algorithms, Experimentation, Human factors, Measurement, Reliability",
author = "Kerstin Bischoff and Firan, {Claudiu S.} and Wolfgang Nejdl and Raluca Paiu",
year = "2009",
month = jun,
day = "15",
doi = "10.1145/1555400.1555448",
language = "English",
isbn = "9781605586977",
series = "Proceedings of the ACM/IEEE Joint Conference on Digital Libraries",
publisher = "Association for Computing Machinery (ACM)",
pages = "285--294",
booktitle = "JCDL'09",
address = "United States",
note = "2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09 ; Conference date: 15-06-2009 Through 19-06-2009",

}

RIS

TY - GEN
T1 - How do you feel about "Dancing Queen"?
T2 - 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09
AU - Bischoff, Kerstin
AU - Firan, Claudiu S.
AU - Nejdl, Wolfgang
AU - Paiu, Raluca
PY - 2009/6/15
Y1 - 2009/6/15
N2 - Web 2.0 enables information sharing, collaboration among users and most notably supports active participation and creativity of the users. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where these metadata enable retrieval relying not only on content-based (low level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user provided annotations for music tracks showed that the types of tags which would be really beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood / theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we evaluate the quality of our recommended tags through a Facebook-based user study. Our results are very promising both in comparison to experts as well as users and provide interesting insights into possible extensions for music tagging systems to support music search.
AB - Web 2.0 enables information sharing, collaboration among users and most notably supports active participation and creativity of the users. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where these metadata enable retrieval relying not only on content-based (low level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user provided annotations for music tracks showed that the types of tags which would be really beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood / theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we evaluate the quality of our recommended tags through a Facebook-based user study. Our results are very promising both in comparison to experts as well as users and provide interesting insights into possible extensions for music tagging systems to support music search.
KW - Algorithms
KW - Experimentation
KW - Human factors
KW - Measurement
KW - Reliability
UR - http://www.scopus.com/inward/record.url?scp=70450251973&partnerID=8YFLogxK
U2 - 10.1145/1555400.1555448
DO - 10.1145/1555400.1555448
M3 - Conference contribution
AN - SCOPUS:70450251973
SN - 9781605586977
T3 - Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
SP - 285
EP - 294
BT - JCDL'09
PB - Association for Computing Machinery (ACM)
Y2 - 15 June 2009 through 19 June 2009
ER -
