How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

Kerstin Bischoff, Claudiu S. Firan, Wolfgang Nejdl, Raluca Paiu

Organisational units


Details

Original language: English
Title of host publication: JCDL'09
Subtitle: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries
Publisher: Association for Computing Machinery (ACM)
Pages: 285-294
Number of pages: 10
ISBN (Print): 9781605586977
Publication status: Published - 15 June 2009
Event: 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09 - Austin, TX, United States
Duration: 15 June 2009 - 19 June 2009

Publication series

Name: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
ISSN (Print): 1552-5996

Abstract

Web 2.0 enables information sharing, collaboration among users and most notably supports active participation and creativity of the users. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where these metadata enable retrieval relying not only on content-based (low level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user provided annotations for music tracks showed that the types of tags which would be really beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood / theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we evaluate the quality of our recommended tags through a Facebook-based user study. Our results are very promising both in comparison to experts as well as users and provide interesting insights into possible extensions for music tagging systems to support music search.
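
The record gives only this high-level summary; the paper's own algorithms are not spelled out here. As a rough illustration of the kind of tag-based recommendation the abstract describes, the sketch below (Python) ranks a small, assumed mood vocabulary by co-occurrence with a track's existing user tags over a toy corpus. The vocabulary, the corpus, and the scoring are invented for illustration and are not the method of Bischoff et al.; the paper additionally exploits lyrics and combinations of both sources, which this sketch omits.

# Illustrative sketch only, not the algorithm from the paper.
# Recommends mood tags for a track by counting how often each candidate
# mood tag co-occurs with the track's existing (genre-heavy) user tags.
from collections import defaultdict
from itertools import combinations

# Assumed mood vocabulary; AllMusic uses a far richer one.
MOOD_TAGS = {"happy", "sad", "energetic", "calm", "romantic"}

# Toy corpus of track -> user tags (hypothetical data).
CORPUS = {
    "Dancing Queen": {"pop", "disco", "70s", "happy", "energetic"},
    "Yesterday": {"rock", "60s", "ballad", "sad", "calm"},
    "Highway Star": {"rock", "70s", "energetic"},
    "Wonderful Tonight": {"rock", "ballad", "romantic", "calm"},
}

def cooccurrence_counts(corpus):
    """Count how often each ordered pair of tags appears on the same track."""
    counts = defaultdict(int)
    for tags in corpus.values():
        for a, b in combinations(sorted(tags), 2):
            counts[(a, b)] += 1
            counts[(b, a)] += 1
    return counts

def recommend_moods(track_tags, corpus, top_k=3):
    """Rank mood tags by co-occurrence with the track's non-mood tags."""
    counts = cooccurrence_counts(corpus)
    scores = {
        mood: sum(counts[(tag, mood)] for tag in track_tags if tag not in MOOD_TAGS)
        for mood in MOOD_TAGS
    }
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(mood, score) for mood, score in ranked[:top_k] if score > 0]

# A new track annotated only with genre/decade tags, no mood or theme yet.
print(recommend_moods({"pop", "disco", "70s"}, CORPUS))

A real system would normalize these raw counts (for example with pointwise mutual information) and could fold in lyrics-based evidence, as the abstract indicates the authors do.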

ASJC Scopus subject areas

Cite

How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. / Bischoff, Kerstin; Firan, Claudiu S.; Nejdl, Wolfgang et al.
JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Association for Computing Machinery (ACM), 2009. pp. 285-294 (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries).


Bischoff, K, Firan, CS, Nejdl, W & Paiu, R 2009, How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. In JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Proceedings of the ACM/IEEE Joint Conference on Digital Libraries, Association for Computing Machinery (ACM), pp. 285-294, 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09, Austin, TX, United States, 15 June 2009. https://doi.org/10.1145/1555400.1555448
Bischoff, K., Firan, C. S., Nejdl, W., & Paiu, R. (2009). How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. In JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries (pp. 285-294). (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries). Association for Computing Machinery (ACM). https://doi.org/10.1145/1555400.1555448
Bischoff K, Firan CS, Nejdl W, Paiu R. How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. In JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Association for Computing Machinery (ACM). 2009. p. 285-294. (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries). doi: 10.1145/1555400.1555448
Bischoff, Kerstin ; Firan, Claudiu S. ; Nejdl, Wolfgang et al. / How do you feel about "Dancing Queen"? Deriving mood & theme annotations from user tags. JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries. Association for Computing Machinery (ACM), 2009. pp. 285-294 (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries).
BibTeX
@inproceedings{a3562b29a03246bdb7279d1fcc4e80e0,
title = "How do you feel about {"}Dancing Queen{"}?: Deriving mood & theme annotations from user tags",
abstract = "Web 2.0 enables information sharing, collaboration among users and most notably supports active participation and creativity of the users. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where these metadata enable retrieval relying not only on content-based (low level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user provided annotations for music tracks showed that the types of tags which would be really beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood / theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we evaluate the quality of our recommended tags through a Facebook-based user study. Our results are very promising both in comparison to experts as well as users and provide interesting insights into possible extensions for music tagging systems to support music search.",
keywords = "Algorithms, Experimentation, Human factors, Measurement, Reliability",
author = "Kerstin Bischoff and Firan, {Claudiu S.} and Wolfgang Nejdl and Raluca Paiu",
year = "2009",
month = jun,
day = "15",
doi = "10.1145/1555400.1555448",
language = "English",
isbn = "9781605586977",
series = "Proceedings of the ACM/IEEE Joint Conference on Digital Libraries",
publisher = "Association for Computing Machinery (ACM)",
pages = "285--294",
booktitle = "JCDL'09",
address = "United States",
note = "2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09 ; Conference date: 15-06-2009 Through 19-06-2009",

}

RIS

TY - GEN

T1 - How do you feel about "Dancing Queen"?

T2 - 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09

AU - Bischoff, Kerstin

AU - Firan, Claudiu S.

AU - Nejdl, Wolfgang

AU - Paiu, Raluca

PY - 2009/6/15

Y1 - 2009/6/15

N2 - Web 2.0 enables information sharing, collaboration among users and most notably supports active participation and creativity of the users. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where these metadata enable retrieval relying not only on content-based (low level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user provided annotations for music tracks showed that the types of tags which would be really beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood / theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we evaluate the quality of our recommended tags through a Facebook-based user study. Our results are very promising both in comparison to experts as well as users and provide interesting insights into possible extensions for music tagging systems to support music search.

AB - Web 2.0 enables information sharing, collaboration among users and most notably supports active participation and creativity of the users. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where these metadata enable retrieval relying not only on content-based (low level) features, but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user provided annotations for music tracks showed that the types of tags which would be really beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood / theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we evaluate the quality of our recommended tags through a Facebook-based user study. Our results are very promising both in comparison to experts as well as users and provide interesting insights into possible extensions for music tagging systems to support music search.

KW - Algorithms

KW - Experimentation

KW - Human factors

KW - Measurement

KW - Reliability

UR - http://www.scopus.com/inward/record.url?scp=70450251973&partnerID=8YFLogxK

U2 - 10.1145/1555400.1555448

DO - 10.1145/1555400.1555448

M3 - Conference contribution

AN - SCOPUS:70450251973

SN - 9781605586977

T3 - Proceedings of the ACM/IEEE Joint Conference on Digital Libraries

SP - 285

EP - 294

BT - JCDL'09

PB - Association for Computing Machinery (ACM)

Y2 - 15 June 2009 through 19 June 2009

ER -
