Details
Original language | English
---|---
Title of host publication | JCDL'09
Subtitle of host publication | Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries
Publisher | Association for Computing Machinery (ACM)
Pages | 285-294
Number of pages | 10
ISBN (print) | 9781605586977
Publication status | Published - 15 Jun 2009
Event | 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09 - Austin, TX, United States. Duration: 15 Jun 2009 → 19 Jun 2009
Publication series
Name | Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
---|---
ISSN (Print) | 1552-5996
Abstract
Web 2.0 enables information sharing and collaboration among users and, most notably, supports users' active participation and creativity. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user-generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where they enable retrieval that relies not only on content-based (low-level) features but also on the textual descriptions represented by tags. However, analyzing the annotations users generate for music tracks shows that they are heavily biased towards genre. Previous work investigating the types of user-provided annotations for music tracks showed that the tag types that would be most beneficial for retrieval - usage (theme) and opinion (mood) tags - are often neglected by users during annotation. In this paper we address exactly this problem: to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, and combinations of both. We also compare our recommended mood/theme annotations against genre and style recommendations, a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we assess the quality of our recommended tags through a Facebook-based user study. Our results are very promising in comparison to both experts and users, and they provide interesting insights into possible extensions of music tagging systems to support music search.
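The record itself does not reproduce the paper's algorithms. As a rough, illustrative sketch of the kind of hybrid recommender the abstract describes (combining existing user annotations with lyrics), the Python code below blends tag/mood co-occurrence statistics with a TF-IDF + Naive Bayes lyrics classifier. Every function name, data shape, and the blending weight `alpha` are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: not the algorithms from the paper. It combines
# (a) co-occurrence statistics between a track's existing user tags and known
# mood/theme labels, and (b) a bag-of-words classifier over lyrics.
from collections import Counter, defaultdict

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline


def cooccurrence_model(tagged_tracks):
    """Estimate P(mood | user tag) from (user_tags, mood_tags) training pairs."""
    counts = defaultdict(Counter)  # user tag -> Counter of co-occurring moods
    totals = Counter()             # user tag -> number of tracks carrying it
    for user_tags, mood_tags in tagged_tracks:
        for tag in user_tags:
            totals[tag] += 1
            counts[tag].update(mood_tags)
    return counts, totals


def score_by_tags(user_tags, counts, totals):
    """Sum P(mood | tag) over the track's existing user tags."""
    scores = Counter()
    for tag in user_tags:
        for mood, c in counts.get(tag, Counter()).items():
            scores[mood] += c / totals[tag]
    return scores


def lyrics_model(lyrics_corpus, mood_labels):
    """Train a simple TF-IDF + Naive Bayes mood classifier on lyrics."""
    model = make_pipeline(TfidfVectorizer(stop_words="english"), MultinomialNB())
    model.fit(lyrics_corpus, mood_labels)
    return model


def recommend(user_tags, lyrics, counts, totals, model, alpha=0.5, k=3):
    """Blend the tag-based and lyrics-based signals; return the top-k moods."""
    tag_scores = score_by_tags(user_tags, counts, totals)
    norm = sum(tag_scores.values()) or 1.0
    lyric_probs = dict(zip(model.classes_, model.predict_proba([lyrics])[0]))
    moods = set(tag_scores) | set(lyric_probs)
    combined = {m: alpha * tag_scores.get(m, 0.0) / norm
                   + (1 - alpha) * lyric_probs.get(m, 0.0)
                for m in moods}
    return sorted(combined, key=combined.get, reverse=True)[:k]
```

In such a setup, both models would be trained on tracks that already carry mood/theme labels (for example, from an expert source such as AllMusic.com), and `recommend(...)` would then be called for sparsely tagged tracks; the evaluation setup in the paper (expert ground truth plus a user study) is described in the abstract above.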
Keywords
- Algorithms
- Experimentation
- Human factors
- Measurement
- Reliability
ASJC Scopus subject areas
- General Engineering
Cite this
Bischoff, K., Firan, C. S., Nejdl, W., & Paiu, R. (2009). How do you feel about "Dancing Queen"? In JCDL'09: Proceedings of the 2009 ACM/IEEE Joint Conference on Digital Libraries (pp. 285-294). (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries). Association for Computing Machinery (ACM). https://doi.org/10.1145/1555400.1555448
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer-reviewed
TY - GEN
T1 - How do you feel about "Dancing Queen"?
T2 - 2009 ACM/IEEE Joint Conference on Digital Libraries, JCDL'09
AU - Bischoff, Kerstin
AU - Firan, Claudiu S.
AU - Nejdl, Wolfgang
AU - Paiu, Raluca
PY - 2009/6/15
Y1 - 2009/6/15
AB - Web 2.0 enables information sharing and collaboration among users and, most notably, supports users' active participation and creativity. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user-generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where they enable retrieval that relies not only on content-based (low-level) features but also on the textual descriptions represented by tags. However, analyzing the annotations users generate for music tracks shows that they are heavily biased towards genre. Previous work investigating the types of user-provided annotations for music tracks showed that the tag types that would be most beneficial for retrieval - usage (theme) and opinion (mood) tags - are often neglected by users during annotation. In this paper we address exactly this problem: to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, and combinations of both. We also compare our recommended mood/theme annotations against genre and style recommendations, a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we assess the quality of our recommended tags through a Facebook-based user study. Our results are very promising in comparison to both experts and users, and they provide interesting insights into possible extensions of music tagging systems to support music search.
KW - Algorithms
KW - Experimentation
KW - Human factors
KW - Measurement
KW - Reliability
UR - http://www.scopus.com/inward/record.url?scp=70450251973&partnerID=8YFLogxK
U2 - 10.1145/1555400.1555448
DO - 10.1145/1555400.1555448
M3 - Conference contribution
AN - SCOPUS:70450251973
SN - 9781605586977
T3 - Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
SP - 285
EP - 294
BT - JCDL'09
PB - Association for Computing Machinery (ACM)
Y2 - 15 June 2009 through 19 June 2009
ER -