Learning to rank for joy

Publication: Contribution to book/report/anthology/conference proceedings › Paper in conference proceedings › Research › Peer-reviewed

Authors

  • Claudia Orellana-Rodriguez
  • Wolfgang Nejdl
  • Ernesto Diaz-Aviles
  • Ismail Sengor Altingovde

Organisational units

External organisations

  • IBM Research
  • Orta Dogu Technical University

Details

Original language: English
Title of host publication: WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web
Pages: 569-570
Number of pages: 2
ISBN (electronic): 9781450327459
Publication status: Published - 7 Apr 2014
Event: 23rd International Conference on World Wide Web, WWW 2014 - Seoul, South Korea
Duration: 7 Apr 2014 - 11 Apr 2014

Abstract

User-generated content is a growing source of valuable information, and its analysis can lead to a better understanding of users' needs and trends. In this paper, we leverage user feedback about YouTube videos for the task of affective video ranking. To this end, we follow a learning-to-rank approach, which allows us to compare the performance of different sets of features when the ranking task goes beyond mere relevance and requires an affective understanding of the videos. Our results show that, while basic video features, such as title and tags, lead to effective rankings in an affect-free setup, they do not perform as well when dealing with an affective ranking task.
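The learning-to-rank approach the abstract mentions can be illustrated with a minimal pairwise sketch. Everything below is a hypothetical illustration, not the paper's actual features or model: the toy "videos", the two feature dimensions, and the perceptron-style updates on misordered pairs are assumptions chosen only to show the core idea of learning a scoring function from pairwise preferences.

```python
import random

def train_pairwise_ranker(items, labels, epochs=50, lr=0.1, seed=0):
    """Learn a linear scoring function w so that items with higher
    labels (e.g. stronger 'joy' in user feedback) score higher.

    items: list of feature vectors; labels: affect/relevance scores.
    """
    rng = random.Random(seed)
    w = [0.0] * len(items[0])
    # All ordered pairs (i, j) where item i should outrank item j.
    pairs = [(i, j) for i in range(len(items)) for j in range(len(items))
             if labels[i] > labels[j]]
    for _ in range(epochs):
        rng.shuffle(pairs)
        for i, j in pairs:
            diff = [a - b for a, b in zip(items[i], items[j])]
            # If the pair is ordered wrongly (or tied), nudge w toward
            # scoring item i above item j -- a perceptron-style update.
            if sum(wk * dk for wk, dk in zip(w, diff)) <= 0:
                w = [wk + lr * dk for wk, dk in zip(w, diff)]
    return w

def rank(items, w):
    """Return item indices sorted by descending score under w."""
    score = lambda x: sum(wk * xk for wk, xk in zip(w, x))
    return sorted(range(len(items)), key=lambda i: score(items[i]),
                  reverse=True)

# Toy example: three "videos" with two hypothetical features
# (say, title match and tag match) and affect labels 2 > 1 > 0.
videos = [[1.0, 0.2], [0.3, 0.9], [0.1, 0.1]]
joy = [2, 1, 0]
w = train_pairwise_ranker(videos, joy)
print(rank(videos, w))  # prints [0, 1, 2]
```

Because only pairwise preferences drive the updates, swapping in a different feature set (basic title/tag features versus affect-aware ones, as the paper compares) changes only the input vectors, not the training procedure.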

ASJC Scopus subject areas

Cite

Learning to rank for joy. / Orellana-Rodriguez, Claudia; Nejdl, Wolfgang; Diaz-Aviles, Ernesto et al.
WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web. 2014. pp. 569-570.


Orellana-Rodriguez, C, Nejdl, W, Diaz-Aviles, E & Altingovde, IS 2014, Learning to rank for joy. in WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web. pp. 569-570, 23rd International Conference on World Wide Web, WWW 2014, Seoul, South Korea, 7 Apr 2014. https://doi.org/10.1145/2567948.2576961
Orellana-Rodriguez, C., Nejdl, W., Diaz-Aviles, E., & Altingovde, I. S. (2014). Learning to rank for joy. In WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web (pp. 569-570). https://doi.org/10.1145/2567948.2576961
Orellana-Rodriguez C, Nejdl W, Diaz-Aviles E, Altingovde IS. Learning to rank for joy. In WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web. 2014. pp. 569-570. doi: 10.1145/2567948.2576961
Orellana-Rodriguez, Claudia ; Nejdl, Wolfgang ; Diaz-Aviles, Ernesto et al. / Learning to rank for joy. WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web. 2014. pp. 569-570.
@inproceedings{6a1fc4c256414529928e90aa752921e7,
title = "Learning to rank for joy",
abstract = "User-generated content is a growing source of valuable information, and its analysis can lead to a better understanding of users' needs and trends. In this paper, we leverage user feedback about YouTube videos for the task of affective video ranking. To this end, we follow a learning-to-rank approach, which allows us to compare the performance of different sets of features when the ranking task goes beyond mere relevance and requires an affective understanding of the videos. Our results show that, while basic video features, such as title and tags, lead to effective rankings in an affect-free setup, they do not perform as well when dealing with an affective ranking task.",
keywords = "Sentiment analysis, Social media analytics, YouTube",
author = "Claudia Orellana-Rodriguez and Wolfgang Nejdl and Ernesto Diaz-Aviles and Altingovde, {Ismail Sengor}",
year = "2014",
month = apr,
day = "7",
doi = "10.1145/2567948.2576961",
language = "English",
pages = "569--570",
booktitle = "WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web",
note = "23rd International Conference on World Wide Web, WWW 2014 ; Conference date: 07-04-2014 Through 11-04-2014",

}


TY - GEN

T1 - Learning to rank for joy

AU - Orellana-Rodriguez, Claudia

AU - Nejdl, Wolfgang

AU - Diaz-Aviles, Ernesto

AU - Altingovde, Ismail Sengor

PY - 2014/4/7

Y1 - 2014/4/7

N2 - User-generated content is a growing source of valuable information, and its analysis can lead to a better understanding of users' needs and trends. In this paper, we leverage user feedback about YouTube videos for the task of affective video ranking. To this end, we follow a learning-to-rank approach, which allows us to compare the performance of different sets of features when the ranking task goes beyond mere relevance and requires an affective understanding of the videos. Our results show that, while basic video features, such as title and tags, lead to effective rankings in an affect-free setup, they do not perform as well when dealing with an affective ranking task.

AB - User-generated content is a growing source of valuable information, and its analysis can lead to a better understanding of users' needs and trends. In this paper, we leverage user feedback about YouTube videos for the task of affective video ranking. To this end, we follow a learning-to-rank approach, which allows us to compare the performance of different sets of features when the ranking task goes beyond mere relevance and requires an affective understanding of the videos. Our results show that, while basic video features, such as title and tags, lead to effective rankings in an affect-free setup, they do not perform as well when dealing with an affective ranking task.

KW - Sentiment analysis

KW - Social media analytics

KW - Youtube

UR - http://www.scopus.com/inward/record.url?scp=84990924172&partnerID=8YFLogxK

U2 - 10.1145/2567948.2576961

DO - 10.1145/2567948.2576961

M3 - Conference contribution

AN - SCOPUS:84990924172

SP - 569

EP - 570

BT - WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web

T2 - 23rd International Conference on World Wide Web, WWW 2014

Y2 - 7 April 2014 through 11 April 2014

ER -
