A Trio Neural Model for Dynamic Entity Relatedness Ranking

Publication: Contribution in book/report/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

Organisational units

External organisations

  • Robert Bosch GmbH

Details

Original language: English
Title of host publication: Proceedings of the 22nd Conference on Computational Natural Language Learning
Place of publication: Brussels
Pages: 31-41
Publication status: Published electronically (e-pub) - Oct. 2018
Event: Conference on Computational Natural Language Learning - Brussels, Belgium
Duration: 31 Oct. 2018 - 1 Nov. 2018
Conference number: 22

Abstract

Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are very dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
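
The abstract only sketches the approach at a high level. As a rough illustration of the general idea, and not the paper's actual trio architecture, the following minimal PyTorch snippet shows a pairwise ranking setup in which collective-attention signals are assumed to provide "more related" versus "less related" candidate pairs for a query entity at a given time step. All names, shapes, and the tiny scorer are hypothetical.

# Illustrative sketch only: a pairwise ranking setup for entity relatedness,
# NOT the trio model from the paper. Assumes collective-attention data yields,
# per query entity, a candidate that should outrank another candidate.
import torch
import torch.nn as nn

class RelatednessScorer(nn.Module):
    def __init__(self, num_entities: int, dim: int = 64):
        super().__init__()
        # A single embedding table for brevity; per-time-step tables would be
        # one way to make the representations "dynamic".
        self.emb = nn.Embedding(num_entities, dim)
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, query: torch.Tensor, candidate: torch.Tensor) -> torch.Tensor:
        # Score how related `candidate` is to `query` (higher = more related).
        pair = torch.cat([self.emb(query), self.emb(candidate)], dim=-1)
        return self.mlp(pair).squeeze(-1)

def pairwise_loss(scorer, query, pos_cand, neg_cand, margin: float = 1.0):
    # Hinge ranking loss: the candidate that co-occurs more strongly with the
    # query in collective-attention signals should receive the higher score.
    pos = scorer(query, pos_cand)
    neg = scorer(query, neg_cand)
    return torch.clamp(margin - pos + neg, min=0.0).mean()

# Toy usage with random entity ids; real training would sample pairs per time step.
scorer = RelatednessScorer(num_entities=1000)
q = torch.randint(0, 1000, (32,))
p = torch.randint(0, 1000, (32,))
n = torch.randint(0, 1000, (32,))
loss = pairwise_loss(scorer, q, p, n)
loss.backward()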

Cite

A Trio Neural Model for Dynamic Entity Relatedness Ranking. / Nguyen, Tu Ngoc; Tran, Tuan; Nejdl, Wolfgang.
Proceedings of the 22nd Conference on Computational Natural Language Learning. Brussels, 2018. pp. 31-41.


Nguyen, TN, Tran, T & Nejdl, W 2018, A Trio Neural Model for Dynamic Entity Relatedness Ranking. in Proceedings of the 22nd Conference on Computational Natural Language Learning. Brussels, pp. 31-41, Conference on Computational Natural Language Learning, Brussels, Belgium, 31 Oct. 2018. https://doi.org/10.18653/v1/K18-1004
Nguyen, T. N., Tran, T., & Nejdl, W. (2018). A Trio Neural Model for Dynamic Entity Relatedness Ranking. In Proceedings of the 22nd Conference on Computational Natural Language Learning (pp. 31-41). Advance online publication. https://doi.org/10.18653/v1/K18-1004
Nguyen TN, Tran T, Nejdl W. A Trio Neural Model for Dynamic Entity Relatedness Ranking. In Proceedings of the 22nd Conference on Computational Natural Language Learning. Brussels. 2018. pp. 31-41. Epub 2018 Oct. doi: 10.18653/v1/K18-1004
Nguyen, Tu Ngoc ; Tran, Tuan ; Nejdl, Wolfgang. / A Trio Neural Model for Dynamic Entity Relatedness Ranking. Proceedings of the 22nd Conference on Computational Natural Language Learning. Brussels, 2018. pp. 31-41
BibTeX
@inproceedings{bfd9ae9786544f4d91b1da4297f3d40e,
title = "A Trio Neural Model for Dynamic Entity Relatedness Ranking",
abstract = "Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are very dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.",
keywords = "cs.IR, cs.CL, cs.LG, stat.ML",
author = "Nguyen, {Tu Ngoc} and Tuan Tran and Wolfgang Nejdl",
note = "In Proceedings of CoNLL 2018; Conference on Computational Natural Language Learning; Conference date: 31-10-2018 through 01-11-2018",
year = "2018",
month = oct,
doi = "10.18653/v1/K18-1004",
language = "English",
pages = "31--41",
booktitle = "Proceedings of the 22nd Conference on Computational Natural Language Learning",

}

RIS

TY - GEN

T1 - A Trio Neural Model for Dynamic Entity Relatedness Ranking

AU - Nguyen, Tu Ngoc

AU - Tran, Tuan

AU - Nejdl, Wolfgang

N1 - Conference code: 22

PY - 2018/10

Y1 - 2018/10

N2 - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are very dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.

AB - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are very dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.

KW - cs.IR

KW - cs.CL

KW - cs.LG

KW - stat.ML

U2 - 10.18653/v1/K18-1004

DO - 10.18653/v1/K18-1004

M3 - Conference contribution

SP - 31

EP - 41

BT - Proceedings of the 22nd Conference on Computational Natural Language Learning

CY - Brussels

T2 - Conference on Computational Natural Language Learning

Y2 - 31 October 2018 through 1 November 2018

ER -
