Details
Original language | English |
---|---|
Title of host publication | Proceedings of the 22nd Conference on Computational Natural Language Learning |
Place of publication | Brussels |
Pages | 31-41 |
Publication status | Published electronically (E-pub) - Oct 2018 |
Event | Conference on Computational Natural Language Learning - Brussels, Belgium. Duration: 31 Oct 2018 → 1 Nov 2018. Conference number: 22 |
Abstract

Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
Cite
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
Nguyen, T. N., Tran, T., & Nejdl, W. (2018). A Trio Neural Model for Dynamic Entity Relatedness Ranking. In Proceedings of the 22nd Conference on Computational Natural Language Learning (pp. 31-41). Brussels.
Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed
Conference on Computational Natural Language Learning, Brussels, Belgium, 31 Oct 2018. https://doi.org/10.18653/v1/K18-1004
TY - GEN
T1 - A Trio Neural Model for Dynamic Entity Relatedness Ranking
AU - Nguyen, Tu Ngoc
AU - Tran, Tuan
AU - Nejdl, Wolfgang
N1 - Conference code: 22
PY - 2018/10
Y1 - 2018/10
N2 - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
AB - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
KW - cs.IR
KW - cs.CL
KW - cs.LG
KW - stat.ML
U2 - 10.18653/v1/K18-1004
DO - 10.18653/v1/K18-1004
M3 - Conference contribution
SP - 31
EP - 41
BT - Proceedings of the 22nd Conference on Computational Natural Language Learning
CY - Brussels
T2 - Conference on Computational Natural Language Learning
Y2 - 31 October 2018 through 1 November 2018
ER -
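The abstract above describes ranking entities by relatedness to a query entity. As a rough illustration of the general task setup — not the paper's actual trio neural model — the following sketch scores candidate entities against a query by cosine similarity over placeholder embeddings and applies a generic pairwise margin loss of the kind commonly used in learning-to-rank. All names and the random embeddings are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a query entity and candidate entities as dense vectors.
# These embeddings are random placeholders, not learned representations.
dim = 8
query = rng.normal(size=dim)
candidates = rng.normal(size=(5, dim))

def relatedness(q, c):
    """Score a query/candidate pair by cosine similarity (illustrative choice)."""
    return float(q @ c / (np.linalg.norm(q) * np.linalg.norm(c)))

def pairwise_margin_loss(q, pos, neg, margin=1.0):
    """Generic hinge loss that pushes a more-related candidate above a less-related one."""
    return max(0.0, margin - relatedness(q, pos) + relatedness(q, neg))

# Rank all candidates for the query by score (highest first).
scores = [relatedness(query, c) for c in candidates]
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])

# Loss for the pair (top-ranked, bottom-ranked); non-negative by construction.
loss = pairwise_margin_loss(query, candidates[ranking[0]], candidates[ranking[-1]])
print(ranking, round(loss, 3))
```

In the paper's dynamic setting, the supervision signal would come from time-varying collective attention rather than fixed labels; this sketch only shows the static pairwise-ranking skeleton.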