Details
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 22nd Conference on Computational Natural Language Learning |
| Place of Publication | Brussels |
| Pages | 31-41 |
| Publication status | E-pub ahead of print - Oct 2018 |
| Event | Conference on Computational Natural Language Learning, Brussels, Belgium. Duration: 31 Oct 2018 → 1 Nov 2018. Conference number: 22 |
Abstract

Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, real-world entities are often involved in many different relationships, and consequently entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
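As a purely illustrative aside: the abstract frames dynamic entity relatedness as a supervised ranking problem. A common way to train such a ranker is a pairwise margin (hinge) loss, sketched below. This is a generic example, not the paper's actual objective; the function name, margin value, and toy scores are all assumptions.

```python
import numpy as np

def pairwise_hinge_loss(score_pos, score_neg, margin=1.0):
    """Margin ranking loss: penalize a pair whenever the more-related
    candidate (score_pos) is not scored at least `margin` above the
    less-related one (score_neg). Illustrative only, not from the paper."""
    return np.maximum(0.0, margin - (score_pos - score_neg))

# Toy relatedness scores for one query entity against two candidate pairs:
# the first pair is correctly ordered (zero loss), the second is not.
loss = pairwise_hinge_loss(np.array([2.0, 0.3]), np.array([0.5, 0.9]))
print(loss)  # first pair satisfies the margin; second pair incurs a positive loss
```

A model trained with such a loss only needs relative supervision, e.g. which of two candidates drew more collective attention, rather than absolute relatedness labels.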
Keywords
- cs.IR
- cs.CL
- cs.LG
- stat.ML
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
Nguyen, T. N., Tran, T., & Nejdl, W. (2018). A Trio Neural Model for Dynamic Entity Relatedness Ranking. In Proceedings of the 22nd Conference on Computational Natural Language Learning (pp. 31-41). Brussels. Paper presented at the 22nd Conference on Computational Natural Language Learning, Brussels, Belgium, 31 Oct 2018. https://doi.org/10.18653/v1/K18-1004

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer-reviewed
TY - GEN
T1 - A Trio Neural Model for Dynamic Entity Relatedness Ranking
AU - Nguyen, Tu Ngoc
AU - Tran, Tuan
AU - Nejdl, Wolfgang
N1 - Conference code: 22
PY - 2018/10
Y1 - 2018/10
N2 - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, real-world entities are often involved in many different relationships, and consequently entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
AB - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, real-world entities are often involved in many different relationships, and consequently entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
KW - cs.IR
KW - cs.CL
KW - cs.LG
KW - stat.ML
U2 - 10.18653/v1/K18-1004
DO - 10.18653/v1/K18-1004
M3 - Conference contribution
SP - 31
EP - 41
BT - Proceedings of the 22nd Conference on Computational Natural Language Learning
CY - Brussels
T2 - Conference on Computational Natural Language Learning
Y2 - 31 October 2018 through 1 November 2018
ER -