A Trio Neural Model for Dynamic Entity Relatedness Ranking

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Tu Ngoc Nguyen
  • Tuan Tran
  • Wolfgang Nejdl

Research Organisations

External Research Organisations

  • Robert Bosch GmbH

Details

Original language: English
Title of host publication: Proceedings of the 22nd Conference on Computational Natural Language Learning
Place of publication: Brüssel
Pages: 31-41
Publication status: E-pub ahead of print - Oct 2018
Event: Conference on Computational Natural Language Learning - Brüssel, Belgium
Duration: 31 Oct 2018 - 1 Nov 2018
Conference number: 22

Abstract

Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
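
The record does not include the paper's implementation. As a rough, hypothetical illustration of the kind of setup the abstract describes - a learned ranker whose supervision comes from collective attention signals (e.g. page-view counts) rather than hand labels - the following PyTorch sketch trains a pairwise ranking model over time-dependent entity features. The names PairwiseRelatednessRanker and pairwise_hinge_loss, and the random toy features, are illustrative assumptions, not the authors' trio architecture.

# Hypothetical sketch, not the authors' code: a minimal pairwise ranker for
# dynamic entity relatedness. Each entity is described by a time-dependent
# feature vector (e.g. derived from its recent collective attention); the model
# scores query-candidate pairs and is trained so that, at a given time, the
# more-attended candidate ranks above the less-attended one.
import torch
import torch.nn as nn

class PairwiseRelatednessRanker(nn.Module):
    def __init__(self, feat_dim, hidden_dim=64):
        super().__init__()
        # Shared encoder mapping raw temporal entity features to an embedding.
        self.encoder = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # Scorer over the concatenated query/candidate embeddings.
        self.scorer = nn.Linear(2 * hidden_dim, 1)

    def score(self, query_feats, cand_feats):
        q = self.encoder(query_feats)
        c = self.encoder(cand_feats)
        return self.scorer(torch.cat([q, c], dim=-1)).squeeze(-1)

def pairwise_hinge_loss(model, query, pos_cand, neg_cand, margin=1.0):
    # "Positive" candidates are those with higher collective attention than the
    # "negative" ones at the same time step; attention acts as the supervision.
    s_pos = model.score(query, pos_cand)
    s_neg = model.score(query, neg_cand)
    return torch.clamp(margin - (s_pos - s_neg), min=0.0).mean()

# Toy usage with random tensors standing in for real temporal entity features.
model = PairwiseRelatednessRanker(feat_dim=16)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
query = torch.randn(32, 16)
pos, neg = torch.randn(32, 16), torch.randn(32, 16)
opt.zero_grad()
loss = pairwise_hinge_loss(model, query, pos, neg)
loss.backward()
opt.step()

A pairwise hinge loss is one common way to turn graded attention signals into ranking supervision; the paper's joint framework and its different entity representations are not reproduced here.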

Keywords

    cs.IR, cs.CL, cs.LG, stat.ML

Cite this

A Trio Neural Model for Dynamic Entity Relatedness Ranking. / Nguyen, Tu Ngoc; Tran, Tuan; Nejdl, Wolfgang.
Proceedings of the 22nd Conference on Computational Natural Language Learning. Brüssel, 2018. p. 31-41.

Nguyen, TN, Tran, T & Nejdl, W 2018, A Trio Neural Model for Dynamic Entity Relatedness Ranking. in Proceedings of the 22nd Conference on Computational Natural Language Learning. Brüssel, pp. 31-41, Conference on Computational Natural Language Learning, Brüssel, Belgium, 31 Oct 2018. https://doi.org/10.18653/v1/K18-1004
Nguyen, T. N., Tran, T., & Nejdl, W. (2018). A Trio Neural Model for Dynamic Entity Relatedness Ranking. In Proceedings of the 22nd Conference on Computational Natural Language Learning (pp. 31-41). Advance online publication. https://doi.org/10.18653/v1/K18-1004
Nguyen TN, Tran T, Nejdl W. A Trio Neural Model for Dynamic Entity Relatedness Ranking. In Proceedings of the 22nd Conference on Computational Natural Language Learning. Brüssel. 2018. p. 31-41. Epub 2018 Oct. doi: 10.18653/v1/K18-1004
Nguyen, Tu Ngoc ; Tran, Tuan ; Nejdl, Wolfgang. / A Trio Neural Model for Dynamic Entity Relatedness Ranking. Proceedings of the 22nd Conference on Computational Natural Language Learning. Brüssel, 2018. pp. 31-41
BibTeX
@inproceedings{bfd9ae9786544f4d91b1da4297f3d40e,
title = "A Trio Neural Model for Dynamic Entity Relatedness Ranking",
abstract = "Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.",
keywords = "cs.IR, cs.CL, cs.LG, stat.ML",
author = "Nguyen, {Tu Ngoc} and Tuan Tran and Wolfgang Nejdl",
note = "In Proceedings of CoNLL 2018; Conference on Computational Natural Language Learning; Conference date: 31-10-2018 Through 01-11-2018",
year = "2018",
month = oct,
doi = "10.18653/v1/K18-1004",
language = "English",
pages = "31--41",
booktitle = "Proceedings of the 22nd Conference on Computational Natural Language Learning",

}

RIS

TY - GEN

T1 - A Trio Neural Model for Dynamic Entity Relatedness Ranking

AU - Nguyen, Tu Ngoc

AU - Tran, Tuan

AU - Nejdl, Wolfgang

N1 - Conference code: 22

PY - 2018/10

Y1 - 2018/10

N2 - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.

AB - Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in static settings and in an unsupervised manner. However, entities in the real world are often involved in many different relationships; consequently, entity relations are highly dynamic over time. In this work, we propose a neural network-based approach for dynamic entity relatedness, leveraging collective attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.

KW - cs.IR

KW - cs.CL

KW - cs.LG

KW - stat.ML

U2 - 10.18653/v1/K18-1004

DO - 10.18653/v1/K18-1004

M3 - Conference contribution

SP - 31

EP - 41

BT - Proceedings of the 22nd Conference on Computational Natural Language Learning

CY - Brüssel

T2 - Conference on Computational Natural Language Learning

Y2 - 31 October 2018 through 1 November 2018

ER -
