Details
Original language | English |
---|---|
Title of host publication | K-CAP 2021 |
Subtitle | Proceedings of the 11th Knowledge Capture Conference |
Place of publication | New York |
Pages | 225-232 |
Number of pages | 8 |
ISBN (electronic) | 9781450384575 |
Publication status | Published - 2 Dec 2021 |
Event | 11th ACM International Conference on Knowledge Capture, K-CAP 2021 - Virtual, Online, United States. Duration: 2 Dec 2021 → 3 Dec 2021 |
Abstract
Scholarly knowledge graphs (KGs) provide structured information representing knowledge encoded in scientific publications. With the sheer volume of published scientific literature comprising a plethora of inhomogeneous entities and relations to describe scientific concepts, these KGs are inherently incomplete. We present exBERT, a method for leveraging pre-trained transformer language models to perform scholarly knowledge graph completion. We model triples of a knowledge graph as text and perform triple classification (i.e., whether a triple belongs to the KG or not). The evaluation shows that exBERT outperforms other baselines on three scholarly KG completion datasets in the tasks of triple classification, link prediction, and relation prediction. Furthermore, we present two scholarly datasets as resources for the research community, collected from public KGs and online resources.
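The triples-as-text idea from the abstract can be illustrated with a minimal sketch. Note that the function names, the `[SEP]` verbalization scheme, and the `classify` stub are illustrative assumptions, not the paper's actual implementation, which fine-tunes a pre-trained transformer to score the verbalized triples:

```python
# Minimal sketch of triple classification framed as text classification.
# A KG triple (subject, predicate, object) is verbalized into a text
# sequence, which a fine-tuned language model would then score as
# plausible (belongs to the KG) or not. The classify() stub below stands
# in for the transformer and is illustrative only: it does a plain
# membership lookup instead of a learned prediction.

def verbalize(subject: str, predicate: str, obj: str) -> str:
    """Turn a KG triple into the text sequence fed to the classifier."""
    return f"{subject} [SEP] {predicate} [SEP] {obj}"

def classify(text: str, known_triples: set[str]) -> bool:
    """Placeholder for the transformer: membership lookup only."""
    return text in known_triples

# A toy "knowledge graph" of verbalized triples (hypothetical example).
kg = {verbalize("exBERT", "evaluated on", "triple classification")}

candidate = verbalize("exBERT", "evaluated on", "triple classification")
print(classify(candidate, kg))  # → True: the triple is in the toy KG
```

In the paper's setting, the lookup is replaced by a binary classification head on top of the language model, so the system can also judge triples it has never seen.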
ASJC Scopus subject areas
- Computer Science (all)
- Information Systems
- Computer Science (all)
- Software
Cite
K-CAP 2021: Proceedings of the 11th Knowledge Capture Conference. New York, 2021. pp. 225-232.
Publication: Chapter in book/report/conference proceedings › Conference paper › Research › Peer-reviewed
TY - GEN
T1 - Triple Classification for Scholarly Knowledge Graph Completion
AU - Jaradeh, Mohamad Yaser
AU - Singh, Kuldeep
AU - Stocker, Markus
AU - Auer, Sören
N1 - Funding Information: This work was co-funded by the European Research Council for the project ScienceGRAPH (Grant agreement ID: 819536) and the TIB Leibniz Information Centre for Science and Technology. We thank Oliver Karras and Allard Oelen for their valuable feedback.
PY - 2021/12/2
Y1 - 2021/12/2
N2 - Scholarly knowledge graphs (KGs) provide structured information representing knowledge encoded in scientific publications. With the sheer volume of published scientific literature comprising a plethora of inhomogeneous entities and relations to describe scientific concepts, these KGs are inherently incomplete. We present exBERT, a method for leveraging pre-trained transformer language models to perform scholarly knowledge graph completion. We model triples of a knowledge graph as text and perform triple classification (i.e., whether a triple belongs to the KG or not). The evaluation shows that exBERT outperforms other baselines on three scholarly KG completion datasets in the tasks of triple classification, link prediction, and relation prediction. Furthermore, we present two scholarly datasets as resources for the research community, collected from public KGs and online resources.
AB - Scholarly knowledge graphs (KGs) provide structured information representing knowledge encoded in scientific publications. With the sheer volume of published scientific literature comprising a plethora of inhomogeneous entities and relations to describe scientific concepts, these KGs are inherently incomplete. We present exBERT, a method for leveraging pre-trained transformer language models to perform scholarly knowledge graph completion. We model triples of a knowledge graph as text and perform triple classification (i.e., whether a triple belongs to the KG or not). The evaluation shows that exBERT outperforms other baselines on three scholarly KG completion datasets in the tasks of triple classification, link prediction, and relation prediction. Furthermore, we present two scholarly datasets as resources for the research community, collected from public KGs and online resources.
KW - link prediction
KW - relation prediction
KW - scholarly knowledge graphs
KW - triple classification
UR - http://www.scopus.com/inward/record.url?scp=85120857972&partnerID=8YFLogxK
U2 - 10.1145/3460210.3493582
DO - 10.1145/3460210.3493582
M3 - Conference contribution
AN - SCOPUS:85120857972
SP - 225
EP - 232
BT - K-CAP 2021
CY - New York
T2 - 11th ACM International Conference on Knowledge Capture, K-CAP 2021
Y2 - 2 December 2021 through 3 December 2021
ER -