Details
Original language | English |
---|---|
Title of host publication | Database and Expert Systems Applications |
Editors | Christine Strauss, Toshiyuki Amagasa, Gabriele Kotsis, Ismail Khalil, A Min Tjoa |
Pages | 508-515 |
Number of pages | 8 |
ISBN (electronic) | 978-3-031-39847-6 |
Publication status | Published - 2023 |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Volume | 14146 LNCS |
ISSN (Print) | 0302-9743 |
ISSN (electronic) | 1611-3349 |
Abstract
Recent investigations have explored prompt-based training of transformer language models for new text genres in low-resource settings. This approach has proven effective in transferring pre-trained or fine-tuned models to resource-scarce environments. This work presents the first results on applying prompt-based training to transformers for scholarly knowledge graph object prediction. Methodologically, it stands out in two main ways: 1) it deviates from previous studies that propose entity and relation extraction pipelines, and 2) it tests the method in a significantly different domain, scholarly knowledge, evaluating linguistic, probabilistic, and factual generalizability of large-scale transformer models. Our findings demonstrate that: i) out-of-the-box transformer models underperform on the new scholarly domain, ii) prompt-based training improves performance by up to 40% in relaxed evaluation, and iii) tests of the models in a distinct domain reveal a gap in capturing domain knowledge, highlighting the need for increased attention and resources in the scholarly domain for transformer models.
Keywords
- Knowledge Graph Completion
- Natural Language Processing
- Open Research Knowledge Graph
- Prompt-based Question Answering
- Question Answering
ASJC Scopus subject areas
- Mathematics (all)
- Theoretical Computer Science
- Computer Science (all)
Cite this
Database and Expert Systems Applications. ed. / Christine Strauss; Toshiyuki Amagasa; Gabriele Kotsis; Ismail Khalil; A Min Tjoa. 2023. p. 508-515 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 14146 LNCS).
Research output: Chapter in book/report/conference proceeding › Contribution to book/anthology › Research › peer review
TY - CHAP
T1 - Evaluating Prompt-Based Question Answering for Object Prediction in the Open Research Knowledge Graph
AU - D’Souza, Jennifer
AU - Hrou, Moussab
AU - Auer, Sören
N1 - Publisher Copyright: © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023.
PY - 2023
Y1 - 2023
N2 - Recent investigations have explored prompt-based training of transformer language models for new text genres in low-resource settings. This approach has proven effective in transferring pre-trained or fine-tuned models to resource-scarce environments. This work presents the first results on applying prompt-based training to transformers for scholarly knowledge graph object prediction. Methodologically, it stands out in two main ways: 1) it deviates from previous studies that propose entity and relation extraction pipelines, and 2) it tests the method in a significantly different domain, scholarly knowledge, evaluating linguistic, probabilistic, and factual generalizability of large-scale transformer models. Our findings demonstrate that: i) out-of-the-box transformer models underperform on the new scholarly domain, ii) prompt-based training improves performance by up to 40% in relaxed evaluation, and iii) tests of the models in a distinct domain reveal a gap in capturing domain knowledge, highlighting the need for increased attention and resources in the scholarly domain for transformer models.
AB - Recent investigations have explored prompt-based training of transformer language models for new text genres in low-resource settings. This approach has proven effective in transferring pre-trained or fine-tuned models to resource-scarce environments. This work presents the first results on applying prompt-based training to transformers for scholarly knowledge graph object prediction. Methodologically, it stands out in two main ways: 1) it deviates from previous studies that propose entity and relation extraction pipelines, and 2) it tests the method in a significantly different domain, scholarly knowledge, evaluating linguistic, probabilistic, and factual generalizability of large-scale transformer models. Our findings demonstrate that: i) out-of-the-box transformer models underperform on the new scholarly domain, ii) prompt-based training improves performance by up to 40% in relaxed evaluation, and iii) tests of the models in a distinct domain reveal a gap in capturing domain knowledge, highlighting the need for increased attention and resources in the scholarly domain for transformer models.
KW - Knowledge Graph Completion
KW - Natural Language Processing
KW - Open Research Knowledge Graph
KW - Prompt-based Question Answering
KW - Question Answering
UR - http://www.scopus.com/inward/record.url?scp=85174710939&partnerID=8YFLogxK
U2 - 10.48550/arXiv.2305.12900
DO - 10.48550/arXiv.2305.12900
M3 - Contribution to book/anthology
SN - 978-3-031-39846-9
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 508
EP - 515
BT - Database and Expert Systems Applications
A2 - Strauss, Christine
A2 - Amagasa, Toshiyuki
A2 - Kotsis, Gabriele
A2 - Khalil, Ismail
A2 - Tjoa, A Min
ER -