Crowdsourcing Scholarly Discourse Annotations

Publication: Contribution to book/report/anthology/conference proceedings › Paper in conference proceedings › Research › Peer reviewed

Authors

Organisational units

External organisations

  • Technische Informationsbibliothek (TIB) Leibniz-Informationszentrum Technik und Naturwissenschaften und Universitätsbibliothek

Details

Original language: English
Title of host publication: IUI '21: 26th International Conference on Intelligent User Interfaces
Publisher: Association for Computing Machinery (ACM)
Pages: 464-474
Number of pages: 11
ISBN (electronic): 9781450380171
Publication status: Published - 14 Apr 2021
Event: 26th International Conference on Intelligent User Interfaces: Where HCI Meets AI, IUI 2021 - Virtual, Online, United States
Duration: 14 Apr 2021 - 17 Apr 2021

Abstract

The number of scholarly publications grows steadily every year, making it harder to find, assess, and compare scholarly knowledge effectively. Scholarly knowledge graphs have the potential to address these challenges. However, creating such graphs remains a complex task. We propose a method to crowdsource structured scholarly knowledge from paper authors with a web-based user interface supported by artificial intelligence. The interface enables authors to select key sentences for annotation. It integrates multiple machine learning algorithms to assist authors during the annotation, including class recommendation and key sentence highlighting. We envision the interface being integrated into paper submission processes, for which we define three main task requirements: The task has to be . We evaluated the interface with a user study in which participants were assigned the task of annotating one of their own articles. With the resulting data, we determined whether the participants were able to perform the task successfully. Furthermore, we evaluated the interface's usability and the participants' attitudes towards the interface with a survey. The results suggest that sentence annotation is a feasible task for researchers and that they do not object to annotating their articles during the submission process.
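The assistance described in the abstract (class recommendation and key sentence highlighting) could be sketched, in a deliberately naive form, as a cue-phrase matcher. The discourse classes and cue words below are illustrative assumptions for this sketch, not the classes or classifiers used in the paper:

```python
# Hypothetical sketch of annotation assistance: recommend a discourse
# class for a sentence and flag likely key sentences. The class labels
# and cue phrases are invented for illustration only.

CUES = {
    "Background": ["previous work", "existing approaches", "it is known"],
    "Method": ["we propose", "we present", "our approach"],
    "Result": ["results show", "we found", "results suggest"],
}


def recommend_class(sentence: str):
    """Return the first discourse class whose cue phrase occurs in the sentence."""
    lowered = sentence.lower()
    for label, cues in CUES.items():
        if any(cue in lowered for cue in cues):
            return label
    return None


def is_key_sentence(sentence: str) -> bool:
    """Flag a sentence as a highlighting candidate if any cue phrase matches."""
    return recommend_class(sentence) is not None
```

For example, `recommend_class("We propose a method to crowdsource annotations.")` returns `"Method"`, while a sentence with no matching cue yields no recommendation. A real system would replace the cue lists with trained classifiers, but the interface contract (sentence in, suggested class out) stays the same.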

ASJC Scopus subject areas

Cite

Crowdsourcing Scholarly Discourse Annotations. / Oelen, Allard; Stocker, Markus; Auer, Sören.
IUI '21: 26th International Conference on Intelligent User Interfaces. Association for Computing Machinery (ACM), 2021. pp. 464-474.


Oelen, A, Stocker, M & Auer, S 2021, Crowdsourcing Scholarly Discourse Annotations. in IUI '21: 26th International Conference on Intelligent User Interfaces. Association for Computing Machinery (ACM), pp. 464-474, 26th International Conference on Intelligent User Interfaces: Where HCI Meets AI, IUI 2021, Virtual, Online, United States, 14 Apr 2021. https://doi.org/10.1145/3397481.3450685
Oelen, A., Stocker, M., & Auer, S. (2021). Crowdsourcing Scholarly Discourse Annotations. In IUI '21: 26th International Conference on Intelligent User Interfaces (pp. 464-474). Association for Computing Machinery (ACM). https://doi.org/10.1145/3397481.3450685
Oelen A, Stocker M, Auer S. Crowdsourcing Scholarly Discourse Annotations. In IUI '21: 26th International Conference on Intelligent User Interfaces. Association for Computing Machinery (ACM). 2021. pp. 464-474. doi: 10.1145/3397481.3450685
Oelen, Allard ; Stocker, Markus ; Auer, Sören. / Crowdsourcing Scholarly Discourse Annotations. IUI '21: 26th International Conference on Intelligent User Interfaces. Association for Computing Machinery (ACM), 2021. pp. 464-474
BibTeX
@inproceedings{5c1c664c85e548e9b520da0b1b42f878,
title = "Crowdsourcing Scholarly Discourse Annotations",
abstract = "The number of scholarly publications grows steadily every year, making it harder to find, assess, and compare scholarly knowledge effectively. Scholarly knowledge graphs have the potential to address these challenges. However, creating such graphs remains a complex task. We propose a method to crowdsource structured scholarly knowledge from paper authors with a web-based user interface supported by artificial intelligence. The interface enables authors to select key sentences for annotation. It integrates multiple machine learning algorithms to assist authors during the annotation, including class recommendation and key sentence highlighting. We envision the interface being integrated into paper submission processes, for which we define three main task requirements: The task has to be . We evaluated the interface with a user study in which participants were assigned the task of annotating one of their own articles. With the resulting data, we determined whether the participants were able to perform the task successfully. Furthermore, we evaluated the interface's usability and the participants' attitudes towards the interface with a survey. The results suggest that sentence annotation is a feasible task for researchers and that they do not object to annotating their articles during the submission process.",
keywords = "Crowdsourcing Text Annotations, Intelligent User Interface, Knowledge Graph Construction, Structured Scholarly Knowledge, Web-based Annotation Interface",
author = "Allard Oelen and Markus Stocker and S{\"o}ren Auer",
note = "Funding Information: This work was co-funded by the European Research Council for the project ScienceGRAPH (Grant agreement ID: 819536) and the TIB Leibniz Information Centre for Science and Technology. The publication of this article was funded by the Open Access Fund of Technische Informationsbibliothek (TIB). We want to thank our colleague Mohamad Yaser Jaradeh for his contributions to this work. ; 26th International Conference on Intelligent User Interfaces: Where HCI Meets AI, IUI 2021 ; Conference date: 14-04-2021 Through 17-04-2021",
year = "2021",
month = apr,
day = "14",
doi = "10.1145/3397481.3450685",
language = "English",
pages = "464--474",
booktitle = "IUI '21: 26th International Conference on Intelligent User Interfaces",
publisher = "Association for Computing Machinery (ACM)",
address = "United States",

}

RIS

TY - GEN

T1 - Crowdsourcing Scholarly Discourse Annotations

AU - Oelen, Allard

AU - Stocker, Markus

AU - Auer, Sören

N1 - Funding Information: This work was co-funded by the European Research Council for the project ScienceGRAPH (Grant agreement ID: 819536) and the TIB Leibniz Information Centre for Science and Technology. The publication of this article was funded by the Open Access Fund of Technische Informationsbibliothek (TIB). We want to thank our colleague Mohamad Yaser Jaradeh for his contributions to this work.

PY - 2021/4/14

Y1 - 2021/4/14

N2 - The number of scholarly publications grows steadily every year, making it harder to find, assess, and compare scholarly knowledge effectively. Scholarly knowledge graphs have the potential to address these challenges. However, creating such graphs remains a complex task. We propose a method to crowdsource structured scholarly knowledge from paper authors with a web-based user interface supported by artificial intelligence. The interface enables authors to select key sentences for annotation. It integrates multiple machine learning algorithms to assist authors during the annotation, including class recommendation and key sentence highlighting. We envision the interface being integrated into paper submission processes, for which we define three main task requirements: The task has to be . We evaluated the interface with a user study in which participants were assigned the task of annotating one of their own articles. With the resulting data, we determined whether the participants were able to perform the task successfully. Furthermore, we evaluated the interface's usability and the participants' attitudes towards the interface with a survey. The results suggest that sentence annotation is a feasible task for researchers and that they do not object to annotating their articles during the submission process.

AB - The number of scholarly publications grows steadily every year, making it harder to find, assess, and compare scholarly knowledge effectively. Scholarly knowledge graphs have the potential to address these challenges. However, creating such graphs remains a complex task. We propose a method to crowdsource structured scholarly knowledge from paper authors with a web-based user interface supported by artificial intelligence. The interface enables authors to select key sentences for annotation. It integrates multiple machine learning algorithms to assist authors during the annotation, including class recommendation and key sentence highlighting. We envision the interface being integrated into paper submission processes, for which we define three main task requirements: The task has to be . We evaluated the interface with a user study in which participants were assigned the task of annotating one of their own articles. With the resulting data, we determined whether the participants were able to perform the task successfully. Furthermore, we evaluated the interface's usability and the participants' attitudes towards the interface with a survey. The results suggest that sentence annotation is a feasible task for researchers and that they do not object to annotating their articles during the submission process.

KW - Crowdsourcing Text Annotations

KW - Intelligent User Interface

KW - Knowledge Graph Construction

KW - Structured Scholarly Knowledge

KW - Web-based Annotation Interface

UR - http://www.scopus.com/inward/record.url?scp=85104549080&partnerID=8YFLogxK

U2 - 10.1145/3397481.3450685

DO - 10.1145/3397481.3450685

M3 - Conference contribution

AN - SCOPUS:85104549080

SP - 464

EP - 474

BT - IUI '21: 26th International Conference on Intelligent User Interfaces

PB - Association for Computing Machinery (ACM)

T2 - 26th International Conference on Intelligent User Interfaces: Where HCI Meets AI, IUI 2021

Y2 - 14 April 2021 through 17 April 2021

ER -
