Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers

Publication: Chapter in book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

  • Arthur Brack
  • Anett Hoppe
  • Pascal Buschermöhle
  • Ralph Ewerth

Organisational units

External organisations

  • Technische Informationsbibliothek (TIB) Leibniz-Informationszentrum Technik und Naturwissenschaften und Universitätsbibliothek

Details

Original language: English
Title of host publication: JCDL 2022
Subtitle of host publication: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9781450393454
Publication status: Published - 20 June 2022
Event: 22nd ACM/IEEE Joint Conference on Digital Libraries, JCDL 2022 - Virtual, Online, Germany
Duration: 20 June 2022 - 24 June 2022

Publication series

Name: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
ISSN (Print): 1552-5996

Abstract

Sequential sentence classification deals with the categorisation of sentences based on their content and context. Applied to scientific texts, it enables the automatic structuring of research papers and the improvement of academic search engines. However, previous work has not investigated the potential of transfer learning for sentence classification across different scientific domains, nor the issue of the different text structure of full papers and abstracts. In this paper, we derive seven related research questions and present several contributions to address them: First, we suggest a novel uniform deep learning architecture and multi-task learning for cross-domain sequential sentence classification in scientific texts. Second, we tailor two common transfer learning methods, sequential transfer learning and multi-task learning, to deal with the challenges of the given task. Semantic relatedness of tasks is a prerequisite for successful transfer learning of neural models. Consequently, our third contribution is an approach to semi-automatically identify semantically related classes from different annotation schemes, and we present an analysis of four annotation schemes. Comprehensive experimental results indicate that models trained on datasets from different scientific domains benefit from one another when using the proposed multi-task learning architecture. We also report comparisons with several state-of-the-art approaches. Our approach significantly outperforms the state of the art on full-paper datasets while being on par on datasets consisting of abstracts.
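The multi-task setup described in the abstract (a shared sentence encoder with one classification head per dataset, so that corpora with different annotation schemes can be trained jointly) can be illustrated with a toy sketch. Everything below is hypothetical: the feature "encoder", the head weights, and the dataset/label names are illustrative placeholders, not the authors' architecture or data.

```python
# Hypothetical sketch of a shared encoder + per-dataset classification heads.
# The encoder and all weights are made up for illustration only.

def encode(sentence):
    """Toy shared encoder: map a sentence to a fixed-size feature vector
    from simple surface cues (shared across all tasks/datasets)."""
    tokens = sentence.lower().split()
    return [
        len(tokens),                              # sentence length
        sum(t.endswith("ed") for t in tokens),    # past-tense cue
        sum(t in {"we", "our"} for t in tokens),  # author-voice cue
    ]

# One head (here just a weight vector per label) per annotation scheme.
# Dataset names and labels are invented placeholders.
HEADS = {
    "dataset_a": {"BACKGROUND": [0.1, 0.2, -0.1], "METHODS": [0.0, 0.5, 0.3]},
    "dataset_b": {"objective": [0.2, -0.1, 0.4], "method": [0.1, 0.3, 0.2]},
}

def classify(sentence, task):
    """Encode the sentence once, then score it only under the head
    belonging to the requested task's label scheme."""
    feats = encode(sentence)
    scores = {
        label: sum(w * f for w, f in zip(weights, feats))
        for label, weights in HEADS[task].items()
    }
    return max(scores, key=scores.get)
```

The design point is that `encode` is shared (and would be trained on all datasets jointly), while each dataset keeps its own output layer, so label schemes never need to be merged.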

ASJC Scopus subject areas

Cite

Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers. / Brack, Arthur; Hoppe, Anett; Buschermöhle, Pascal et al.
JCDL 2022 : Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022. Institute of Electrical and Electronics Engineers Inc., 2022. 34 (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries).


Brack, A, Hoppe, A, Buschermöhle, P & Ewerth, R 2022, Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers. in JCDL 2022: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022., 34, Proceedings of the ACM/IEEE Joint Conference on Digital Libraries, Institute of Electrical and Electronics Engineers Inc., 22nd ACM/IEEE Joint Conference on Digital Libraries, JCDL 2022, Virtual, Online, Germany, 20 June 2022. https://doi.org/10.48550/arXiv.2102.06008, https://doi.org/10.1145/3529372.3530922
Brack, A., Hoppe, A., Buschermöhle, P., & Ewerth, R. (2022). Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers. In JCDL 2022 : Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022 Artikel 34 (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries). Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.48550/arXiv.2102.06008, https://doi.org/10.1145/3529372.3530922
Brack A, Hoppe A, Buschermöhle P, Ewerth R. Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers. in JCDL 2022: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022. Institute of Electrical and Electronics Engineers Inc. 2022. 34. (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries). doi: 10.48550/arXiv.2102.06008, 10.1145/3529372.3530922
Brack, Arthur ; Hoppe, Anett ; Buschermöhle, Pascal et al. / Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers. JCDL 2022 : Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022. Institute of Electrical and Electronics Engineers Inc., 2022. (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries).
Download (BibTeX)
@inproceedings{a319362e40e240479fc42e2e256514f5,
title = "Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers",
abstract = "Sequential sentence classification deals with the categorisation of sentences based on their content and context. Applied to scientific texts, it enables the automatic structuring of research papers and the improvement of academic search engines. However, previous work has not investigated the potential of transfer learning for sentence classification across different scientific domains and the issue of different text structure of full papers and abstracts. In this paper, we derive seven related research questions and present several contributions to address them: First, we suggest a novel uniform deep learning architecture and multi-task learning for cross-domain sequential sentence classification in scientific texts. Second, we tailor two common transfer learning methods, sequential transfer learning and multi-task learning, to deal with the challenges of the given task. Semantic relatedness of tasks is a prerequisite for successful transfer learning of neural models. Consequently, our third contribution is an approach to semi-automatically identify semantically related classes from different annotation schemes and we present an analysis of four annotation schemes. Comprehensive experimental results indicate that models, which are trained on datasets from different scientific domains, benefit from one another when using the proposed multi-task learning architecture. We also report comparisons with several state-of-the-art approaches. Our approach outperforms the state of the art on full paper datasets significantly while being on par for datasets consisting of abstracts.",
keywords = "Multi-task learning, Scholarly communication, Sequential sentence classification, Transfer learning, Zone identification",
author = "Arthur Brack and Anett Hoppe and Pascal Buscherm{\"o}hle and Ralph Ewerth",
year = "2022",
month = jun,
day = "20",
doi = "10.48550/arXiv.2102.06008",
language = "English",
series = "Proceedings of the ACM/IEEE Joint Conference on Digital Libraries",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
booktitle = "JCDL 2022",
address = "United States",
note = "22nd ACM/IEEE Joint Conference on Digital Libraries, JCDL 2022 ; Conference date: 20-06-2022 Through 24-06-2022",

}

Download (RIS)

TY - GEN

T1 - Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers

AU - Brack, Arthur

AU - Hoppe, Anett

AU - Buschermöhle, Pascal

AU - Ewerth, Ralph

PY - 2022/6/20

Y1 - 2022/6/20

N2 - Sequential sentence classification deals with the categorisation of sentences based on their content and context. Applied to scientific texts, it enables the automatic structuring of research papers and the improvement of academic search engines. However, previous work has not investigated the potential of transfer learning for sentence classification across different scientific domains and the issue of different text structure of full papers and abstracts. In this paper, we derive seven related research questions and present several contributions to address them: First, we suggest a novel uniform deep learning architecture and multi-task learning for cross-domain sequential sentence classification in scientific texts. Second, we tailor two common transfer learning methods, sequential transfer learning and multi-task learning, to deal with the challenges of the given task. Semantic relatedness of tasks is a prerequisite for successful transfer learning of neural models. Consequently, our third contribution is an approach to semi-automatically identify semantically related classes from different annotation schemes and we present an analysis of four annotation schemes. Comprehensive experimental results indicate that models, which are trained on datasets from different scientific domains, benefit from one another when using the proposed multi-task learning architecture. We also report comparisons with several state-of-the-art approaches. Our approach outperforms the state of the art on full paper datasets significantly while being on par for datasets consisting of abstracts.

AB - Sequential sentence classification deals with the categorisation of sentences based on their content and context. Applied to scientific texts, it enables the automatic structuring of research papers and the improvement of academic search engines. However, previous work has not investigated the potential of transfer learning for sentence classification across different scientific domains and the issue of different text structure of full papers and abstracts. In this paper, we derive seven related research questions and present several contributions to address them: First, we suggest a novel uniform deep learning architecture and multi-task learning for cross-domain sequential sentence classification in scientific texts. Second, we tailor two common transfer learning methods, sequential transfer learning and multi-task learning, to deal with the challenges of the given task. Semantic relatedness of tasks is a prerequisite for successful transfer learning of neural models. Consequently, our third contribution is an approach to semi-automatically identify semantically related classes from different annotation schemes and we present an analysis of four annotation schemes. Comprehensive experimental results indicate that models, which are trained on datasets from different scientific domains, benefit from one another when using the proposed multi-task learning architecture. We also report comparisons with several state-of-the-art approaches. Our approach outperforms the state of the art on full paper datasets significantly while being on par for datasets consisting of abstracts.

KW - Multi-task learning

KW - Scholarly communication

KW - Sequential sentence classification

KW - Transfer learning

KW - Zone identification

UR - http://www.scopus.com/inward/record.url?scp=85133269042&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2102.06008

DO - 10.48550/arXiv.2102.06008

M3 - Conference contribution

AN - SCOPUS:85133269042

T3 - Proceedings of the ACM/IEEE Joint Conference on Digital Libraries

BT - JCDL 2022

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 22nd ACM/IEEE Joint Conference on Digital Libraries, JCDL 2022

Y2 - 20 June 2022 through 24 June 2022

ER -