Details
Original language | English
---|---
Title of host publication | JCDL 2022
Subtitle of host publication | Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022
Publisher | Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic) | 9781450393454
Publication status | Published - 20 Jun 2022
Event | 22nd ACM/IEEE Joint Conference on Digital Libraries, JCDL 2022 - Virtual/Online, Germany. Duration: 20 Jun 2022 → 24 Jun 2022
Publication series
Name | Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
---|---
ISSN (Print) | 1552-5996
Abstract
Sequential sentence classification deals with the categorisation of sentences based on their content and context. Applied to scientific texts, it enables the automatic structuring of research papers and the improvement of academic search engines. However, previous work has not investigated the potential of transfer learning for sentence classification across different scientific domains, nor the issue of the differing text structure of full papers and abstracts. In this paper, we derive seven related research questions and present several contributions to address them: First, we suggest a novel uniform deep learning architecture and multi-task learning for cross-domain sequential sentence classification in scientific texts. Second, we tailor two common transfer learning methods, sequential transfer learning and multi-task learning, to deal with the challenges of the given task. Semantic relatedness of tasks is a prerequisite for successful transfer learning of neural models. Consequently, our third contribution is an approach to semi-automatically identify semantically related classes from different annotation schemes, and we present an analysis of four annotation schemes. Comprehensive experimental results indicate that models trained on datasets from different scientific domains benefit from one another when using the proposed multi-task learning architecture. We also report comparisons with several state-of-the-art approaches. Our approach significantly outperforms the state of the art on full-paper datasets while being on par for datasets consisting of abstracts.
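The shared-encoder, task-specific-head pattern that the abstract describes for cross-domain multi-task learning can be sketched as follows. This is a toy illustration only: the dimensions, scheme names (`"pubmed"`, `"csabstruct"`), and single-layer encoder are made up for the example and are not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: sentence embeddings of size D, a shared hidden
# size H, and two annotation schemes ("tasks") with different label sets.
D, H = 16, 8
num_labels = {"pubmed": 5, "csabstruct": 4}  # illustrative schemes

# Shared encoder parameters, reused by every task.
W_shared = rng.normal(scale=0.1, size=(D, H))

# One classification head per task/annotation scheme.
heads = {task: rng.normal(scale=0.1, size=(H, n))
         for task, n in num_labels.items()}

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def predict(sent_embs, task):
    """Run sentences through the shared encoder, then the task's head."""
    h = np.tanh(sent_embs @ W_shared)   # shared representation
    return softmax(h @ heads[task])     # task-specific label distribution

# A "paper" of 3 sentences, each represented by a D-dim embedding.
sents = rng.normal(size=(3, D))
probs = predict(sents, "pubmed")
print(probs.shape)  # (3, 5): one label distribution per sentence
```

The intuition behind the paper's result shows up in the parameter layout: gradients from every dataset update `W_shared`, so domains with semantically related label classes can reinforce each other, while each scheme keeps its own output head.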
Keywords
- Multi-task learning, Scholarly communication, Sequential sentence classification, Transfer learning, Zone identification
Cite this
JCDL 2022: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2022. Institute of Electrical and Electronics Engineers Inc., 2022. 34 (Proceedings of the ACM/IEEE Joint Conference on Digital Libraries).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers
AU - Brack, Arthur
AU - Hoppe, Anett
AU - Buschermöhle, Pascal
AU - Ewerth, Ralph
PY - 2022/6/20
Y1 - 2022/6/20
N2 - Sequential sentence classification deals with the categorisation of sentences based on their content and context. Applied to scientific texts, it enables the automatic structuring of research papers and the improvement of academic search engines. However, previous work has not investigated the potential of transfer learning for sentence classification across different scientific domains, nor the issue of the differing text structure of full papers and abstracts. In this paper, we derive seven related research questions and present several contributions to address them: First, we suggest a novel uniform deep learning architecture and multi-task learning for cross-domain sequential sentence classification in scientific texts. Second, we tailor two common transfer learning methods, sequential transfer learning and multi-task learning, to deal with the challenges of the given task. Semantic relatedness of tasks is a prerequisite for successful transfer learning of neural models. Consequently, our third contribution is an approach to semi-automatically identify semantically related classes from different annotation schemes, and we present an analysis of four annotation schemes. Comprehensive experimental results indicate that models trained on datasets from different scientific domains benefit from one another when using the proposed multi-task learning architecture. We also report comparisons with several state-of-the-art approaches. Our approach significantly outperforms the state of the art on full-paper datasets while being on par for datasets consisting of abstracts.
KW - Multi-task learning
KW - Scholarly communication
KW - Sequential sentence classification
KW - Transfer learning
KW - Zone identification
UR - http://www.scopus.com/inward/record.url?scp=85133269042&partnerID=8YFLogxK
U2 - https://doi.org/10.48550/arXiv.2102.06008
DO - 10.48550/arXiv.2102.06008
M3 - Conference contribution
AN - SCOPUS:85133269042
T3 - Proceedings of the ACM/IEEE Joint Conference on Digital Libraries
BT - JCDL 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 22nd ACM/IEEE Joint Conference on Digital Libraries, JCDL 2022
Y2 - 20 June 2022 through 24 June 2022
ER -