Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Maximilian Spliethöver
  • Jonas Klaff
  • Hendrik Heuer

External Research Organisations

  • University of Bremen

Details

Original language: English
Title of host publication: Proceedings of the 6th Workshop on Argument Mining
Editors: Benno Stein, Henning Wachsmuth
Place of publication: Florence
Pages: 74-82
Number of pages: 9
ISBN (electronic): 9781950737338
Publication status: Published - Aug 2019
Event: 6th Workshop on Argument Mining, ArgMining 2019 - Florence, Italy
Duration: 1 Aug 2019 - 1 Aug 2019

Abstract

Attention mechanisms have seen some success for downstream natural language processing tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is still missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that, for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings also do not show an improvement over the score achieved by pre-defined embeddings.
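The paper's exact architecture is not reproduced on this page; as a rough illustration of the kind of layer being evaluated, the sketch below computes Bahdanau-style additive attention over a stand-in matrix of BiLSTM hidden states and concatenates the resulting context vector onto each token representation before tagging. All shapes, weight matrices, and names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_attention(H, W, v):
    # H: (T, d) per-token hidden states, here stand-ins for BiLSTM outputs
    # additive (Bahdanau-style) scoring: e_t = v . tanh(W h_t)
    scores = np.tanh(H @ W.T) @ v          # (T,)
    alpha = softmax(scores)                # attention weights, sum to 1
    context = alpha @ H                    # (d,) attention-weighted summary
    # append the shared context to every token before the tagging layer
    augmented = np.concatenate([H, np.tile(context, (H.shape[0], 1))], axis=1)
    return augmented, alpha

rng = np.random.default_rng(0)
T, d = 5, 8                               # 5 tokens, hidden size 8 (toy values)
H = rng.normal(size=(T, d))               # pretend BiLSTM hidden states
W = rng.normal(size=(d, d))
v = rng.normal(size=d)
augmented, alpha = additive_attention(H, W, v)
print(augmented.shape)                    # (5, 16)
```

The per-token `augmented` vectors would then feed a token-level classifier producing BIO-style argument-unit labels; the paper's finding is that this extra attention step does not help the segmentation scores.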

Cite this

Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. / Spliethöver, Maximilian; Klaff, Jonas; Heuer, Hendrik.
Proceedings of the 6th Workshop on Argument Mining. ed. / Benno Stein; Henning Wachsmuth. Florence, 2019. p. 74-82.

Spliethöver, M, Klaff, J & Heuer, H 2019, Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. in B Stein & H Wachsmuth (eds), Proceedings of the 6th Workshop on Argument Mining. Florence, pp. 74-82, 6th Workshop on Argument Mining, ArgMining 2019, Florence, Italy, 1 Aug 2019. https://doi.org/10.18653/v1/W19-4509
Spliethöver, M., Klaff, J., & Heuer, H. (2019). Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. In B. Stein, & H. Wachsmuth (Eds.), Proceedings of the 6th Workshop on Argument Mining (pp. 74-82). https://doi.org/10.18653/v1/W19-4509
Spliethöver M, Klaff J, Heuer H. Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. In Stein B, Wachsmuth H, editors, Proceedings of the 6th Workshop on Argument Mining. Florence. 2019. p. 74-82. doi: 10.18653/v1/W19-4509
Spliethöver, Maximilian ; Klaff, Jonas ; Heuer, Hendrik. / Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. Proceedings of the 6th Workshop on Argument Mining. editor / Benno Stein ; Henning Wachsmuth. Florence, 2019. pp. 74-82
BibTeX
@inproceedings{be53cca88fb94777a7ddcae7d27514ac,
title = "Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation",
abstract = "Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.",
author = "Maximilian Splieth{\"o}ver and Jonas Klaff and Hendrik Heuer",
note = "(c) 2019 Association for Computational Linguistics; 6th Workshop on Argument Mining, ArgMining 2019 ; Conference date: 01-08-2019 Through 01-08-2019",
year = "2019",
month = aug,
doi = "10.18653/v1/W19-4509",
language = "English",
pages = "74--82",
editor = "Benno Stein and Henning Wachsmuth",
booktitle = "Proceedings of the 6th Workshop on Argument Mining",

}

RIS

TY - GEN
T1 - Is It Worth the Attention?
T2 - 6th Workshop on Argument Mining, ArgMining 2019
AU - Spliethöver, Maximilian
AU - Klaff, Jonas
AU - Heuer, Hendrik
N1 - (c) 2019 Association for Computational Linguistics
PY - 2019/8
Y1 - 2019/8
N2 - Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.
AB - Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.
UR - http://www.scopus.com/inward/record.url?scp=85102430148&partnerID=8YFLogxK
U2 - 10.18653/v1/W19-4509
DO - 10.18653/v1/W19-4509
M3 - Conference contribution
SP - 74
EP - 82
BT - Proceedings of the 6th Workshop on Argument Mining
A2 - Stein, Benno
A2 - Wachsmuth, Henning
CY - Florence
Y2 - 1 August 2019 through 1 August 2019
ER -
