Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

External organisations

  • Universität Bremen

Details

Original language: English
Title of host publication: Proceedings of the 6th Workshop on Argument Mining
Editors: Benno Stein, Henning Wachsmuth
Place of publication: Florence
Pages: 74-82
Number of pages: 9
ISBN (electronic): 9781950737338
Publication status: Published - Aug. 2019
Event: 6th Workshop on Argument Mining, ArgMining 2019 - Florence, Italy
Duration: 1 Aug. 2019 - 1 Aug. 2019

Abstract

Attention mechanisms have seen some success on natural language processing downstream tasks in recent years and have generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is, however, missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings also do not improve on the scores achieved by pre-generated embeddings.
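The abstract above compares a BiLSTM tagger with and without an added attention layer. As a rough illustration only, and not the authors' code, the following NumPy sketch shows one common additive (Bahdanau-style) attention formulation applied to a matrix of hypothetical BiLSTM hidden states `H`; the parameter names `w`, `b`, and `v` are assumptions for this sketch.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_attention(H, w, b, v):
    """Additive attention over a sequence of hidden states.

    H: (seq_len, hidden) matrix of per-token BiLSTM outputs.
    w, b, v: learned projection parameters (random placeholders here).
    Returns the attention weights and the reweighted states.
    """
    scores = np.tanh(H @ w + b) @ v   # one scalar score per token
    alpha = softmax(scores)           # weights sum to 1 across tokens
    context = alpha[:, None] * H      # states rescaled by their weight
    return alpha, context

# Toy example: 5 tokens, hidden size 8, randomly initialised parameters.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))
w = rng.standard_normal((8, 8))
b = rng.standard_normal(8)
v = rng.standard_normal(8)
alpha, context = additive_attention(H, w, b, v)
```

In a full segmentation model, the reweighted states would feed a per-token classifier that labels argument unit boundaries; the paper's reported finding is that adding such a layer did not improve over the plain BiLSTM.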

ASJC Scopus subject areas

Cite

Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. / Spliethöver, Maximilian; Klaff, Jonas; Heuer, Hendrik.
Proceedings of the 6th Workshop on Argument Mining. ed. / Benno Stein; Henning Wachsmuth. Florence, 2019. pp. 74-82.


Spliethöver, M, Klaff, J & Heuer, H 2019, Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. in B Stein & H Wachsmuth (eds), Proceedings of the 6th Workshop on Argument Mining. Florence, pp. 74-82, 6th Workshop on Argument Mining, ArgMining 2019, Florence, Italy, 1 Aug. 2019. https://doi.org/10.18653/v1/W19-4509
Spliethöver, M., Klaff, J., & Heuer, H. (2019). Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. In B. Stein, & H. Wachsmuth (Eds.), Proceedings of the 6th Workshop on Argument Mining (pp. 74-82). https://doi.org/10.18653/v1/W19-4509
Spliethöver M, Klaff J, Heuer H. Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. In Stein B, Wachsmuth H, editors, Proceedings of the 6th Workshop on Argument Mining. Florence. 2019. p. 74-82. doi: 10.18653/v1/W19-4509
Spliethöver, Maximilian ; Klaff, Jonas ; Heuer, Hendrik. / Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation. Proceedings of the 6th Workshop on Argument Mining. ed. / Benno Stein ; Henning Wachsmuth. Florence, 2019. pp. 74-82
BibTeX
@inproceedings{be53cca88fb94777a7ddcae7d27514ac,
title = "Is It Worth the Attention?: A Comparative Evaluation of Attention Layers for Argument Unit Segmentation",
abstract = "Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.",
author = "Maximilian Splieth{\"o}ver and Jonas Klaff and Hendrik Heuer",
note = "(c) 2019 Association for Computational Linguistics; 6th Workshop on Argument Mining, ArgMining 2019 ; Conference date: 01-08-2019 Through 01-08-2019",
year = "2019",
month = aug,
doi = "10.18653/v1/W19-4509",
language = "English",
pages = "74--82",
editor = "Benno Stein and Henning Wachsmuth",
booktitle = "Proceedings of the 6th Workshop on Argument Mining",

}

RIS

TY - GEN

T1 - Is It Worth the Attention?

T2 - 6th Workshop on Argument Mining, ArgMining 2019

AU - Spliethöver, Maximilian

AU - Klaff, Jonas

AU - Heuer, Hendrik

N1 - (c) 2019 Association for Computational Linguistics

PY - 2019/8

Y1 - 2019/8

N2 - Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.

AB - Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.

UR - http://www.scopus.com/inward/record.url?scp=85102430148&partnerID=8YFLogxK

U2 - 10.18653/v1/W19-4509

DO - 10.18653/v1/W19-4509

M3 - Conference contribution

SP - 74

EP - 82

BT - Proceedings of the 6th Workshop on Argument Mining

A2 - Stein, Benno

A2 - Wachsmuth, Henning

CY - Florence

Y2 - 1 August 2019 through 1 August 2019

ER -
