Details
Original language | English
---|---
Title of host publication | Proceedings of the 6th Workshop on Argument Mining
Editors | Benno Stein, Henning Wachsmuth
Place of publication | Florence
Pages | 74-82
Number of pages | 9
ISBN (electronic) | 9781950737338
Publication status | Published - Aug 2019
Event | 6th Workshop on Argument Mining, ArgMining 2019 - Florence, Italy; Duration: 1 Aug 2019 → 1 Aug 2019
Abstract
Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.
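The evaluated setup, a BiLSTM token tagger with an optional attention layer on top of the recurrent states, can be sketched roughly as follows. This is a minimal, illustrative PyTorch sketch only: the class name, layer sizes, additive self-attention formulation, and three-label scheme are assumptions for demonstration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class BiLSTMSegmenter(nn.Module):
    """Token-level unit segmentation: BiLSTM encoder with an optional
    additive self-attention layer (hypothetical sketch, not the paper's code)."""

    def __init__(self, emb_dim=300, hidden=128, num_labels=3, use_attention=True):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.use_attention = use_attention
        if use_attention:
            # additive attention: score each token's BiLSTM state
            self.attn_w = nn.Linear(2 * hidden, 2 * hidden)
            self.attn_v = nn.Linear(2 * hidden, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden, num_labels)

    def forward(self, embeddings):            # embeddings: (batch, seq_len, emb_dim)
        h, _ = self.lstm(embeddings)          # (batch, seq_len, 2*hidden)
        if self.use_attention:
            scores = self.attn_v(torch.tanh(self.attn_w(h)))  # (batch, seq_len, 1)
            weights = torch.softmax(scores, dim=1)             # attention over tokens
            context = (weights * h).sum(dim=1, keepdim=True)   # (batch, 1, 2*hidden)
            h = h + context                                    # add sequence context to each token
        return self.classifier(h)             # per-token label logits

# Example: 2 sentences of 10 tokens with 300-dim (pre-generated or contextualized) embeddings
model = BiLSTMSegmenter()
logits = model(torch.randn(2, 10, 300))       # -> (2, 10, 3)
```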
ASJC Scopus subject areas
- Computer Science (all)
- Software
- Arts and Humanities (all)
- Language and Linguistics
- Social Sciences (all)
- Linguistics and Language
Cite
Proceedings of the 6th Workshop on Argument Mining. Ed. / Benno Stein; Henning Wachsmuth. Florence, 2019. pp. 74-82.
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › Peer-review
TY - GEN
T1 - Is It Worth the Attention?
T2 - 6th Workshop on Argument Mining, ArgMining 2019
AU - Spliethöver, Maximilian
AU - Klaff, Jonas
AU - Heuer, Hendrik
N1 - (c) 2019 Association for Computational Linguistics
PY - 2019/8
Y1 - 2019/8
N2 - Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.
AB - Attention mechanisms have seen some success for natural language processing downstream tasks in recent years and generated new state-of-the-art results. A thorough evaluation of the attention mechanism for the task of Argumentation Mining is missing. With this paper, we report a comparative evaluation of attention layers in combination with a bidirectional long short-term memory network, which is the current state-of-the-art approach for the unit segmentation task. We also compare sentence-level contextualized word embeddings to pre-generated ones. Our findings suggest that for this task, the additional attention layer does not improve the performance. In most cases, contextualized embeddings do also not show an improvement on the score achieved by pre-defined embeddings.
UR - http://www.scopus.com/inward/record.url?scp=85102430148&partnerID=8YFLogxK
U2 - 10.18653/v1/W19-4509
DO - 10.18653/v1/W19-4509
M3 - Conference contribution
SP - 74
EP - 82
BT - Proceedings of the 6th Workshop on Argument Mining
A2 - Stein, Benno
A2 - Wachsmuth, Henning
CY - Florence
Y2 - 1 August 2019 through 1 August 2019
ER -