Controlled Neural Sentence-Level Reframing of News Articles

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research

Authors

  • Wei Fan Chen, Khalid Al-Khatib, Benno Stein, Henning Wachsmuth

External Research Organisations

  • Paderborn University
  • Bauhaus-Universität Weimar

Details

Original language: English
Title of host publication: Findings of the Association for Computational Linguistics
Subtitle of host publication: EMNLP 2021
Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-Tau Yih
Place of Publication: Punta Cana
Pages: 2683-2693
Number of pages: 11
Publication status: Published - Nov 2021
Externally published: Yes
Event: 2021 Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021 - Punta Cana, Dominican Republic
Duration: 7 Nov 2021 - 11 Nov 2021

Abstract

Framing a news article means to portray the reported event from a specific perspective, e.g., from an economic or a health perspective. Reframing means to change this perspective. Depending on the audience or the submessage, reframing can become necessary to achieve the desired effect on the readers. Reframing is related to adapting style and sentiment, which can be tackled with neural text generation techniques. However, it is more challenging since changing a frame requires rewriting entire sentences rather than single phrases. In this paper, we study how to computationally reframe sentences in news articles while maintaining their coherence to the context. We treat reframing as a sentence-level fill-in-the-blank task for which we train neural models on an existing media frame corpus. To guide the training, we propose three strategies: framed-language pretraining, named-entity preservation, and adversarial learning. We evaluate respective models automatically and manually for topic consistency, coherence, and successful reframing. Our results indicate that generating properly framed text works well but with tradeoffs.
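
To make the fill-in-the-blank formulation concrete, below is a minimal, hypothetical sketch of sentence-level reframing as mask-and-infill generation with an off-the-shelf pretrained model. The model choice (facebook/bart-large), the reframe_sentence helper, and the plain-text frame prefix are illustrative assumptions; they are not the authors' implementation and do not reproduce the three training strategies named in the abstract.

# Hypothetical sketch only -- not the authors' code. Illustrates treating
# reframing as sentence-level infilling: the target sentence is masked and
# regenerated conditioned on its context and a desired frame label.
from transformers import BartForConditionalGeneration, BartTokenizer

MODEL_NAME = "facebook/bart-large"  # assumption: generic pretrained infilling model

tokenizer = BartTokenizer.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)

def reframe_sentence(context_before: str, context_after: str, frame: str) -> str:
    """Regenerate the masked sentence so it fits the context under `frame`.

    The frame label (e.g. "economic", "health") is prepended as a plain-text
    control prefix here; the paper's conditioning and guidance strategies
    (framed-language pretraining, named-entity preservation, adversarial
    learning) are not reproduced in this sketch.
    """
    source = f"frame: {frame} | {context_before} {tokenizer.mask_token} {context_after}"
    inputs = tokenizer(source, return_tensors="pt", truncation=True, max_length=512)
    output_ids = model.generate(**inputs, num_beams=4, max_length=128)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Illustrative usage:
# reframe_sentence("The new policy took effect on Monday.",
#                  "Officials expect further changes next year.",
#                  "economic")

A faithful implementation would additionally fine-tune such a model on the media frame corpus mentioned in the abstract and apply the three guidance strategies during training.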

Cite this

Controlled Neural Sentence-Level Reframing of News Articles. / Chen, Wei Fan; Al-Khatib, Khalid; Stein, Benno et al.
Findings of the Association for Computational Linguistics: EMNLP 2021. ed. / Marie-Francine Moens; Xuanjing Huang; Lucia Specia; Scott Wen-Tau Yih. Punta Cana, 2021. p. 2683-2693.

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research

Chen, WF, Al-Khatib, K, Stein, B & Wachsmuth, H 2021, Controlled Neural Sentence-Level Reframing of News Articles. in M-F Moens, X Huang, L Specia & SW-T Yih (eds), Findings of the Association for Computational Linguistics: EMNLP 2021. Punta Cana, pp. 2683-2693, 2021 Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021, Punta Cana, Dominican Republic, 7 Nov 2021. https://doi.org/10.18653/v1/2021.findings-emnlp.228
Chen, W. F., Al-Khatib, K., Stein, B., & Wachsmuth, H. (2021). Controlled Neural Sentence-Level Reframing of News Articles. In M.-F. Moens, X. Huang, L. Specia, & S. W.-T. Yih (Eds.), Findings of the Association for Computational Linguistics: EMNLP 2021 (pp. 2683-2693). https://doi.org/10.18653/v1/2021.findings-emnlp.228
Chen WF, Al-Khatib K, Stein B, Wachsmuth H. Controlled Neural Sentence-Level Reframing of News Articles. In Moens MF, Huang X, Specia L, Yih SWT, editors, Findings of the Association for Computational Linguistics: EMNLP 2021. Punta Cana. 2021. p. 2683-2693. doi: 10.18653/v1/2021.findings-emnlp.228
Chen, Wei Fan ; Al-Khatib, Khalid ; Stein, Benno et al. / Controlled Neural Sentence-Level Reframing of News Articles. Findings of the Association for Computational Linguistics: EMNLP 2021. editor / Marie-Francine Moens ; Xuanjing Huang ; Lucia Specia ; Scott Wen-Tau Yih. Punta Cana, 2021. pp. 2683-2693
@inproceedings{e253a64b1c5248a0ba4579574cc85ee3,
title = "Controlled Neural Sentence-Level Reframing of News Articles",
abstract = "Framing a news article means to portray the reported event from a specific perspective, e.g., from an economic or a health perspective. Reframing means to change this perspective. Depending on the audience or the submessage, reframing can become necessary to achieve the desired effect on the readers. Reframing is related to adapting style and sentiment, which can be tackled with neural text generation techniques. However, it is more challenging since changing a frame requires rewriting entire sentences rather than single phrases. In this paper, we study how to computationally reframe sentences in news articles while maintaining their coherence to the context. We treat reframing as a sentence-level fill-in-the-blank task for which we train neural models on an existing media frame corpus. To guide the training, we propose three strategies: framed-language pretraining, named-entity preservation, and adversarial learning. We evaluate respective models automatically and manually for topic consistency, coherence, and successful reframing. Our results indicate that generating properly framed text works well but with tradeoffs.",
author = "Chen, {Wei Fan} and Khalid Al-Khatib and Benno Stein and Henning Wachsmuth",
note = "Funding Information: This work was partially supported by the German Research Foundation (DFG) within the Collaborative Research Center {"}On-The-Fly Computing{"} (SFB 901/3) under the project number 160364472.; 2021 Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021 ; Conference date: 07-11-2021 Through 11-11-2021",
year = "2021",
month = nov,
doi = "10.18653/v1/2021.findings-emnlp.228",
language = "English",
isbn = "9781955917100",
pages = "2683--2693",
editor = "Marie-Francine Moens and Xuanjing Huang and Lucia Specia and Yih, {Scott Wen-Tau}",
booktitle = "Findings of the Association for Computational Linguistics",

}


TY - GEN

T1 - Controlled Neural Sentence-Level Reframing of News Articles

AU - Chen, Wei Fan

AU - Al-Khatib, Khalid

AU - Stein, Benno

AU - Wachsmuth, Henning

N1 - Funding Information: This work was partially supported by the German Research Foundation (DFG) within the Collaborative Research Center "On-The-Fly Computing" (SFB 901/3) under the project number 160364472.

PY - 2021/11

Y1 - 2021/11

N2 - Framing a news article means to portray the reported event from a specific perspective, e.g., from an economic or a health perspective. Reframing means to change this perspective. Depending on the audience or the submessage, reframing can become necessary to achieve the desired effect on the readers. Reframing is related to adapting style and sentiment, which can be tackled with neural text generation techniques. However, it is more challenging since changing a frame requires rewriting entire sentences rather than single phrases. In this paper, we study how to computationally reframe sentences in news articles while maintaining their coherence to the context. We treat reframing as a sentence-level fill-in-the-blank task for which we train neural models on an existing media frame corpus. To guide the training, we propose three strategies: framed-language pretraining, named-entity preservation, and adversarial learning. We evaluate respective models automatically and manually for topic consistency, coherence, and successful reframing. Our results indicate that generating properly framed text works well but with tradeoffs.

AB - Framing a news article means to portray the reported event from a specific perspective, e.g., from an economic or a health perspective. Reframing means to change this perspective. Depending on the audience or the submessage, reframing can become necessary to achieve the desired effect on the readers. Reframing is related to adapting style and sentiment, which can be tackled with neural text generation techniques. However, it is more challenging since changing a frame requires rewriting entire sentences rather than single phrases. In this paper, we study how to computationally reframe sentences in news articles while maintaining their coherence to the context. We treat reframing as a sentence-level fill-in-the-blank task for which we train neural models on an existing media frame corpus. To guide the training, we propose three strategies: framed-language pretraining, named-entity preservation, and adversarial learning. We evaluate respective models automatically and manually for topic consistency, coherence, and successful reframing. Our results indicate that generating properly framed text works well but with tradeoffs.

UR - http://www.scopus.com/inward/record.url?scp=85129203764&partnerID=8YFLogxK

U2 - 10.18653/v1/2021.findings-emnlp.228

DO - 10.18653/v1/2021.findings-emnlp.228

M3 - Conference contribution

AN - SCOPUS:85129203764

SN - 9781955917100

SP - 2683

EP - 2693

BT - Findings of the Association for Computational Linguistics

A2 - Moens, Marie-Francine

A2 - Huang, Xuanjing

A2 - Specia, Lucia

A2 - Yih, Scott Wen-Tau

CY - Punta Cana

T2 - 2021 Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021

Y2 - 7 November 2021 through 11 November 2021

ER -
