Details
Original language | English |
---|---|
Title of host publication | 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW) |
Editors | Kurt Schneider, Fabiano Dalpiaz, Jennifer Horkoff |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 102-111 |
Number of pages | 10 |
ISBN (electronic) | 9798350326918 |
ISBN (print) | 979-8-3503-2692-5 |
Publication status | Published - 2023 |
Event | 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW) - Hannover, Germany Duration: 4 Sept 2023 → 5 Sept 2023 Conference number: 31 |
Publication series
Name | IEEE International Requirements Engineering Conference Workshops |
---|---|
ISSN (print) | 2770-6826 |
ISSN (electronic) | 2770-6834 |
Abstract
Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.
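For readers unfamiliar with the metric reported above: the 86% figure is a weighted F-score. A minimal sketch of how such a score is typically computed, assuming scikit-learn and using hypothetical labels rather than data from the paper:

```python
# Minimal sketch (assumption: scikit-learn; the labels below are
# hypothetical placeholders, not data from the paper).
from sklearn.metrics import f1_score

# Hypothetical gold labels and classifier predictions for five app reviews.
y_true = ["no_need", "need", "need", "no_need", "need"]
y_pred = ["no_need", "need", "no_need", "no_need", "need"]

# average="weighted" computes per-class F1 and averages it weighted by
# class support, which is what a "weighted F-score" refers to.
print(f1_score(y_true, y_pred, average="weighted"))
```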
ASJC Scopus subject areas
- Business, Management and Accounting (all)
- Organizational Behavior and Human Resource Management
- Computer Science (all)
- Software
- Engineering (all)
- Safety, Risk, Reliability and Quality
- Psychology (all)
- Developmental and Educational Psychology
- Social Sciences (all)
- Education
Cite this
2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). ed. / Kurt Schneider; Fabiano Dalpiaz; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. p. 102-111 (IEEE International Requirements Engineering Conference Workshops).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › Peer review
TY - GEN
T1 - Explanation Needs in App Reviews: Taxonomy and Automated Detection
AU - Unterbusch, Max
AU - Sadeghi, Mersedeh
AU - Fischbach, Jannik
AU - Obaidi, Martin
AU - Vogelsang, Andreas
N1 - Conference code: 31
PY - 2023
Y1 - 2023
N2 - Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.
AB - Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.
KW - Explainability
KW - NLP
KW - Requirements
UR - http://www.scopus.com/inward/record.url?scp=85174711553&partnerID=8YFLogxK
U2 - 10.48550/arXiv.2307.04367
DO - 10.48550/arXiv.2307.04367
M3 - Conference contribution
AN - SCOPUS:85174711553
SN - 979-8-3503-2692-5
T3 - IEEE International Requirements Engineering Conference Workshops
SP - 102
EP - 111
BT - 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)
A2 - Schneider, Kurt
A2 - Dalpiaz, Fabiano
A2 - Horkoff, Jennifer
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 31st IEEE International Requirements Engineering Conference Workshops, REW 2023
Y2 - 4 September 2023 through 5 September 2023
ER -