What Can be Concluded from User Feedback? - An Empirical Study

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

  • Michael Anders
  • Martin Obaidi
  • Alexander Specht
  • Barbara Paech

Organisational units

External organisations

  • Ruprecht-Karls-Universität Heidelberg

Details

Original language: English
Title of host publication: 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)
Editors: Kurt Schneider, Fabiano Dalpiaz, Jennifer Horkoff
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 122-128
Number of pages: 7
ISBN (electronic): 9798350326918
ISBN (print): 979-8-3503-2692-5
Publication status: Published - 2023
Event: 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW), Hannover, Germany
Duration: 4 Sept 2023 - 5 Sept 2023
Conference number: 31

Publication series

Name: IEEE International Requirements Engineering Conference Workshops (REW)
ISSN (print): 2770-6826
ISSN (electronic): 2770-6834

Abstract

Crowd-based Requirements Engineering supports capturing large amounts of user feedback in order to understand what users think about a software product and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from it. Studies on the utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants in which we asked users to give feedback on an app and also to describe their comprehension of this app as an approximation of their opinion about the app as a whole. We compare feedback and comprehension descriptions by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users, individually and as a group, know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.
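The comparison described in the abstract lends itself to a simple frequency analysis. The following Python sketch is purely illustrative and is not the paper's actual analysis: it assumes that feedback statements and comprehension descriptions have already been coded into feature labels, and the feature names and counts are invented.

# Illustrative sketch: compare how often features are mentioned in user
# feedback vs. in users' comprehension descriptions. Assumes statements
# were already manually coded into feature labels; data below is made up.
from collections import Counter

feedback_mentions = ["search", "search", "offline mode", "sync"]            # coded feedback statements
comprehension_mentions = ["search", "sync", "sync", "offline mode", "chat"]  # coded comprehension statements

def relative_frequencies(mentions):
    # Turn a list of coded feature labels into relative mention frequencies.
    counts = Counter(mentions)
    total = sum(counts.values())
    return {feature: n / total for feature, n in counts.items()}

fb = relative_frequencies(feedback_mentions)
co = relative_frequencies(comprehension_mentions)

# Features users know about (mentioned in comprehension) but never raise in
# feedback hint at the kind of bias the study examines.
only_known = set(co) - set(fb)

for feature in sorted(set(fb) | set(co)):
    print(f"{feature:12s} feedback={fb.get(feature, 0):.2f} comprehension={co.get(feature, 0):.2f}")
print("Mentioned in comprehension only:", only_known)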

ASJC Scopus subject areas

Cite

What Can be Concluded from User Feedback? - An Empirical Study. / Anders, Michael; Obaidi, Martin; Specht, Alexander et al.
2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). Ed. / Kurt Schneider; Fabiano Dalpiaz; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. pp. 122-128 (IEEE International Requirements Engineering Conference Workshops (REW)).


Anders, M, Obaidi, M, Specht, A & Paech, B 2023, What Can be Concluded from User Feedback? - An Empirical Study. in K Schneider, F Dalpiaz & J Horkoff (eds), 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). IEEE International Requirements Engineering Conference Workshops (REW), Institute of Electrical and Electronics Engineers Inc., pp. 122-128, 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW), Hannover, Lower Saxony, Germany, 4 Sept 2023. https://doi.org/10.1109/REW57809.2023.00027
Anders, M., Obaidi, M., Specht, A., & Paech, B. (2023). What Can be Concluded from User Feedback? - An Empirical Study. In K. Schneider, F. Dalpiaz, & J. Horkoff (Eds.), 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW) (pp. 122-128). (IEEE International Requirements Engineering Conference Workshops (REW)). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/REW57809.2023.00027
Anders M, Obaidi M, Specht A, Paech B. What Can be Concluded from User Feedback? - An Empirical Study. In Schneider K, Dalpiaz F, Horkoff J, editors, 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). Institute of Electrical and Electronics Engineers Inc. 2023. p. 122-128. (IEEE International Requirements Engineering Conference Workshops (REW)). doi: 10.1109/REW57809.2023.00027
Anders, Michael ; Obaidi, Martin ; Specht, Alexander et al. / What Can be Concluded from User Feedback? - An Empirical Study. 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). Ed. / Kurt Schneider ; Fabiano Dalpiaz ; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. pp. 122-128 (IEEE International Requirements Engineering Conference Workshops (REW)).
BibTeX
@inproceedings{83a1d4fca4904b30b828776da5155c95,
title = "What Can be Concluded from User Feedback? - An Empirical Study",
abstract = "Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.",
keywords = "Prioritization, Software features, User comprehension, User feedback",
author = "Michael Anders and Martin Obaidi and Alexander Specht and Barbara Paech",
year = "2023",
doi = "10.1109/REW57809.2023.00027",
language = "English",
isbn = "979-8-3503-2692-5",
series = "IEEE International Requirements Engineering Conference Workshops (REW)",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "122--128",
editor = "Kurt Schneider and Fabiano Dalpiaz and Jennifer Horkoff",
booktitle = "2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)",
address = "United States",
note = "31st IEEE International Requirements Engineering Conference Workshops, REW 2023 ; Conference date: 04-09-2023 Through 05-09-2023",

}

RIS

TY - GEN

T1 - What Can be Concluded from User Feedback? - An Empirical Study

AU - Anders, Michael

AU - Obaidi, Martin

AU - Specht, Alexander

AU - Paech, Barbara

N1 - Conference code: 31

PY - 2023

Y1 - 2023

N2 - Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.

AB - Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.

KW - Prioritization

KW - Software features

KW - User comprehension

KW - User feedback

UR - http://www.scopus.com/inward/record.url?scp=85174708604&partnerID=8YFLogxK

U2 - 10.1109/REW57809.2023.00027

DO - 10.1109/REW57809.2023.00027

M3 - Conference contribution

AN - SCOPUS:85174708604

SN - 979-8-3503-2692-5

T3 - IEEE International Requirements Engineering Conference Workshops (REW)

SP - 122

EP - 128

BT - 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)

A2 - Schneider, Kurt

A2 - Dalpiaz, Fabiano

A2 - Horkoff, Jennifer

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 31st IEEE International Requirements Engineering Conference Workshops, REW 2023

Y2 - 4 September 2023 through 5 September 2023

ER -
