What Can be Concluded from User Feedback? - An Empirical Study

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Michael Anders
  • Martin Obaidi
  • Alexander Specht
  • Barbara Paech

External Research Organisations

  • Heidelberg University

Details

Original language: English
Title of host publication: 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)
Editors: Kurt Schneider, Fabiano Dalpiaz, Jennifer Horkoff
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 122-128
Number of pages: 7
ISBN (electronic): 9798350326918
ISBN (print): 979-8-3503-2692-5
Publication status: Published - 2023
Event: 31st IEEE International Requirements Engineering Conference Workshops, REW 2023 - Hannover, Germany
Duration: 4 Sept 2023 - 5 Sept 2023
Conference number: 31

Publication series

Name: IEEE International Requirements Engineering Conference Workshops (REW)
ISSN (print): 2770-6826
ISSN (electronic): 2770-6834

Abstract

Crowd-based Requirements Engineering supports capturing large amounts of user feedback in order to understand what users think about a software product and which changes they are interested in. While much progress has been made in automatically classifying this feedback, it is less clear which conclusions can be drawn from it. Studies on the utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants in which we asked users to give feedback on an app and also to describe their comprehension of this app as an approximation of their opinion of the app as a whole. We compare the feedback and the comprehension descriptions by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only slightly from the overall feedback. We confirm that feedback is biased in that users, individually and as a group, know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.

Keywords

    Prioritization, Software features, User comprehension, User feedback

Cite this

What Can be Concluded from User Feedback? - An Empirical Study. / Anders, Michael; Obaidi, Martin; Specht, Alexander et al.
2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). ed. / Kurt Schneider; Fabiano Dalpiaz; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. p. 122-128 (IEEE International Requirements Engineering Conference Workshops (REW)).

Research output: Chapter in book/report/conference proceedingConference contributionResearchpeer review

Anders, M, Obaidi, M, Specht, A & Paech, B 2023, What Can be Concluded from User Feedback? - An Empirical Study. in K Schneider, F Dalpiaz & J Horkoff (eds), 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). IEEE International Requirements Engineering Conference Workshops (REW), Institute of Electrical and Electronics Engineers Inc., pp. 122-128, 31st IEEE International Requirements Engineering Conference Workshops, REW 2023, Hannover, Lower Saxony, Germany, 4 Sept 2023. https://doi.org/10.1109/REW57809.2023.00027
Anders, M., Obaidi, M., Specht, A., & Paech, B. (2023). What Can be Concluded from User Feedback? - An Empirical Study. In K. Schneider, F. Dalpiaz, & J. Horkoff (Eds.), 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW) (pp. 122-128). (IEEE International Requirements Engineering Conference Workshops (REW)). Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1109/REW57809.2023.00027
Anders M, Obaidi M, Specht A, Paech B. What Can be Concluded from User Feedback? - An Empirical Study. In Schneider K, Dalpiaz F, Horkoff J, editors, 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). Institute of Electrical and Electronics Engineers Inc. 2023. p. 122-128. (IEEE International Requirements Engineering Conference Workshops (REW)). doi: 10.1109/REW57809.2023.00027
Anders, Michael ; Obaidi, Martin ; Specht, Alexander et al. / What Can be Concluded from User Feedback? - An Empirical Study. 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). editor / Kurt Schneider ; Fabiano Dalpiaz ; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. pp. 122-128 (IEEE International Requirements Engineering Conference Workshops (REW)).
@inproceedings{83a1d4fca4904b30b828776da5155c95,
title = "What Can be Concluded from User Feedback? - An Empirical Study",
abstract = "Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.",
keywords = "Prioritization, Software features, User comprehension, User feedback",
author = "Michael Anders and Martin Obaidi and Alexander Specht and Barbara Paech",
year = "2023",
doi = "10.1109/REW57809.2023.00027",
language = "English",
isbn = "979-8-3503-2692-5",
series = "IEEE International Requirements Engineering Conference Workshops (REW)",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "122--128",
editor = "Kurt Schneider and Fabiano Dalpiaz and Jennifer Horkoff",
booktitle = "2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)",
address = "United States",
note = "31st IEEE International Requirements Engineering Conference Workshops, REW 2023 ; Conference date: 04-09-2023 Through 05-09-2023",

}

TY - GEN

T1 - What Can be Concluded from User Feedback? - An Empirical Study

AU - Anders, Michael

AU - Obaidi, Martin

AU - Specht, Alexander

AU - Paech, Barbara

N1 - Conference code: 31

PY - 2023

Y1 - 2023

N2 - Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.

AB - Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.

KW - Prioritization

KW - Software features

KW - User comprehension

KW - User feedback

UR - http://www.scopus.com/inward/record.url?scp=85174708604&partnerID=8YFLogxK

U2 - 10.1109/REW57809.2023.00027

DO - 10.1109/REW57809.2023.00027

M3 - Conference contribution

AN - SCOPUS:85174708604

SN - 979-8-3503-2692-5

T3 - IEEE International Requirements Engineering Conference Workshops (REW)

SP - 122

EP - 128

BT - 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)

A2 - Schneider, Kurt

A2 - Dalpiaz, Fabiano

A2 - Horkoff, Jennifer

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 31st IEEE International Requirements Engineering Conference Workshops, REW 2023

Y2 - 4 September 2023 through 5 September 2023

ER -
