Details
| Original language | English |
|---|---|
| Title of host publication | 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW) |
| Editors | Kurt Schneider, Fabiano Dalpiaz, Jennifer Horkoff |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 122-128 |
| Number of pages | 7 |
| ISBN (electronic) | 979-8-3503-2691-8 |
| ISBN (print) | 979-8-3503-2692-5 |
| Publication status | Published - 2023 |
| Event | 31st IEEE International Requirements Engineering Conference Workshops, REW 2023 - Hannover, Germany. Duration: 4 Sept 2023 → 5 Sept 2023. Conference number: 31 |
Publication series

| Name | IEEE International Requirements Engineering Conference Workshops (REW) |
|---|---|
| ISSN (print) | 2770-6826 |
| ISSN (electronic) | 2770-6834 |
Abstract
Crowd-based Requirements Engineering supports capturing large amounts of user feedback in order to understand what users think about a piece of software and which changes they are interested in. While much progress has been made in automatically classifying this feedback, it is less clear which conclusions can be drawn from it. Studies on the utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants in which we asked users to give feedback on an app and also to describe their comprehension of this app as an approximation of their opinion about the app as a whole. We compare the feedback and the comprehension descriptions by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users, individually and as a group, know more about the app (as mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.
Keywords
- Prioritization, Software features, User comprehension, User feedback
ASJC Scopus subject areas
- Business, Management and Accounting(all)
- Organizational Behavior and Human Resource Management
- Computer Science(all)
- Software
- Engineering(all)
- Safety, Risk, Reliability and Quality
- Psychology(all)
- Developmental and Educational Psychology
- Social Sciences(all)
- Education
Cite this
2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). ed. / Kurt Schneider; Fabiano Dalpiaz; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. p. 122-128 (IEEE International Requirements Engineering Conference Workshops (REW)).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - What Can be Concluded from User Feedback? - An Empirical Study
AU - Anders, Michael
AU - Obaidi, Martin
AU - Specht, Alexander
AU - Paech, Barbara
N1 - Conference code: 31
PY - 2023
Y1 - 2023
N2 - Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.
AB - Crowd-based Requirements Engineering supports capturing high amounts of user feedback in order to understand what users think about a software and which changes they are interested in. While much progress has been made in automatically classifying the feedback, it is less clear which conclusions can be drawn from feedback. Studies on utilization of feedback in industry report that companies are afraid that user feedback might be biased, because little is known about the users. In this paper, we report a preliminary empirical study with 100 participants where we asked users to give feedback on an app and also describe their comprehension of this app as an approximation of their opinion about the app as whole. We compare feedback and comprehension description by looking at the frequencies of mentioned features. We find that the feedback of frequent users (46 out of 100) differs only a little from the overall feedback. We confirm that feedback is biased in that users individually and as a group know more about the app (mentioned in their comprehension) than they tell in their feedback. However, the feedback does represent which features the users find important in their comprehension.
KW - Prioritization
KW - Software features
KW - User comprehension
KW - User feedback
UR - http://www.scopus.com/inward/record.url?scp=85174708604&partnerID=8YFLogxK
U2 - 10.1109/REW57809.2023.00027
DO - 10.1109/REW57809.2023.00027
M3 - Conference contribution
AN - SCOPUS:85174708604
SN - 979-8-3503-2692-5
T3 - IEEE International Requirements Engineering Conference Workshops (REW)
SP - 122
EP - 128
BT - 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)
A2 - Schneider, Kurt
A2 - Dalpiaz, Fabiano
A2 - Horkoff, Jennifer
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 31st IEEE International Requirements Engineering Conference Workshops, REW 2023
Y2 - 4 September 2023 through 5 September 2023
ER -