Details
| Original language | English |
|---|---|
| Article number | 107682 |
| Journal | Information and Software Technology |
| Volume | 181 |
| Early online date | 8 Feb 2025 |
| Publication status | E-pub ahead of print - 8 Feb 2025 |
Abstract
Context: As the complexity of modern software steadily grows, these systems become increasingly difficult for their stakeholders to understand. At the same time, opaque and artificially intelligent systems permeate a growing number of safety-critical areas, such as medicine and finance. As a result, explainability is becoming more important as a software quality aspect and non-functional requirement.
Objective: Contemporary research has mainly focused on making artificial intelligence and its decision-making processes more understandable. However, explainability has also gained traction in recent requirements engineering research. This work aims to contribute to that body of research by providing a quality model for explainability as a software quality aspect. Quality models provide means and measures to specify and evaluate quality requirements.
Method: To design a user-centered quality model for explainability, we conducted a literature review.
Results: We identified ten fundamental aspects of explainability. Furthermore, we aggregated criteria and metrics to measure them, as well as alternative means of evaluation in the form of heuristics and questionnaires.
Conclusion: Our quality model and the related means of evaluation enable software engineers to develop and validate explainable systems in accordance with their explainability goals and intentions. This is achieved by offering views of the fundamental aspects of explainability and the related development goals from different angles. Thus, we provide a foundation that improves the management and verification of explainability requirements.
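For illustration only, the sketch below shows one way an aspects → criteria → metrics decomposition with questionnaire-based evaluation, as described in the abstract, could be represented in code. The aspect, criterion, and helper names (transparency, decision visibility, likert_mean) are hypothetical assumptions and are not taken from the article.

```python
# Illustrative sketch only: a generic aspects -> criteria -> metrics structure
# loosely following the decomposition described in the abstract. All names and
# values below (e.g. "transparency", the Likert responses) are hypothetical
# placeholders, not content from the paper.
from dataclasses import dataclass, field
from statistics import mean
from typing import Callable


@dataclass
class Metric:
    name: str
    evaluate: Callable[[dict], float]  # maps raw evaluation data to a score in [0, 1]


@dataclass
class Criterion:
    name: str
    metrics: list[Metric] = field(default_factory=list)

    def score(self, data: dict) -> float:
        # Average the scores of all metrics attached to this criterion.
        return mean(m.evaluate(data) for m in self.metrics)


@dataclass
class Aspect:
    name: str
    criteria: list[Criterion] = field(default_factory=list)

    def score(self, data: dict) -> float:
        # Aggregate criterion scores into an aspect-level score.
        return mean(c.score(data) for c in self.criteria)


def likert_mean(responses: list[int], scale_max: int = 5) -> float:
    """Normalize questionnaire responses (1..scale_max) to [0, 1]."""
    return (mean(responses) - 1) / (scale_max - 1)


# Hypothetical example: one aspect measured via a questionnaire-based metric.
transparency = Aspect(
    name="transparency",  # placeholder aspect name
    criteria=[
        Criterion(
            name="decision visibility",
            metrics=[Metric("questionnaire score",
                            lambda d: likert_mean(d["responses"]))],
        )
    ],
)

print(transparency.score({"responses": [4, 5, 3, 4]}))  # 0.75
```

Such a structure lets questionnaire results, heuristic ratings, or metric values all feed into the same aggregation, which is one plausible way to operationalize a quality model; the paper itself defines the concrete aspects, criteria, and metrics.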
Keywords
- Explainability, Heuristics, Literature studies, Metrics, Quality models, Requirements engineering
ASJC Scopus subject areas
- Computer Science (all)
- Software
- Information Systems
- Computer Science Applications
Cite this
In: Information and Software Technology, Vol. 181, 107682, 05.2025.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Exploring the means to measure explainability
T2 - Metrics, heuristics and questionnaires
AU - Deters, Hannah
AU - Droste, Jakob
AU - Obaidi, Martin
AU - Schneider, Kurt
N1 - Publisher Copyright: © 2025
PY - 2025/2/8
Y1 - 2025/2/8
KW - Explainability
KW - Heuristics
KW - Literature studies
KW - Metrics
KW - Quality models
KW - Requirements engineering
UR - http://www.scopus.com/inward/record.url?scp=85217789947&partnerID=8YFLogxK
U2 - 10.1016/j.infsof.2025.107682
DO - 10.1016/j.infsof.2025.107682
M3 - Article
AN - SCOPUS:85217789947
VL - 181
JO - Information and Software Technology
JF - Information and Software Technology
SN - 0950-5849
M1 - 107682
ER -