How Can We Develop Explainable Systems? Insights from a Literature Review and an Interview Study
Quality aspects such as ethics, fairness, and transparency have been shown to be essential for trustworthy software systems. Explainability has been identified not only as a means to achieve these three aspects in systems, but also as a way to foster users' sentiments of trust. Despite this, research has only marginally focused on the activities and practices needed to develop explainable systems. To close this gap, we recommend six core activities and associated practices for the development of explainable systems, based on the results of a literature review and an interview study. First, we identified and summarized activities and corresponding practices in the literature. To complement these findings, we conducted interviews with 19 industry professionals who provided recommendations for the development process of explainable systems and reviewed the activities and practices based on their expertise and knowledge. We compared and combined the findings of the interviews and the literature review to recommend the activities and assess their applicability in industry. Our findings demonstrate that the activities and practices are not only feasible, but can also be integrated into different development processes.