Exploring Personalized Actionable Explanations for Contestability

Mentor: Mireia Yurrita Semperena, m.yurritasemperena@tudelft.nl

Background: Human-AI decision making is becoming increasingly common, and explanations have been proposed to make algorithmic decisions understandable. Explanations are also central to decision subjects’ ability to exercise their right to contest algorithmic decisions. So far, most research in Explainable AI has proposed one-size-fits-all technical explanations that are limited to algorithmic outcomes. Decision subjects, however, might want to contest not only the algorithmic decision but also more fundamental issues, such as the goal of the system or the use of automation itself. Contestability therefore requires explanations that go beyond algorithmic outcomes and also capture the rationales that led to the development and deployment of the algorithmic system.

Goal: This graduation project aims to explore how to design personalized actionable explanations for contestability. Personalization tailors explanations to decision subjects’ AI literacy levels. Actionability ensures that decision subjects can act on such explanations. This might involve designing explanations with varying levels of detail, modality (e.g., audio vs. visual), or paradigm (e.g., textual, graphical, interactive) and evaluating their effectiveness in enabling contestability.
 
Desired Skills: Basic understanding of Machine Learning (for instance, having taken the Machine Learning for Designers course). Illustration skills. Basic coding skills. Willingness to conduct qualitative and/or quantitative studies to evaluate the designs.