Metaphor as an Explanation Tool
Using natural language processing, this project investigates how metaphors function within explanations, contributing to the development of future explainable AI systems.
In this project, we use natural language processing to study how explainers and explainees, through their choice of metaphors, focus attention on some aspects of the explanandum and draw it away from others. In particular, the project examines the metaphorical space established by different metaphors for one and the same concept. We seek to understand how metaphors foster (and impede) understanding through highlighting and hiding. Moreover, we aim to establish when and how metaphors are used and adapted in explanatory dialogues, and how the explainee, the explainer, and the topical domain of the explanandum shape this process. By clarifying how metaphorical explanations function and how metaphor use responds to and changes contextual factors, we will contribute to the development of co-constructive explaining AI systems.
DFG, Transregional Collaborative Research Center TRR 318
- Paderborn University (project coordinator)