Project
Mixed-initiative explanation methods: towards the next generation of interactive machine learning steered with rich feedback of non-expert users
The overall objective of this project is to enable non-expert users to interact with ML models, both to improve the accuracy of those models and to increase user trust. As individual users have different needs, the long-term goal of our research is to personalise mixed-initiative explanation interfaces to the specific needs, characteristics, and context of each user.
Date: 6 Sep 2021 → Today
Keywords: Explainability, Machine learning, Artificial intelligence, Mixed-initiative, Interactive machine learning
Disciplines: Machine learning and decision making, Human-computer interaction
Project type: PhD project