Authors: Lijie Guo, Elizabeth M Daly, Oznur Alkan, Massimiliano Mattetti, Owen Cornec
DOI:
Keywords:
Abstract: Machine learning technologies are increasingly being applied across many real-world domains. As autonomous machines and black-box algorithms begin making decisions previously entrusted to humans, there has been growing academic and public interest in providing explanations that allow users to understand the decision-making process of machine learning models. Beyond explanations, Interactive Machine Learning (IML) seeks to leverage user feedback to iterate on an ML solution, correcting errors and aligning its decisions with those of its users. Despite the rise of explainable AI (XAI) and IML research, the links between interactivity, explanations, and trust have not been comprehensively studied in the machine learning literature. Thus, in this study, we develop and evaluate an explanation-driven interactive machine learning (XIML) system with the Tic-Tac-Toe game …