Human-Computer Interaction, Tangible and Gestural Interaction
Thanks to the opportunity to be associated with the RUD project consortium, I wished to pursue the exploration of the concept of semi-automatic interaction, introduced by P. Reuter et al. in 2010. The idea is to go beyond treating AI as a black box and accepting its output as is. We will, on the contrary, refine and adjust the output together with the user. I proposed a research subject, conducted by Dr. Samory Houzangbe, that explores an approach combining AI algorithms with user experience design methods. This approach, called Human-Centered AI (HCAI), a concept introduced by Ben Shneiderman in 2021 at the ACM IUI conference, combines Artificial Intelligence (AI) algorithms with human-centered thinking. More specifically, we will explore a subdomain of HCAI called interactive machine learning (IML). We want to refine the AI model through iterative cycles, based on experiments with the RUD industrial partners' use cases.
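To make the IML idea concrete, here is a minimal sketch, in Python, of the kind of iterative refinement cycle described above: the model proposes an output, the user confirms or corrects it, and each correction immediately updates the model. All names here (`CentroidModel`, `iml_session`, `ask_user`) are illustrative assumptions, not part of the RUD project's actual implementation.

```python
class CentroidModel:
    """Toy nearest-centroid classifier updated online from user feedback."""

    def __init__(self):
        self.sums = {}    # label -> running sum of feature vectors
        self.counts = {}  # label -> number of examples seen for that label

    def predict(self, x):
        """Return the label of the nearest centroid, or None if untrained."""
        if not self.sums:
            return None

        def sq_dist(label):
            centroid = [s / self.counts[label] for s in self.sums[label]]
            return sum((a - b) ** 2 for a, b in zip(x, centroid))

        return min(self.sums, key=sq_dist)

    def update(self, x, label):
        """Incorporate one user-labeled example into the running centroids."""
        if label not in self.sums:
            self.sums[label] = [0.0] * len(x)
            self.counts[label] = 0
        self.sums[label] = [s + v for s, v in zip(self.sums[label], x)]
        self.counts[label] += 1


def iml_session(model, stream, ask_user):
    """One refinement cycle: predict, let the user confirm or correct, learn."""
    for x in stream:
        guess = model.predict(x)
        label = ask_user(x, guess)  # the user confirms or overrides the guess
        model.update(x, label)
    return model
```

In a real deployment, `ask_user` would be an interactive UI prompt; here it can be simulated with any callable that returns a label, which is what makes the loop testable.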