The "Bad Nudge - Bad Robot?" project
Connected objects, and specifically conversational agents like Google Home, bring a new dimension to interaction, namely speech, and could become a means of influencing individuals.
For the moment, they are neither regulated nor evaluated, and they remain highly opaque. Building on the study of "nudges", techniques for modifying people's behavior, Laurence Devillers' team "Affective and social dimensions in spoken interactions" (with Ioana Vasilescu and Gilles Adda, LIMSI - CNRS) and the "Digital economy" team of the RITM laboratory (Grazia Cecere, Fabrice Le Guel and Serge Pajak, Université Paris-Sud) decided to collaborate to highlight the importance of ethics in the design of these objects.
Nudging: a new concept
In 2008, Richard Thaler, who would go on to win the Nobel Prize in Economics, brought to light the concept of nudging, a technique that consists of encouraging individuals to change their behavior without coercion by exploiting their cognitive biases. Laurence Devillers, who works on nudging at the international level within IEEE, the world's largest technical professional organization dedicated to advancing technology for the benefit of humanity, and Serge Pajak, an economist specializing in the economics of innovation, began working together on the development of the Transalgo platform before deciding to set up this project dedicated to the study of nudges in human-machine spoken interaction. "In behavioral economics, the vocal dimension of interactions has never been studied before," the team of economists points out.
Raising awareness of the dangers of nudges
With the development of connected objects, nudges are everywhere, often without the user even realizing it. Apple's iMessage application, for example, displays a message in blue when it is exchanged with another Apple user, and in green when it is exchanged with a "foreign" user. "In behavioral economics, we know very well that blue is a much more pleasant color than green," explains the team of economists. "While nudges are often used to good effect, in health for example, voice-based objects could amplify these manipulative phenomena if they are used for commercial purposes with less regard for ethics," states Laurence Devillers, who has conducted numerous voice-interaction experiments between elderly people and empathetic robots in the PSPC ROMEO2 and CHIST-ERA JOKER projects. The "Bad Nudge - Bad Robot?" project aims to highlight the danger that these techniques can represent for vulnerable people such as children and the elderly.
The importance of ethics
In concrete terms, the team will set up experiments in the form of vocal interactions with a robot capable of nudging several types of more or less vulnerable populations, in order to develop nudge evaluation tools that show their impact. First in the laboratory and then in the field, the two teams will study whether fragile people are more sensitive to nudges. This is an innovative line of research: it is important to understand the impact of these new tools on society and to take the subject of ethics and manipulation by machines to the international level. "Objects will talk to us. We need to better understand our relationship with these chatty objects, which have no conscience, no emotions and no intentions of their own. Today's users are unaware of how these systems work and tend to anthropomorphize them. To avoid this confusion between living beings and artifacts, designers need to be more transparent and explain the capabilities of machines," explains Laurence Devillers. The ultimate aim of the project is to create "ethics-by-design" objects and to launch an international dynamic on this subject. For this research project, two theses, one in computer science and the other in economics, will be supervised in parallel, on the one hand by Laurence Devillers, Ioana Vasilescu and Gilles Adda, and on the other by Grazia Cecere, Fabrice Le Guel and Serge Pajak.
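By way of illustration only, and not as a description of the project's actual methodology, a nudge evaluation tool could start from something as simple as comparing how often participants follow the agent's suggestion with and without a nudge, for each population group. The sketch below assumes hypothetical group names, participant counts and a two-proportion z-test; none of these come from the project.

from statistics import NormalDist

def nudge_effect(compliant_nudge, total_nudge, compliant_control, total_control):
    """Compare compliance rates between a nudged group and a control group.

    Returns the difference in compliance rates and a two-sided p-value
    from a two-proportion z-test (illustrative analysis only).
    """
    p1 = compliant_nudge / total_nudge
    p2 = compliant_control / total_control
    pooled = (compliant_nudge + compliant_control) / (total_nudge + total_control)
    se = (pooled * (1 - pooled) * (1 / total_nudge + 1 / total_control)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p1 - p2, p_value

# Hypothetical counts: how many participants followed the agent's suggestion
# in each (population group, condition) cell.
for group, cells in {
    "adults":  {"nudge": (34, 60), "control": (25, 60)},
    "elderly": {"nudge": (48, 60), "control": (26, 60)},
}.items():
    effect, p = nudge_effect(*cells["nudge"], *cells["control"])
    print(f"{group}: compliance lift = {effect:+.2f}, p = {p:.3f}")

A larger lift among one group than another would be one concrete way of quantifying whether a population is more sensitive to a given nudge.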
The results of this project will be used to produce measures for monitoring these tools, as well as economic (regulatory), ethical and legal recommendations for public decision-makers. Nudges have not yet been the subject of any cross-cutting legal analysis; Alexandra Bensamoun and Julie Groffe from CERDI will provide their expertise on the legal aspects. Testing users' capacity for empowerment also seems fundamental at a time when surveillance and regulatory authorities will not have sufficient means to ensure the ethical behavior of the many connected objects and robots arriving in the home. In particular, we need to think about how to protect the most vulnerable populations while ensuring the economic development of the ICT sector in Europe. According to Laurence Devillers, "It is urgent to start working on these ethical issues, and DATAIA is an institute of excellence, the first in France to bring them to the fore."
The "Bad Nudge - Bad Robot?" project is part of the broader context of the AI HUMAAINE Chair, headed by Laurence Devillers, which started in September 2020.
Contacts: Laurence Devillers | Serge Pajak