
The «Bad Nudge - Bad Robot?» project

  • The project
  • AI Chair HUMAAINE context
  • The team
  • Contacts
Connected objects, and specifically conversational agents such as Google Home, bring a new dimension to interaction, namely speech, and could become a means of influencing individuals. They are currently neither regulated nor evaluated, and remain very opaque.
The project

Building on the study of "nudges", techniques designed to modify people's behaviour, Laurence Devillers' team "Emotional and social dimensions in spoken interactions" (with Ioana Vasilescu and Gilles Adda, LIMSI - CNRS) and the "Digital Economy" team of the RITM laboratory (Grazia Cecere, Fabrice Le Guel and Serge Pajak, Université Paris-Sud) decided to collaborate to highlight the importance of ethics in the creation of these objects.

 

Nudging: a new concept 

Nobel Prize winner in economics, the American Richard Thaler highlighted the concept of the nudge in 2008: a technique that consists of encouraging individuals to change their behaviour, without coercion, by exploiting their cognitive biases. Laurence Devillers, who works on nudging at the international level within the IEEE, the world's largest professional association, whose main objective is to develop technology for the benefit of humanity, and Serge Pajak, an economist specializing in the economics of innovation, began working together to develop the Transalgo platform before deciding to set up this project dedicated to the study of nudges in human-machine verbal interaction. "In behavioural economics, the voice dimension of interactions has never been studied before," says the team of economists.

 

Raising awareness of the danger of nudges

With the development of connected objects, nudges are everywhere without the user noticing. Apple's iMessage application, for example, frames a message in blue when it is exchanged with another Apple user, and in green when it is exchanged with a "foreign" user. "In behavioural economics, we know very well that blue is a much more pleasant colour than green," explains the team of economists. "While nudges are often used for health purposes, for example, voice-based connected objects could amplify these manipulative phenomena if they are used for commercial purposes, with less ethical care," says Laurence Devillers, who has conducted numerous experiments on voice interactions between elderly people and empathetic robots in the PSPC ROMEO2 and CHIST-ERA JOKER projects. The «Bad Nudge - Bad Robot?» project aims to highlight the danger that these techniques can represent for vulnerable people such as children or the elderly.

 

The importance of ethics

In concrete terms, the team will set up experiments in the form of vocal interactions between a nudge-capable robot and several more or less vulnerable populations, in order to develop tools that evaluate nudges and demonstrate their impact. First at the scale of their laboratory, then in the field, the two teams will study whether fragile people are more sensitive to nudges.
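As an illustration only, and not a description of the project's actual protocol or data, here is a minimal sketch of one way such an evaluation tool could quantify nudge impact: comparing the rate at which two groups accept the robot's nudged suggestion with a simple two-proportion z-test. All group names and figures below are hypothetical.

```python
# Hypothetical sketch: compare nudge acceptance rates between two groups
# (e.g. a vulnerable group vs. a control group) with a two-proportion z-test.
from math import sqrt, erf

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Return (z, two-sided p-value) for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)          # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # normal approximation
    return z, p_value

# Illustrative numbers only: 62/80 participants in group A vs. 45/80 in group B
# accepted the option the robot nudged them towards.
z, p = two_proportion_ztest(62, 80, 45, 80)
print(f"z = {z:.2f}, p = {p:.4f}")
```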

This research axis is innovative: it is important to understand the impact of these new tools on society and to bring the topic of ethics and manipulation by machines to the international stage. "These objects will address us by talking to us. We need to better understand our relationship to these talkative objects, which have no consciousness, emotions or intentions of their own. Users today are not aware of how these systems work and tend to anthropomorphize them. To avoid this confusion between living beings and artifacts, designers must provide more transparency and explanation of machine capabilities," explains Laurence Devillers. The ultimate objective of the project is to create "ethics-by-design" objects and to build an international dynamic around this subject. For this research project, two theses, one in computer science and the other in economics, will be supervised in parallel: the first by Laurence Devillers, Ioana Vasilescu and Gilles Adda, the second by Grazia Cecere, Fabrice Le Guel and Serge Pajak.
 
The evaluation of the project's experimental results is intended to produce measures for monitoring these tools, as well as economic (regulatory), ethical and legal recommendations for public decision-makers. Nudges have not yet been the subject of any cross-cutting legal analysis; Alexandra Bensamoun and Julie Groffe from CERDI will provide their expertise on legal aspects. Testing users' capacity for empowerment (encapacitation) also seems fundamental at a time when supervisory and regulatory authorities will not have sufficient resources to ensure the ethical behaviour of the many connected objects and robots arriving in homes. It is also particularly important to think about ways of protecting the most vulnerable populations while supporting the economic development of the ICT sector in Europe.

According to Laurence Devillers, "it is urgent to start working on these ethical issues; in this respect, DATAIA is an institute of excellence and the first in France to highlight them."

AI Chair HUMAAINE context

The «Bad Nudge - Bad Robot?» project is part of the «HUMAAINE» AI Chair, directed by Laurence Devillers, which will start in September 2020.

The team


Contacts