Team Projects

STIC-AMSUD Program | 2011 - 2012

Project leader: Sylvie Pesty

Recent studies in psychology and the neurosciences have addressed the importance of emotions and other affective states for several intelligent cognitive processes, such as memory, decision making and even learning. These results have drawn the attention of Computers in Education researchers to the importance of taking students' emotions into account in intelligent learning environments (ILEs). To express emotions, ILEs generally incorporate Embodied Conversational Agents (ECAs). ECAs are computer-generated characters that can exhibit many of the same properties as humans in face-to-face interaction, including the ability to produce and respond to verbal and nonverbal communication. Because they allow a more anthropomorphic interaction with the user, these agents have shown several benefits when included in ILEs, such as engaging students, focusing their attention on important aspects of learning, and demonstrating tasks. To be affect-aware, ECAs need mechanisms to infer and express believable emotions. Although some results have already been obtained, real interaction environments, such as learning systems, require combining different methods and techniques in order to obtain more powerful inference mechanisms and to generate more realistic verbal (text and speech) and non-verbal expression of emotions. This project aims to combine the expertise of the Brazilian, Argentine and French research teams in Artificial Intelligence applied to education and in Affective Computing, in order to create more affect-aware ECAs and to study their potential in real learning applications.

Finance ANR | 2008 - 2012

Project leader: Sylvie Pesty

This project concerns new-generation interaction systems in which the human user is at the core of the interaction. These systems are designed to be believable (i.e. not only trustworthy and honest, but also capable of giving an illusion of life). Several studies show that such systems can only be built by integrating advanced emotion processing, in order to endow the system with the capabilities to understand and adapt to the user's emotions, to reason about the user's emotions, to plan its actions while anticipating their effects on the user's emotions, and to express its own emotions. These are necessary prerequisites for a system capable of interacting with the user in a natural way. Emotions have been debated over the last twenty to thirty years in several disciplines, such as psychology, philosophy, cognitive science and economics. More recently, computer science has started to focus on emotions. Several computational models of the role of emotions in cognition and of the expression of affective content have been proposed (e.g. the Affective Computing project at MIT or Kansei Information Processing in Japan). This research has formed the basis for the development of several prototypes, such as Embodied Conversational Agents (ECAs), to be used in different services (e.g. game platforms, simulators, tutoring agents, robotic assistants, etc.).