Jaime Andres Rincon was born in Buga (Valle), Colombia, in 1978. He received the B.E. degree in Biomedical Engineering from the Universidad Manuela Beltrán, Colombia, in 2008, and the M.S. degree in Artificial Intelligence from the Universitat Politècnica de València. He is currently a Ph.D. student in Computer Science at the Universitat Politècnica de València. His research interests include multi-agent systems, robotics, and emotional agents.
Jaime Andres Rincon Arango (LAB 205)
Universitat Politècnica de València
Departamento de Sistemas Informáticos y Computación,
Camino de Vera s/n Valencia, CP 46022, Spain
jarincon@dsic.upv.es
In recent years, great advances in robotics and artificial intelligence have been made. These advances have enabled the creation of different applications intended to be integrated into our daily lives. To achieve this, research is needed so that robots and AI systems can understand and learn from us. Emotions are one of the elements that characterize human beings, and they govern a large part of our lives. This project aims to provide AI and robotic entities with emotional models so that they can learn and understand our emotions, not only at the individual level but also at the social level; that is, to be able to determine the social emotion of a human-agent society.
With the arrival of the new millennium, we as human beings have had to face changes in our environment and in the way we communicate. The latest communication devices carry sensors that allow us to monitor our physiological signals and assess our health condition. Thanks to devices such as smartwatches, we can access signals like oxygen saturation (SpO2) or heart rate. However, these devices typically lack sensors for signals such as the galvanic skin response (GSR), used to measure stress levels, or the electrocardiogram (ECG), which would allow us to detect mild heart problems such as arrhythmias or tachycardia. At the same time, these new devices keep us increasingly interconnected, and this connectivity opens the door to different applications, especially health applications. We can use the new communication networks to send large flows of data to servers for further analysis and thus create, for example, telemedicine applications that monitor patients from their homes.

Recently, the detection of emotions and emotional states has aroused great interest in the scientific community. Most people believe that diseases are uncontrollable and that emotional states do not affect our health. The truth is that the body responds with symptoms to most of what happens to us and, depending on how we react to adversity, we can accelerate or complicate our healing processes. Our body is completely interconnected, and poor emotional health can affect it, even weakening our immune system. Stress, anxiety, or worry affect the way we eat and sleep: with a poor diet, our body does not receive the nutrients it needs for its maintenance, and if we sleep badly, it does not rest enough.

The aim of this project is therefore to detect emotional states in order to improve people's quality of life. We intend to use captured biosignals such as ECG, GSR, photoplethysmogram (PPG), and temperature to classify the emotional states of the users.
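To make the idea concrete, the following is a minimal sketch of how biosignal streams could be reduced to features and mapped to an emotional state. It assumes a nearest-centroid rule over hypothetical, hand-picked centroids; the feature set, labels, and values are illustrative placeholders, not the classifier actually used in the project.

```python
import math

# Hypothetical centroids in a (mean GSR in uS, mean HR in bpm, skin temp in C)
# feature space. These numbers are illustrative, not measured values.
CENTROIDS = {
    "calm":   (2.0, 65.0, 33.5),
    "stress": (8.0, 95.0, 32.0),
    "joy":    (5.0, 85.0, 34.0),
}

def features(gsr_samples, hr_samples, temp_c):
    """Reduce raw sample streams to a small feature vector (means + temperature)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(gsr_samples), mean(hr_samples), temp_c)

def classify(feat):
    """Nearest-centroid rule: pick the label whose centroid is closest."""
    return min(CENTROIDS, key=lambda label: math.dist(CENTROIDS[label], feat))

# Example: elevated GSR and heart rate with lowered skin temperature.
state = classify(features([7.5, 8.2, 8.0], [92, 97, 94], 32.1))
print(state)  # → stress
```

In a real system the feature extraction would be far richer (spectral features of the ECG, GSR peak counts, and so on) and the centroids would be learned from labelled data rather than fixed by hand.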
The intention is to classify emotions and use this classification as a tool to anticipate emotional changes. In this way, we could modify the environment in which the user is located or even initiate a social interaction with the individual, for instance when he or she is in a smart home. Environmental modifications could be introduced through an artificial intelligence tool that communicates with the user and makes recommendations, or even persuades, based on his or her emotions.
Globally, the elderly population is increasing, according to demographic projections. According to the WHO, the number of people aged over 60 is expected to double between 2000 and 2050, and as less developed countries evolve, this trend will emerge there as well. A common societal issue arising from such rapid growth of the elderly population is the exponential demand for care (medical and otherwise). To address it, we have developed EmIR (Emotional Intelligent Robot), which provides assistance to the elderly in performing their daily activities. The main features of the robot are the following: it perceives the emotions of each individual and also calculates the social emotion of a group of people; it displays human-identifiable emotions according to the emotional states of the users; it assists users by recommending healthy activities and reminding them when to do them; and finally, it persuades users by giving them arguments that justify the recommended actions. The functionalities of the proposed robot are divided into three main modules, which provide the services that our robot uses to perceive and interact with the environment (see Figure 1). The first two modules, Emotion Detection and Emotion Display, are in charge of detecting, processing, and displaying emotions.
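One simple way to picture the "social emotion" of a group, sketched below under the assumption that each individual's emotion is a point in the PAD (pleasure, arousal, dominance) space, is to summarise the group by the centroid of those points plus a dispersion measure. This representation is an illustration of the concept, not the robot's exact model.

```python
import math

def social_emotion(pad_vectors):
    """Summarise a group of individual PAD vectors as (centroid, dispersion).

    A low dispersion means the group's members feel similarly; a high one
    means the group is emotionally divided.
    """
    n = len(pad_vectors)
    centroid = tuple(sum(v[i] for v in pad_vectors) / n for i in range(3))
    # Mean Euclidean distance of each member to the centroid.
    dispersion = sum(math.dist(v, centroid) for v in pad_vectors) / n
    return centroid, dispersion

# Three people with broadly similar (pleasant, calm) emotional states.
group = [(0.6, 0.2, 0.1), (0.5, 0.3, 0.0), (0.7, 0.1, 0.2)]
centroid, dispersion = social_emotion(group)
```

A robot using such a summary could, for example, choose a group-level action when the dispersion is low and fall back to individual interactions when it is high.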
Modular robots are mainly characterized by their ability to reconfigure their modules and change their shape. Each module of a robot is an independent entity that can be joined to other modules. This feature allows each robot to adapt its shape dynamically to changes in the environment. Currently, modular robotics is being applied in a wide range of domains; for example, modular robots are used to search for missing persons after earthquakes and to carry out space exploration. These domains need advanced virtual simulation environments, such as those provided by MAM5 and JaCalIVE, to test their implementations. Moreover, simulations such as the one presented in this paper can be used as testbeds for new adaptive algorithms, cooperative algorithms, swarm robotics, and so on.