SARDAM: Service Assistant Robot for Daily Activity Monitoring
Use this link to cite
http://hdl.handle.net/2183/26394
Unless otherwise indicated, the item's license is described as Attribution 4.0 International
Collections
- CIT-GII - Articles [39]
Metadata
Title
SARDAM: Service Assistant Robot for Daily Activity Monitoring
Date
2020
Bibliographic citation
Lamas, C.M.; Bellas, F.; Guijarro-Berdiñas, B. SARDAM: Service Assistant Robot for Daily Activity Monitoring. Proceedings 2020, 54, 3. https://doi.org/10.3390/proceedings2020054003
Abstract
[Abstract] In this work, we propose an autonomous monitoring system for the daily routine of an elderly person. SARDAM (Service Assistant Robot for Daily Activity Monitoring), as this system is called, uses a humanoid robot as the key element that interacts directly with the user. The purpose of SARDAM is to keep the user active as long as possible by suggesting, and monitoring, a series of daily tasks and healthy habits prescribed by a health professional, in order to reduce the early appearance of cognitive and motor impairment. In the current version of SARDAM we use the NAO humanoid robot, which performs natural interaction with the user through vision and speech libraries. To ensure the appropriate execution of the user's daily tasks, an emotion-detection module has been incorporated, proposing corrective tasks according to the detected emotion. SARDAM was tested in a scenario with a real user, obtaining successful results and positive feedback that encourage further work.
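The following is a minimal, illustrative Python sketch (not the authors' implementation) of the loop described above: a detected emotion is mapped to a corrective task that the NAO robot speaks aloud. The detect_emotion function, the CORRECTIVE_TASKS mapping, and the robot address are hypothetical placeholders; ALProxy and ALTextToSpeech are part of the NAOqi Python SDK.

```python
# Illustrative sketch of an emotion-driven corrective-task step on NAO.
from naoqi import ALProxy

NAO_IP, NAO_PORT = "192.168.1.10", 9559  # assumed robot address

# Hypothetical mapping from detected emotion to a suggested activity
CORRECTIVE_TASKS = {
    "sad": "How about calling a family member or going for a short walk?",
    "tired": "Let's take a break and do some light stretching.",
    "neutral": "Remember to drink a glass of water and take your medication.",
}

def detect_emotion():
    """Placeholder for the emotion-detection module
    (e.g., a facial-expression classifier on the camera feed)."""
    return "neutral"

def monitoring_step(tts):
    """Detect the user's emotion and have NAO propose a corrective task."""
    emotion = detect_emotion()
    suggestion = CORRECTIVE_TASKS.get(emotion)
    if suggestion:
        tts.say(suggestion)  # NAO speaks the suggestion aloud

if __name__ == "__main__":
    tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)
    monitoring_step(tts)
```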
Keywords
Socially assistive robotics
Artificial intelligence
NAO robot
Human-robot interaction
Emotion detection
Object detection
Speech recognition
Description
This article belongs to the Proceedings of the 3rd XoveTIC Conference
Publisher's version
Rights
Attribution 4.0 International
ISSN
2504-3900