Prototype of a technician-following robot for assistance in a social-healthcare center (SHADOW)
Funding Entity: Ministry of Science and Innovation, 2022 call for PROOF OF CONCEPT PROJECTS (PDC), funded by the State Research Agency (AEI), Government of Spain.
SHADOW is a coordinated proof-of-concept project that aims to develop a mobile, autonomous and socially aware robot designed to assist healthcare professionals during their work shifts. The robot combines advanced perception technologies, automated planning and multimodal interaction to follow a technician or caregiver, accompany them throughout their day, and execute small missions under their supervision.
The project was born as a continuation of previous work in the FACILE project and builds on robust developments such as the CORTEX cognitive architecture and the MLARAS planning framework. SHADOW advances towards the real-world validation of these technologies, designing a functional, low-cost robot, manufactured by 3D printing, and deploying it in real scenarios, such as a social-healthcare center in Cáceres (Aztide).
Coordinated by the University of Extremadura, the consortium also includes the University of Malaga, the University Carlos III of Madrid and the University of Jaen, integrating expertise in cognitive robotics, perception, planning and human-robot interaction.
SUBPROJECT 1
Design under functional constraints of a socially AWare robot (DAW)
Reference: PDC2022-133141-C21
This subproject leads the conception, design and construction of the physical robot, seeking the highest functionality at the lowest possible cost. The platform will be completely 3D printed, with affordable electronic components and an ergonomic design adapted to the healthcare environment. The design process is collaborative, involving professional caregivers in the decision-making process to ensure good acceptance of the robot. In addition, this subproject is responsible for integrating and adapting the CORTEX architecture, implementing key capabilities such as socially aware navigation, reliable technician tracking, failure recovery and multimodal communication. DAW represents the physical and integrative foundation of the SHADOW robotic system.
Principal Coordinator
Principal Investigator
SUBPROJECT 2
Perceptual system for a socially AWare robot (PAW)
Reference: PDC2022-133141-C22
PAW develops the robot's perceptual systems, which are essential for the robot to safely detect, identify and track a specific person, even in dynamic and crowded environments. Technologies such as deep neural networks, real-time local maps, visual-inertial odometry and predictive planning are employed, all adjusted to social navigation rules. The main challenge is to achieve a system capable of recognizing the leader among several people, adapting to moving obstacles and maintaining an adequate interpersonal distance. In addition, the robot will be able to perform intelligent behaviors such as searching for the leader if it loses them, or returning to its charging station. This subproject is the robot's “eye” and “instinct”.
Principal Investigators
SUBPROJECT 3
Automated Planning for a socially AWare robot (APAW)
Reference: PDC2022-133141-C23
APAW provides the deliberative component of the robot, allowing it to plan and execute complex actions based on verbal or contextual requests from the user. It uses the MLARAS framework, a hierarchical architecture for automated planning, along with a graphical interface that facilitates the creation of planning models without the need for advanced coding. This capability allows the robot to understand instructions such as “wait here” or “accompany this person”, evaluate the environment and adapt its behavior to new or unexpected situations. APAW provides the strategic intelligence that gives the robot real autonomy.
Principal Investigators
SUBPROJECT 4
Support Services for a socially AWare robot (SAW)
Reference: PDC2022-133141-C24
SAW is in charge of developing the robot's interaction and support services, focusing on accessibility, usability and suitability for users with different cognitive or sensory abilities. Multimodal interfaces are implemented, such as voice recognition, touch screens and gestures, allowing fluid and natural interaction. Work is also being done on integration with external services, such as video calls, browsers and environmental sensors. This subproject ensures that the experience of using the robot is comfortable, intuitive and safe for both professionals and patients in the healthcare environment.
Principal Investigator