Plataforma Abierta para Robótica Asistencial Social – Open Platform for Socially Assistive Robotics
|Financial entity:||Ministerio de Ciencia e Innovación|
|Principal Investigator:||Bustos García de Castro, Pablo|
OPSAR is a subproject coordinated with the THERAPIST project.
The main objective of OPSAR is to provide a mobile robot with the perceptual and cognitive abilities needed to deliver, within the constraints of current technology, a socially assistive robot. To achieve this goal, the robot should include not only the basic functionalities of a mobile agent (navigation, motion control and low-level sensory capacities), but also a substantial set of advanced, interrelated software and hardware components.
At a second level, the robot must have a range of social skills (detection and tracking of people, recognition of emotional states, and behavior/activity identification) supported by the integration of natural and intuitive communication channels such as conversational modules, facial expressions and body movements. The robot must be able to carry out certain pre-programmed tasks, but also to learn new tasks through observation and imitation. To achieve these social behaviors, which enable effective and friendly human-robot interaction, the OPSAR project proposes the definition of an open hardware/software platform for assistive social robotics. This application domain is characterized by a type of therapeutic interaction that is social, non-physical and multimodal.
The platform will consist of a mobile manipulator with two arms and an expressive head, characterized by its low cost and robustness, together with a model-driven software development framework offering quality-of-service support. Both will be built following an open-source software and hardware policy. The robot, which we call Therapist, will build on previous developments such as the robots Robex, Ursus, SMAR (U. Extremadura) and Nomada (U. Málaga), and will be optimized for therapeutic and support tasks. The framework will be based on the RoboComp middleware, developed by UEX, incorporating a number of new techniques and tools not currently available. Specifically, we will research model-driven design tools that transform high-level designs into source code or into formal models for verification.
Basic functionalities for navigation, manipulation and affective expressiveness will be designed on top of this framework. To develop this new set of functionalities, we will draw on previous experience in SLAM (UMA, Citic, UEX), high-level modeling using laser and stereoscopic vision (UEX), local navigation and path planning (UEX), therapeutic training with manipulators (UEX, HU Virgen del Rocío) and the expressive head prototype Muecas (UEX). The contributions of this project will result in a mobile robot with a basic set of skills that will be extended by the other technological subprojects.
This robot will run an innovative robotic assistive training program that will be experimentally validated at the HUVR. This program poses as a scientific challenge the beneficial use of robots in neurorehabilitation therapies. Three use cases will be considered: patients with head trauma, patients with motor deficits in the upper limbs, and children with communication difficulties due to cerebral palsy.