The future of robotics lies in systems that can work naturally with humans. Wearable robotics and human augmentation are fields where this seamless connection is fundamental, since the robot is physically attached to the user. From making and fitting a prosthesis to designing adaptive controllers, my work involves considering the user's needs and the way they interact with their devices.
Beyond that work, I'm interested in exploring emotional robotics concepts to create better interactive experiences with robots. I have focused on creating robotic assistants that use emotion-based approaches to help treat body-focused repetitive behaviors. As part of a beta-development collaboration with Anki for Dr. Sonia Chernova's HRI course, I implemented behaviors based on pattern recognition for the robotic toy Cozmo. My work, presented at the Conference on Computer and Robot Vision (CRV), uses a two-axis temperament scale that represents the robot's emotional state as an x-y coordinate and triggers reactions accordingly. We can even apply these ideas to agents without physical embodiment, for example in wearable UX systems, as we did for smartwatches.
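The core idea of the two-axis temperament scale can be illustrated with a minimal sketch. Everything below is hypothetical: the axis names (valence and arousal are a common choice in affective computing), thresholds, and reaction labels are my illustrative assumptions, not the scheme from the CRV paper.

```python
# Hypothetical sketch of a two-axis temperament scale: the robot's
# emotional state is a point (x, y), and each quadrant of the plane
# maps to a reaction. Axis names and labels are illustrative only.

def pick_reaction(x: float, y: float) -> str:
    """Map an emotion coordinate (e.g. valence x, arousal y) to a reaction."""
    if x >= 0 and y >= 0:
        return "excited"   # positive valence, high arousal
    if x < 0 and y >= 0:
        return "agitated"  # negative valence, high arousal
    if x < 0:
        return "sad"       # negative valence, low arousal
    return "calm"          # positive valence, low arousal

print(pick_reaction(0.7, 0.4))    # excited
print(pick_reaction(-0.5, -0.3))  # sad
```

Because the state is continuous, stimuli can nudge the coordinate gradually, letting the robot's mood drift between reactions rather than switching abruptly.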
UX in robotics is an exciting research area and one that I'm very passionate about. For Spring 2021, I was recruited as a Ph.D. intern at Facebook Reality Labs. Over six months, I will apply the expertise I have gained in machine learning and biomechanics to evaluate and improve hand-tracking algorithms for AR/VR systems. This experience will open further research possibilities: combining an understanding of human motion with robotic manipulation, exploring user perception, and developing solutions in real, virtual, or mixed settings.