XR Robotics
XR interfaces and immersive human-in-the-loop simulation for robotics
In the future, how will we work with robots and intelligent, autonomous machines? How will we communicate with them and control them? How can we easily understand their intentions, beliefs and actions? How will we teach them to understand us? How can we test and validate them before deploying them in the field?
Here at IXLAB we use immersive mixed-reality simulations to address such questions.
To begin with, robots and autonomous machines are trained and tested in virtual worlds. Using mixed-reality interfaces, real humans can be immersed in the virtual simulation, allowing "human-in-the-loop" machine learning, prototyping and testing to take place safely. Once validated, elements of the real world can be added to the simulation, progressively transitioning from a virtual robot in a virtual simulation to a real robot in the real world.
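The sketch below illustrates the progressive virtual-to-real transition in the simplest possible terms. It is not IXLAB code: the class names (VirtualRobot, RealRobot), the single blending parameter "realism" and the toy distance sensor are all assumptions made purely for illustration, standing in for a full mixed-reality simulation pipeline.

```python
# Illustrative sketch only: a "realism" parameter blends a simulated sensor
# with a (here, faked) real one, moving from pure simulation (0.0) to the
# real world (1.0), with a human decision kept in the loop at every step.
from dataclasses import dataclass
import random


@dataclass
class Observation:
    distance_to_goal: float  # metres, as perceived by the robot


class VirtualRobot:
    """Fully simulated robot: cheap, safe and repeatable."""
    def sense(self) -> Observation:
        return Observation(distance_to_goal=random.uniform(0.0, 5.0))


class RealRobot:
    """Stand-in for a physical robot; here it just adds sensor noise."""
    def sense(self) -> Observation:
        return Observation(distance_to_goal=random.uniform(0.0, 5.0) + random.gauss(0.0, 0.1))


def blended_observation(virtual: Observation, real: Observation, realism: float) -> Observation:
    """Mix virtual and real sensing; realism=0 is pure simulation, 1 is pure reality."""
    d = (1.0 - realism) * virtual.distance_to_goal + realism * real.distance_to_goal
    return Observation(distance_to_goal=d)


def human_in_the_loop_step(obs: Observation) -> str:
    """Placeholder for an immersed human's decision, e.g. made via an XR controller."""
    return "stop" if obs.distance_to_goal < 0.5 else "advance"


if __name__ == "__main__":
    virtual, real = VirtualRobot(), RealRobot()
    # Progressively shift from a virtual robot in a virtual world (realism=0)
    # towards a real robot in the real world (realism=1).
    for realism in (0.0, 0.25, 0.5, 0.75, 1.0):
        obs = blended_observation(virtual.sense(), real.sense(), realism)
        command = human_in_the_loop_step(obs)
        print(f"realism={realism:.2f}  distance={obs.distance_to_goal:.2f} m  command={command}")
```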
MR interfaces are also well suited to communicating with and controlling robots, both virtual and real. Telepresence allows human operators to see "through the eyes" of a robot, while immersive visualisations make it easy to display how the robot perceives and understands its environment.
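As a rough sketch of the visualisation idea, the snippet below turns a robot's object detections into text labels that could be rendered over its camera feed in the operator's headset. It is not IXLAB code; the Detection and OverlayLabel structures, the confidence threshold and the rendering step are hypothetical placeholders for a real perception stack and XR renderer.

```python
# Illustrative sketch only: map the robot's perception output (hypothetical
# object detections) to overlay labels for display over its camera view.
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str
    confidence: float  # 0..1, from the robot's perception stack
    x: int             # pixel coordinates of the bounding-box centre
    y: int


@dataclass
class OverlayLabel:
    text: str
    x: int
    y: int


def build_overlay(detections: List[Detection], min_confidence: float = 0.5) -> List[OverlayLabel]:
    """Turn the robot's detections into labels to render over its camera feed."""
    return [
        OverlayLabel(text=f"{d.label} ({d.confidence:.0%})", x=d.x, y=d.y)
        for d in detections
        if d.confidence >= min_confidence
    ]


if __name__ == "__main__":
    detections = [
        Detection("person", 0.92, x=320, y=180),
        Detection("chair", 0.41, x=500, y=300),  # below threshold, not shown
    ]
    for label in build_overlay(detections):
        print(f"render '{label.text}' at ({label.x}, {label.y})")
```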
Partners
Prof Aßmann, Chair for Software Technology, TU Dresden
Student Projects
Mixed Reality for Explainability of Robotic Systems, Leopold Piribauer, 2024