IRIS: An Immersive Robot Interaction System

Xinkai Jiang1,*, Qihao Yuan2, Enes Ulas Dincer1, Hongyi Zhou1, Ge Li1, Xueyin Li1, Xiaogang Jia1, Timo Schnizer1, Nicolas Schreiber1, Weiran Liao1, Julius Haag1, Kailai Li2, Gerhard Neumann1, Rudolf Lioutikov1
1 Karlsruhe Institute of Technology 2 University of Groningen
CoRL 2025

Abstract

This paper introduces IRIS, an Immersive Robot Interaction System leveraging Extended Reality (XR). Existing XR-based systems enable efficient data collection but are often challenging to reproduce and reuse because they are tailored to particular robots, objects, simulators, and environments. IRIS addresses these issues by supporting immersive interaction and data collection across diverse simulators and real-world scenarios. It visualizes arbitrary rigid and deformable objects and robots from simulation, and integrates real-time point clouds from sensors for real-world applications. Additionally, IRIS enhances collaboration by enabling multiple users to interact simultaneously within the same virtual scene. Extensive experiments demonstrate that IRIS offers efficient and intuitive data collection in both simulated and real-world settings.

Video Presentations

Data Collection in Different Frameworks and Simulators

Libero

Meta-World

Robocasa

Deformable

Data Collection for Different Embodiments

Aloha

Humanoid

More Options

More Applications

Collaborative

RL Agent

Real Robot

BibTeX

@inproceedings{jiang2025iris,
  title={IRIS: An Immersive Robot Interaction System},
  author={Jiang, Xinkai and Yuan, Qihao and Dincer, Enes Ulas and Zhou, Hongyi and Li, Ge and Li, Xueyin and Haag, Julius and Schreiber, Nicolas and Li, Kailai and Neumann, Gerhard and others},
  booktitle={9th Annual Conference on Robot Learning},
  year={2025}
}