IRIS: An Immersive Robot Interaction System
Abstract
This paper introduces IRIS, an Immersive Robot Interaction System leveraging Extended Reality (XR). Existing XR-based systems enable efficient data collection but are often challenging to reproduce and reuse due to their specificity to particular robots, objects, simulators, and environments. IRIS addresses these issues by supporting immersive interaction and data collection across diverse simulators and real-world scenarios. It visualizes arbitrary rigid and deformable objects and robots from simulation, and integrates real-time sensor-generated point clouds for real-world applications. Additionally, IRIS enhances collaborative capabilities by enabling multiple users to interact simultaneously within the same virtual scene. Extensive experiments demonstrate that IRIS offers efficient and intuitive data collection in both simulated and real-world settings.
Video Presentations
Data Collection in Different Frameworks and Simulators
LIBERO
Meta-World
RoboCasa
Deformable
Data Collection for Different Embodiments
ALOHA
Humanoid
More Options
More Applications
Collaborative
RL Agent
Real Robot
BibTeX
@inproceedings{jiang2025iris,
title={IRIS: An Immersive Robot Interaction System},
author={Jiang, Xinkai and Yuan, Qihao and Dincer, Enes Ulas and Zhou, Hongyi and Li, Ge and Li, Xueyin and Haag, Julius and Schreiber, Nicolas and Li, Kailai and Neumann, Gerhard and others},
booktitle={9th Annual Conference on Robot Learning},
year={2025}
}