GaussianNexus: Room-Scale Real-Time AR/VR Telepresence with Gaussian Splatting (Conditionally Accepted)

Published in The 38th Annual ACM Symposium on User Interface Software and Technology (UIST ’25), 2025

Recommended citation: Xincheng Huang, Dieter Frehlich, Ziyi Xia, Peyman Gholami, and Robert Xiao. 2025. GaussianNexus: Room-Scale Real-Time AR/VR Telepresence with Gaussian Splatting. In The 38th Annual ACM Symposium on User Interface Software and Technology (UIST ’25), September 28–October 1, 2025, Busan, Korea. ACM, New York, NY, USA, 17 pages. https://doi.org/10.1145/3746059.3747693

Abstract: AR/VR telepresence systems immerse a remote user in a local physical environment, enabling virtual travel, remote guidance, and collaborative design. Contemporary systems typically rely on 360° video or RGB-D reconstruction, each trading off visual fidelity against spatial perception. Emerging rendering techniques like Gaussian Splatting unify these strengths, offering photo-realistic scene representations with spatial interactivity. However, the long training times these techniques require make updating such scenes in real time largely infeasible. We present GaussianNexus, a system that applies Gaussian Splatting to room-scale telepresence. Our system uses Gaussian Splatting as the primary scene representation and a 360° camera to stream and track 2D and 3D dynamic changes. For live 2D interaction, the system overlays rectified video onto user-selected surfaces. For live 3D interaction, users identify dynamic objects in the environment, which are then segmented, tracked, and synchronized as real-time updates to the Gaussian Splatting environment, enabling smooth, low-latency telepresence without retraining. We demonstrate the utility of GaussianNexus through two example applications and evaluate it in a usability test.
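To make the two live-update mechanisms in the abstract more concrete, here are two minimal sketches in Python. They are illustrative only, not the paper's implementation; all function names (`overlay_rectified_view`, `update_tracked_object`, and the quaternion helpers) are hypothetical.

The first sketch shows one way the 2D path could work: given four user-selected corner pixels of a planar surface, a homography rectifies that region of the camera frame into a texture for the surface. It assumes the input is already a perspective view; a real equirectangular 360° frame would first need reprojection to a perspective crop.

```python
import numpy as np
import cv2

def overlay_rectified_view(frame, corners_px, size_m, px_per_m=512):
    """Warp a user-selected quadrilateral (four pixel corners, ordered
    top-left, top-right, bottom-right, bottom-left) into a rectified
    texture sized to the physical surface (width, height) in meters."""
    w, h = int(size_m[0] * px_per_m), int(size_m[1] * px_per_m)
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(np.float32(corners_px), dst)
    return cv2.warpPerspective(frame, H, (w, h))
```

The second sketch illustrates why 3D updates can avoid retraining: if a segmented object's Gaussians are known and the tracker yields a rigid transform (R, t) per frame, the update is just a transform of those Gaussians' means and orientations, leaving the rest of the scene untouched.

```python
import numpy as np

def rotmat_to_quat(R):
    # Convert a 3x3 rotation matrix to a (w, x, y, z) quaternion.
    w = np.sqrt(max(0.0, 1.0 + R[0, 0] + R[1, 1] + R[2, 2])) / 2.0
    x = np.copysign(np.sqrt(max(0.0, 1.0 + R[0, 0] - R[1, 1] - R[2, 2])) / 2.0, R[2, 1] - R[1, 2])
    y = np.copysign(np.sqrt(max(0.0, 1.0 - R[0, 0] + R[1, 1] - R[2, 2])) / 2.0, R[0, 2] - R[2, 0])
    z = np.copysign(np.sqrt(max(0.0, 1.0 - R[0, 0] - R[1, 1] + R[2, 2])) / 2.0, R[1, 0] - R[0, 1])
    return np.array([w, x, y, z])

def quat_multiply(q, qs):
    # Hamilton product q * qs for a batch of quaternions qs with shape (N, 4).
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = qs[:, 0], qs[:, 1], qs[:, 2], qs[:, 3]
    return np.stack([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ], axis=1)

def update_tracked_object(means, quats, object_mask, R, t):
    """Apply a tracked rigid transform (R, t) to the Gaussians of one
    segmented object. Rotating each Gaussian's orientation quaternion
    keeps anisotropic splats aligned; no retraining is involved."""
    means[object_mask] = means[object_mask] @ R.T + t
    quats[object_mask] = quat_multiply(rotmat_to_quat(R), quats[object_mask])
    return means, quats
```

In this framing, a per-frame update touches only the tracked object's parameters, which is what makes low-latency synchronization plausible compared with re-optimizing the whole scene.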

Paper: Coming Soon :-)

Demo: