Visual SLAM is Oculus’ solution to room-scale VR.

Ken Wang
4 min read · May 17, 2019

Update 09/14/2019

Facebook recently released a technical blog post on Oculus Insight’s visual-inertial SLAM, which confirmed the analysis in this article, including my prediction that an IMU serves as the “inertial” part of the system. In addition, a 2016 Facebook post detailed the first generation of the SLAM system with direct references to ORB-SLAM, SVO, and LSD-SLAM. Both articles are great technical reads, with system architecture diagrams and solutions to the accuracy and efficiency problems of running SLAM on an embedded device.
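The “inertial” half of a visual-inertial system is conceptually simple: between camera frames, the headset integrates gyroscope and accelerometer readings to propagate its pose at high rate, and the visual tracker periodically corrects the drift. The sketch below illustrates that dead-reckoning step under simplifying assumptions (Euler integration, noise-free measurements, gravity known); it is an illustrative example, not Oculus’ actual implementation, and the function name is my own.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (m/s^2), an assumed convention

def propagate_imu_state(p, v, R, gyro, accel, dt):
    """One IMU dead-reckoning step.

    p     : position in the world frame (3,)
    v     : velocity in the world frame (3,)
    R     : rotation matrix, body -> world (3, 3)
    gyro  : angular velocity in the body frame (rad/s)
    accel : accelerometer reading (specific force) in the body frame (m/s^2)
    dt    : time step (s), e.g. 1/1000 for a 1 kHz IMU
    """
    # Rotation update via the exponential map (Rodrigues' formula)
    omega = gyro * dt
    theta = np.linalg.norm(omega)
    if theta > 1e-9:
        k = omega / theta
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])
        dR = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    else:
        dR = np.eye(3)
    R_new = R @ dR

    # Rotate the measured specific force into the world frame and add gravity
    # back (an accelerometer at rest reads +g upward, so the sum is zero).
    a_world = R @ accel + GRAVITY

    # Constant-acceleration kinematics over the small step dt
    p_new = p + v * dt + 0.5 * a_world * dt ** 2
    v_new = v + a_world * dt
    return p_new, v_new, R_new
```

Because small gyro and accelerometer biases compound quadratically in position, integration like this drifts within seconds on its own; that is exactly why the visual tracker is needed to anchor the pose, and why the fused system can be both low-latency (IMU rate) and drift-free (camera rate).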

The HTC Solution

Room-scale VR has long been a work in progress, with manufacturers trying different combinations of hardware and wiring to let the system track the user within a room.

To support a fully immersive room-scale VR experience, the HTC VIVE has to bundle lighthouse base stations along with the headset to accurately report the headset’s position and orientation. The system also tracks the motion of the VIVE Controllers, typically held in the user’s hands, and VIVE Trackers, which can be attached to any physical accessory or controller. The system works exceptionally well once everything is set up, and it is quite accurate, but it requires wires, and tracking is always limited to the physical range covered by the lighthouse stations.
