A team of Google researchers has created a new technology that takes immersive augmented reality (AR) and virtual reality (VR) experiences to a new level. Their research demonstrates the ability to record, reconstruct, compress, and deliver high-quality immersive light field video lightweight enough to be streamed over regular Wi-Fi.
"We're making this technology practical, bringing us closer to delivering a truly immersive experience to more consumer devices," said Michael Broxton, Google research scientist. Photos and videos play a huge role in our day-to-day experience on mobile devices, and "we are hoping that someday immersive light field images and videos will play an equally important role in future AR and VR platforms," he added.
Wide-field-of-view scenes can be recorded and played back with the ability to move around within the video after it has been captured, revealing new perspectives. In recent years, the immersive AR/VR field has captured mainstream attention for its promise to give people a truly authentic experience in a simulated environment.
Although the field is still nascent, the team at Google has addressed important challenges, making major research headway in immersive light field video. A key breakthrough in this work involves data compression.
The idea is not only to develop a system capable of reconstructing video for a truly immersive AR/VR experience but also to access the experience via consumer AR and VR headsets and displays, and even in a web browser. The new system compresses light field video while still preserving its original visual quality and it does so using conventional texture atlasing and widely supported video codecs.
In essence, the Google researchers have succeeded at bootstrapping a next-generation media format off of today's image and video compression techniques.
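To illustrate the texture-atlasing idea described above, the sketch below packs a grid of per-view images into a single atlas image that a conventional image or video codec could then compress. This is a minimal, hypothetical example (the function name, layout, and use of NumPy are assumptions for illustration), not Google's actual pipeline.

```python
import numpy as np

def pack_texture_atlas(views, cols):
    """Pack equally sized per-view images into one atlas image.

    Laying the views out side by side lets a standard codec compress
    them as a single frame, exploiting redundancy between neighboring
    views. Illustrative sketch only, not the researchers' real system.
    """
    h, w, c = views[0].shape
    rows = -(-len(views) // cols)  # ceiling division
    atlas = np.zeros((rows * h, cols * w, c), dtype=views[0].dtype)
    for i, view in enumerate(views):
        r, col = divmod(i, cols)
        atlas[r * h:(r + 1) * h, col * w:(col + 1) * w] = view
    return atlas

# Example: six tiny 4x4 RGB "camera views" packed into a 2x3 atlas grid
views = [np.full((4, 4, 3), v, dtype=np.uint8) for v in range(6)]
atlas = pack_texture_atlas(views, cols=3)
print(atlas.shape)  # (8, 12, 3)
```

Because the atlas is an ordinary image, each atlas frame of a light field video can be fed to widely supported codecs such as H.264 or VP9, which is what makes delivery over standard connections feasible.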
"Users will be able to stream this light field video content over a typical, high-speed internet connection," Broxton said. "Overcoming this problem opens up this technology to a much wider audience."
The research team, led by Broxton and Paul Debevec, Google senior staff engineer, plans to demonstrate the new system at the virtual SIGGRAPH 2020 conference. "Completing this project feels like we've overcome a major obstacle in making virtual experiences realistic, immersive, distributable, and comfortable," added Debevec.