

Giving 100%: What An Award Winning Documentary Tells Us About the Future of VR

“THE 100%” was one of the most compelling and most-watched documentaries at last year’s Tribeca Film Festival. The film told the story of Maggie Kudirka, an enormously talented 23-year-old ballerina with the Joffrey Concert Group, whose career was derailed by metastatic breast cancer. Maggie’s story was told through an immersive virtual reality documentary, and with this first-of-its-kind approach, Springbok Entertainment broke new ground in immersive storytelling. The documentary earned a TribecaX Award for Best VR Experience, an Emmy nomination, a VR Social Impact Award, and many more accolades.

Setting the Stage with Reality Capture

The setting for the documentary was the historic Orpheum Theater in Vancouver, with reality capture technology cast in the best supporting role. The technology helped create an immersive viewer experience and was arguably a key factor in the recognition the documentary earned.

To set expectations, or more accurately, to set no expectations, we didn’t tell our early test viewers how the documentary was made. Instead, they put on a VR headset and were virtually transported to the stage, where they felt as if they were watching a private ballet performance by Maggie.

This type of viewer immersion helped evoke a greater sense of empathy for the ballerina, as evidenced by the many people who were moved to tears after seeing the documentary.

Since “The 100%” was the first immersive documentary of its kind, many aspects of the production were theoretical or instinctual. When it came to selecting technology to create the film, here’s what we learned.

First, we needed to capture all the details of the stage prior to filming. To do this, we thoroughly researched available products. We selected Microsoft’s volumetric capture technology for production work, along with two Leica BLK360 imaging laser scanners, which enabled us to easily capture every millimeter of the theater in 3D. We captured more than 60 high-quality scans with the BLK360s, which provided an incredible amount of detail inside the auditorium and the lobby. In fact, we captured far more data than we needed, simply to have a massive data set for fine-tuning our processing pipeline during production.

Complementing the BLK360’s scans, we used thousands of panoramic HDRI photos of the theater, photogrammetry, and volumetric video – a new category of immersive technology that helps create personal, beautiful, and engaging viewing experiences. The high-quality HDRI panoramas were used to texture the 3D mesh we created from the BLK360 point-cloud data.
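The article doesn’t detail how the panoramas were mapped onto the mesh; as a minimal, hypothetical sketch of the core lookup in that kind of texturing step – not Springbok’s actual pipeline – here is how a mesh vertex could be mapped to a pixel coordinate in an equirectangular HDRI captured at a known position:

```python
import numpy as np

def equirect_uv(points, pano_center):
    """Map 3D points to (u, v) texture coordinates in an equirectangular
    panorama captured at pano_center.

    u in [0, 1) wraps around the horizon (azimuth); v in [0, 1] runs from
    zenith (straight up) to nadir (straight down). Assumes Z is up.
    """
    # Direction from the panorama's capture position to each point.
    d = points - pano_center
    d = d / np.linalg.norm(d, axis=1, keepdims=True)

    # Azimuth angle in [-pi, pi], measured in the XY plane.
    theta = np.arctan2(d[:, 1], d[:, 0])
    # Polar angle in [0, pi], measured from the +Z (up) axis.
    phi = np.arccos(np.clip(d[:, 2], -1.0, 1.0))

    u = (theta + np.pi) / (2.0 * np.pi)
    v = phi / np.pi
    return np.stack([u, v], axis=1)
```

In a real workflow, each mesh vertex (or texel) would sample color from whichever registered panorama sees it best, which also requires occlusion testing against the LiDAR geometry; the function above only shows the projection itself.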

Then we added volumetric video captures to the LiDAR and HDRI images to create a whole new level of immersion.

Creating a Reality Capture Workflow

In the early days of VFX, we used survey equipment, blueprints, reference photography, and guesswork to create a 3D model matching a real location. We always knew LiDAR was a better option, but for a long time it was considered a luxury. In recent years, however, it has become far more accessible, as we realized when using the BLK360.

The imaging laser scanner allows us to quickly and affordably capture billions of 3D points at a location like the Orpheum Theater. For example, we no longer have to shut down a location to scan it, because the scanner is lightweight and portable.

Photogrammetry let us capture more detail in less time, but acquiring thousands of sharp, well-exposed photographs under varying lighting has always been a challenge. The error inherent in photogrammetry has also been a problem for every 3D department, because correcting it used to require a lot of manual intervention.

LiDAR has become the cornerstone of our reality capture workflow, with photogrammetry and HDRI images contributing texture detail but not the underlying geometry. We worked hard to bring these disparate technologies together into a single, coherent workflow that combines the accuracy of LiDAR with the detail of photography to create “THE 100%.”

The Future of Reality Capture for VR in Documentaries

Producers and viewers are finding the use of reality capture in filmmaking more impactful due to what’s known as the “transportation effect.” It’s been likened to the difference between thinking about something you watched and remembering something you experienced. The impact on the viewer is more powerful because the film is able to create what feels like the viewer’s own memories, making the viewer feel as if they are actually in the film.

Today, more realistic VR experiences are being created using reality capture and LiDAR. The sense of being in a real location is effective at helping a viewer feel like they are truly immersed in another perspective and place. In turn, this leads to higher engagement and a more memorable experience.

True-to-scale experiences help generate and establish empathy with an audience. This is becoming increasingly more important in storytelling and for documentaries in particular.

Without the ability to capture the real world in such detail and quality, the viewer’s experience would have fallen flat, feeling more like a video game than a cinematic reality.

In the entertainment business, we often strive to distract and amuse an audience. Yet with “THE 100%,” we knew we had a rare opportunity to create art that will have a measurable impact on those dealing with cancer. By creating this immersive experience, we were able to inspire viewers to donate to SU2Cancer (Stand Up to Cancer), a partner in this project.
