First Fader Use Case with The Center for Investigative Reporting: “Disfellowshipped”

We have been working with the Center for Investigative Reporting for months on a first Fader use case. The outcome is incredible: a three-episode VR mini-series about Debbie McDaniel, a “woman with an extraordinary past.”

The idea of a tool for journalists that would let them quickly and easily turn their reporting into VR experiences was born back in 2015, when our Stephan and Linda went to a TechRaking event hosted by the Center for Investigative Reporting (CIR) and supported by the Google News Lab.


To develop a tool like Fader, we needed to learn more about journalistic requirements, so we started working closely with CIR reporter Trey Bundy and senior supervising editor David Ritsher. Trey was working on an investigation into Jehovah’s Witnesses and child sexual abuse claims. This is the story; you can watch it on your Android device or in your desktop browser (try Chrome or Firefox).

“Disfellowshipped,” a three-episode VR mini-series


Trey was interested in experimenting with an investigative story in VR because it can

“give the viewer a more intimate understanding of a character and her experience. The technology allows us to put you in the reporter’s shoes, to feel what it’s like to sit with people as they look you in the eye and tell you their story, to visit their towns and the places that affected their lives. In some instances, it becomes a window into a person’s emotional memory.”

During a cold winter weekend in February 2016, Trey met with Linda and Stephan to begin exploring what has by now become the episodic 360° series Disfellowshipped. They dug through text pieces, images, and audio snippets, trying to align it all into a 360° visual concept, a challenge Trey more than once compared to making a movie without ever having been to a cinema.

Technical progress

Stephan had taken his first steps into creating VR with the game engine Unity 3D. To support Linda and Trey in expressing their ideas, he needed a way to prototype rapidly and test quickly. That’s when A-Frame came into play: it lets you create VR scenes with just a few lines of markup, so the team could express an idea in minutes and instantly check the result on an attached Oculus Rift.

Our requirements were to support spherical images and videos, 2D photographs, audio, and text in a scenic sequence. A-Frame was the perfect framework for that, since it encapsulates the power of Three.js in a developer-friendly way. We were able to use most of the existing components, and the React wrapper created and maintained by Kevin Ngo was especially helpful.
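To give a feel for how these media types combine, here is a minimal A-Frame scene sketch. The library version and all asset file names are placeholders for illustration, not from the actual production:

```html
<!-- Minimal A-Frame scene sketch: spherical video, 2D photo, text, audio.
     Asset file names below are placeholders. -->
<script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
<a-scene>
  <a-assets>
    <video id="sphere" src="episode1-360.mp4" autoplay loop></video>
    <img id="photo" src="portrait.jpg">
    <audio id="narration" src="narration.mp3"></audio>
  </a-assets>
  <!-- spherical video as the surrounding environment -->
  <a-videosphere src="#sphere"></a-videosphere>
  <!-- a flat 2D photograph placed inside the scene -->
  <a-image src="#photo" position="0 1.6 -2" width="1.6" height="1"></a-image>
  <!-- caption text -->
  <a-text value="Disfellowshipped" position="0 2.4 -2" align="center"></a-text>
  <!-- audio attached to an entity -->
  <a-entity sound="src: #narration; autoplay: true"></a-entity>
</a-scene>
```

Each element is a few lines of declarative markup, which is what made this stack so fast to prototype with.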

The challenging parts came when we decided not to preload the full video but to stream the content instead. Getting this to work across a range of devices (unfortunately, iOS is still left out of the game, since it does not render the video texture) took several weeks. And since we don’t just play a spherical video but have timed elements appearing in between, we needed events to fire at certain progress points of the streamed video, keeping everything in sync even while the video buffers. Thanks to A-Frame, we were able to trigger animations from those events.
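One way to sketch this syncing approach in plain JavaScript: derive overlay events from the video’s reported playback position rather than from wall-clock timers, so buffering simply delays them instead of desynchronizing them. Cue names and times here are illustrative, not from the project:

```javascript
// Timed overlay cues, keyed to playback position in seconds (illustrative).
const cues = [
  { time: 5,  name: 'show-caption' },
  { time: 12, name: 'show-photo' },
  { time: 30, name: 'fade-to-next-scene' },
];

// Return the cues that should fire when playback advances from
// `previousTime` to `currentTime`. While the video buffers,
// `currentTime` stops advancing, so nothing fires early.
function dueCues(cues, previousTime, currentTime) {
  return cues.filter(c => c.time > previousTime && c.time <= currentTime);
}

// In the page this would be driven by the <video> element's standard
// 'timeupdate' event, emitting events that A-Frame animations listen for:
//
//   let last = 0;
//   video.addEventListener('timeupdate', () => {
//     for (const cue of dueCues(cues, last, video.currentTime)) {
//       sceneEl.emit(cue.name);
//     }
//     last = video.currentTime;
//   });
```

The key design choice is that the cue check is a pure function of playback position, which keeps the timed elements correct regardless of how irregularly the streamed video progresses.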

Finally, once we had our two episodes and the epilogue put together, creating the menu with gaze-based interaction was an easy task.
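A gaze-based menu in A-Frame can be sketched roughly like this, with a fusing cursor attached to the camera; the thumbnail file names, ids, and routing below are hypothetical:

```html
<!-- Gaze-based menu sketch: the cursor "fuses" after the viewer stares
     at an entity long enough, firing a 'click' on it.
     File names, ids, and URLs are placeholders. -->
<a-scene>
  <a-entity camera look-controls>
    <!-- fuse="true" clicks automatically after fuse-timeout milliseconds -->
    <a-cursor fuse="true" fuse-timeout="1500"></a-cursor>
  </a-entity>
  <a-image id="episode-1" src="ep1-thumb.jpg" position="-1.5 1.6 -3"></a-image>
  <a-image id="episode-2" src="ep2-thumb.jpg" position="0 1.6 -3"></a-image>
  <a-image id="epilogue"  src="epilogue-thumb.jpg" position="1.5 1.6 -3"></a-image>
  <script>
    // Navigate when a thumbnail receives the gaze-triggered 'click'.
    document.querySelectorAll('a-image').forEach(el =>
      el.addEventListener('click', () => {
        window.location.href = '/' + el.id; // placeholder routing
      }));
  </script>
</a-scene>
```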

At Vragments we are also using A-Frame as the rendering framework for our VR tool Fader.

