I am proud to be a part of the virtual production team at MAGIC Spell Studios at RIT. The project began with a group of faculty interested in exploring virtual production and bringing that knowledge into the classroom. The team applied for and received an Epic MegaGrant; at the time, it was the largest grant Epic had awarded to an institution. With those funds we brought a large LED wall to the MAGIC Spell Studios sound stage.
Once the wall was installed in the studio, Tim Stringer, Emily Haldeman, and I worked to figure out how to connect Unreal Engine to the LED wall and track the camera. Tracking is needed so that when the camera moves, the perspective rendered on the LED wall updates to maintain the illusion of depth.
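The perspective update described above comes down to an off-axis (asymmetric) projection: the tracked camera position is the apex of a view frustum whose base is the LED wall, so moving the camera skews the frustum rather than rotating the image. Here is a minimal Python sketch of that geometry; the function name and the wall-centered-at-origin layout are my own illustration, not Unreal's API:

```python
def off_axis_frustum(eye, wall_half_w, wall_half_h, near=0.1):
    """Asymmetric view-frustum bounds for an eye looking at a wall.

    The wall is a rectangle centered at the origin in the z=0 plane,
    spanning [-wall_half_w, wall_half_w] by [-wall_half_h, wall_half_h];
    the eye sits at eye = (x, y, z) with z > 0. Returns the (left,
    right, bottom, top) extents at the near plane, the inputs to a
    glFrustum-style off-axis projection.
    """
    ex, ey, ez = eye
    scale = near / ez  # project the wall edges onto the near plane
    left = (-wall_half_w - ex) * scale
    right = (wall_half_w - ex) * scale
    bottom = (-wall_half_h - ey) * scale
    top = (wall_half_h - ey) * scale
    return left, right, bottom, top

# Centered eye -> symmetric frustum
l, r, b, t = off_axis_frustum((0.0, 0.0, 3.0), 2.0, 1.0)
# Eye shifted right -> the frustum skews, which is exactly the
# perspective shift a tracked camera produces on the wall
l2, r2, b2, t2 = off_axis_frustum((1.0, 0.0, 3.0), 2.0, 1.0)
```

When the tracker reports a new camera position, recomputing these four numbers each frame is what keeps the wall's imagery lined up with the real camera's point of view.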
Unreal Engine then updated from version 4.26 to 4.27, and what we had learned about putting images on the wall changed completely. It took me another two weeks of studying the new workflow and process to get the proper image and tracking on the wall. The change was for the better, but because the release was so new, only a small amount of help documentation existed to get us back on track.
- Helped set up the LED wall and large-screen projection
- Researched and implemented a workflow for tracking the camera
- Researched and implemented a workflow for projecting Unreal to the LED wall
- Cleaned geometry on a virtual electric bike to prepare it for texturing
Some of my first tests trying to get nDisplay to work on a monitor are shown below. I worked through many problems getting the images to show on the screen and interact with the real-world camera. Once I got the scene to show on the screen, I needed the Oculus controller to drive the virtual camera as I moved it together with my cell phone camera. This took several weeks to get working right.
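Conceptually, rigging the Oculus controller to the phone means composing two transforms every frame: the tracked controller pose and a fixed controller-to-lens offset measured once during calibration. A toy Python sketch of that composition, simplified to yaw-only rotation; the function name and conventions are mine for illustration, not the Oculus or Unreal APIs:

```python
import math

def controller_to_camera(ctrl_pos, ctrl_yaw_deg, lens_offset):
    """Compose a tracked controller pose with a fixed calibration
    offset to get the virtual camera position.

    ctrl_pos:     controller position in tracking space (x, y, z)
    ctrl_yaw_deg: controller heading (yaw only, for simplicity)
    lens_offset:  phone-lens position in the controller's local frame
    """
    yaw = math.radians(ctrl_yaw_deg)
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    ox, oy, oz = lens_offset
    # Rotate the local offset into tracking space (yaw about +y),
    # then translate by the controller position.
    world_offset = (cos_y * ox + sin_y * oz, oy, -sin_y * ox + cos_y * oz)
    return tuple(p + o for p, o in zip(ctrl_pos, world_offset))

# Controller at the origin, facing forward: the camera sits at the
# raw calibration offset
cam = controller_to_camera((0.0, 0.0, 0.0), 0.0, (0.1, 0.0, 0.0))
```

The slow part in practice was not the math but finding where in the engine to apply it so the virtual camera stayed locked to the physical one.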
After I got one screen working, I needed to figure out how to span the image across two screens; the second video shows the results of that research. This enabled me to pass the knowledge on to Tim Stringer and Emily Haldeman at Optic Sky, who were preparing to work on nDisplay as well.
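The idea behind spanning, as I understand nDisplay's model, is that each physical screen renders its own viewport carved out of one shared scene, so the image stays continuous across the seam. A minimal sketch of that viewport bookkeeping (the function name is my own):

```python
def split_viewports(total_w, total_h, n_screens):
    """Split one continuous image plane into side-by-side viewports,
    one per physical screen, so a spanned image stays seamless.

    Returns a list of (x, y, width, height) tuples, one per screen.
    """
    w = total_w // n_screens
    return [(i * w, 0, w, total_h) for i in range(n_screens)]

# Two 1920x1080 screens sharing one 3840x1080 image plane
vps = split_viewports(3840, 1080, 2)
# -> [(0, 0, 1920, 1080), (1920, 0, 1920, 1080)]
```

In the real setup each viewport also carries its own screen position and rotation in physical space, but the core of spanning is this partition of one projection into per-screen regions.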
The button above links to the published case study on our experience setting up a virtual production stage. (This can also be found on the RIT site above.)
I was a key contributor to the pipeline-workflow section of this white paper. It represents our collective first efforts combining a digital LED wall with Unreal Engine for virtual production; we were working with Unreal 4.26 at the time. This is significant because the technology is cutting edge for the industry, and documentation and video examples of how to properly bring all these technologies together were very limited. With the techniques in this white paper we were able to successfully create a multi-award-winning commercial.