MAGIC Spell Studios Virtual Production
The visual effects industry is undergoing a revolutionary change in how film and TV productions are created. Virtual Production (VP) uses LED panels and real-time gaming software to create a dynamic background that is “live” during filming. VP came to prominence with the TV series The Mandalorian. I immediately recognized that this innovative approach would change the industry and began researching and studying the process, including building a miniature VP setup in my home office. I have used this knowledge to help develop new ways to use the technology, to educate our team at RIT, and to guide students toward this emerging field; two of those students are now working at Optic Sky Productions as VP artists.
I am proud to be part of the VP Team at MAGIC Spell Studios, a group of faculty from various departments in the College of Art and Design. In 2020 we applied for an Epic MegaGrant, requesting funding for research in support of a VP curriculum and for the LED panels and camera-tracking equipment needed to create a VP studio. At the time we received this $275,000 grant, it was the largest grant Epic had awarded to any institution.
Before the LED wall was installed, I began researching Epic's nDisplay plugin in preparation for putting images on a monitor (see Video 1). Because the plugin was new, little documentation was available, so learning it required trial, error, and more trials. The challenge was to display the scene on screen while it responded to a real-world camera. Once the scene was visible on screen, the next step was getting an Oculus controller to drive the virtual camera while I moved my cell phone camera together with the controller. Once this was accomplished, I spanned the image across two screens, which required further research and trials (see Video 2). The final step was sharing all of this learning and knowledge with my colleagues at Optic Sky Productions and the MAGIC Spell Studios VP Team.
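At its core, that controller experiment is a pose transform: the controller's tracked position and rotation, composed with a fixed calibration offset, become the virtual camera's world transform. The sketch below is a minimal standalone C++ illustration of that idea, not Unreal's actual API; the type names, function name, and calibration values are all hypothetical.

```cpp
#include <cstdio>

// Minimal vector/quaternion types for illustration only;
// a real project would use the engine's own math library.
struct Vec3 { double x, y, z; };

struct Quat {
    double w, x, y, z;

    // Hamilton product: (a * b) applied to a vector rotates by b first, then a.
    Quat operator*(const Quat& q) const {
        return { w*q.w - x*q.x - y*q.y - z*q.z,
                 w*q.x + x*q.w + y*q.z - z*q.y,
                 w*q.y - x*q.z + y*q.w + z*q.x,
                 w*q.z + x*q.y - y*q.x + z*q.w };
    }

    // Rotate a vector by this unit quaternion: v' = v + w*t + u x t, t = 2(u x v).
    Vec3 rotate(const Vec3& v) const {
        Vec3 u{x, y, z};
        Vec3 t{2*(u.y*v.z - u.z*v.y), 2*(u.z*v.x - u.x*v.z), 2*(u.x*v.y - u.y*v.x)};
        return { v.x + w*t.x + (u.y*t.z - u.z*t.y),
                 v.y + w*t.y + (u.z*t.x - u.x*t.z),
                 v.z + w*t.z + (u.x*t.y - u.y*t.x) };
    }
};

struct Pose { Vec3 position; Quat rotation; };

// Map the controller's tracked pose into the virtual camera's world pose.
// 'rigOffset' is a calibration transform measured once, accounting for where
// the tracking origin sits relative to the virtual scene.
Pose ControllerToCamera(const Pose& controller, const Pose& rigOffset) {
    Vec3 p = rigOffset.rotation.rotate(controller.position);
    return { { rigOffset.position.x + p.x,
               rigOffset.position.y + p.y,
               rigOffset.position.z + p.z },
             rigOffset.rotation * controller.rotation };
}

int main() {
    // Hypothetical sample: tracking origin 2 m behind the scene origin, no rotation.
    Pose rigOffset{{0.0, -2.0, 0.0}, {1.0, 0.0, 0.0, 0.0}};
    Pose controller{{0.25, 0.10, 1.60}, {1.0, 0.0, 0.0, 0.0}};

    Pose cam = ControllerToCamera(controller, rigOffset);
    std::printf("camera position: (%.2f, %.2f, %.2f)\n",
                cam.position.x, cam.position.y, cam.position.z);
    return 0;
}
```

Running this update every frame is what keeps the on-screen image locked to the physical camera's movement.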
The wall was installed in March 2021, bringing Virtual Production to RIT and Western New York. In collaboration with Tim Stringer and Emily Halderman from Optic Sky Productions, I worked to connect the game engine Unreal Engine (Unreal) to the LED wall and to track the camera. Because Unreal knows the physical camera's position at every moment, it can re-render the background from that camera's perspective, creating convincing depth, accurate lighting, and parallax on the flat wall.
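The standard rendering technique behind this illusion is an off-axis (asymmetric) projection frustum, recomputed each frame from the tracked camera position and the physical corners of the screen. The sketch below is a minimal standalone C++ illustration of that math, following Robert Kooima's well-known "Generalized Perspective Projection" formulation; it is not Unreal's or nDisplay's actual code, and the wall dimensions and camera position are hypothetical.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3 operator-(const Vec3& a, const Vec3& b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
double dot(const Vec3& a, const Vec3& b)     { return a.x*b.x + a.y*b.y + a.z*b.z; }
Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(dot(v, v));
    return {v.x/len, v.y/len, v.z/len};
}

// Off-axis frustum bounds at the near plane, after Kooima's "Generalized
// Perspective Projection": pa/pb/pc are the wall's lower-left, lower-right,
// and upper-left corners; pe is the tracked camera (eye) position. All
// positions share one world coordinate frame, in meters.
void OffAxisFrustum(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe, double nearZ,
                    double& left, double& right, double& bottom, double& top) {
    Vec3 vr = normalize(pb - pa);        // wall right axis
    Vec3 vu = normalize(pc - pa);        // wall up axis
    Vec3 vn = normalize(cross(vr, vu));  // wall normal, toward the camera

    Vec3 va = pa - pe, vb = pb - pe, vc = pc - pe;  // corners relative to eye
    double d = -dot(va, vn);             // distance from eye to wall plane

    // Scale the corner offsets onto the near plane.
    left   = dot(vr, va) * nearZ / d;
    right  = dot(vr, vb) * nearZ / d;
    bottom = dot(vu, va) * nearZ / d;
    top    = dot(vu, vc) * nearZ / d;
}

int main() {
    // Hypothetical 6 m x 3 m wall in the XZ plane, camera 4 m back
    // and 1 m to the right of center.
    Vec3 pa{-3.0, 0.0, 0.0}, pb{3.0, 0.0, 0.0}, pc{-3.0, 0.0, 3.0};
    Vec3 pe{1.0, -4.0, 1.5};

    double l, r, b, t;
    OffAxisFrustum(pa, pb, pc, pe, 0.1, l, r, b, t);
    std::printf("near-plane bounds: l=%.3f r=%.3f b=%.3f t=%.3f\n", l, r, b, t);
    return 0;
}
```

Because the left and right bounds come out asymmetric whenever the camera sits off-center, the image on the wall shifts exactly as the view through a real window would, which is what reads on camera as depth and parallax.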
As often happens with technology, Unreal was updated (4.26 to 4.27) with significant changes, which meant we had to completely reconfigure our software pipeline the next semester. I took the lead and worked countless hours until I had the entire process working again. I kept our team updated on each bit of progress and taught them the revised process.
At a Glance:
Member of the VP Team and of the Epic MegaGrant writing team
Researched and implemented the workflow for tracking the camera
Researched and implemented the workflow for projecting Unreal to the LED wall
Cleaned the geometry of a virtual electric bike for texturing
Educated the VP Team on progress and new processes
Key contributor to the 2021 white paper documenting our first VP shoot from start to finish, with analysis of lessons learned and best practices for the workflows associated with the technologies used

(VIDEO 1) First success using Unreal nDisplay on one monitor

(VIDEO 2) First success using dual monitors
