Magnopus Shows How New Technology Streamlines The Filmmaking Process

In 1893, Thomas Edison and his workers built what’s considered the first film studio, the Black Maria. It housed the latest technology, the Kinetoscope, which laid the groundwork for cinematic projection. Two years later, in 1895, French engineers Auguste and Louis Lumière patented the first movie-projection devices, letting entire audiences watch a film at the same time.

More than 120 years later, the film industry is technology-driven, from visual effects and green screens to virtual production and robotic cameras.

Magnopus is known for its visual effects and production technology for the film industry, including augmented reality (AR), virtual reality (VR) and virtual production, with credits on The Mandalorian, Disney’s Remembering, Coco and The Lion King.

Magnopus co-founders Ben Grossmann and Alex Henning won the Oscar for Best Achievement in Visual Effects for Martin Scorsese’s Hugo in 2012. The content-focused tech studio also earned a spot on Time’s Best Inventions of 2022 for Connected Spaces, its cross-reality, cross-device platform that brings together physical and digital worlds.

Sol Rogers, Global Director of Innovation at Magnopus, says Neural Radiance Field (NeRF) technology is a new approach to capturing a 3D scene and rendering it in real-time computer graphics.

New technology

NeRF technology uses deep learning algorithms to create a 3D representation of a scene from a set of images, or even a simple video, captured by a camera. That 3D representation can then render highly realistic images from any angle or position, which can be projected onto an LED stage.
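At its core, a trained NeRF is a neural network that maps a 3D position (and viewing direction) to a color and a density, and a pixel is rendered by compositing samples along a camera ray. The sketch below illustrates that volume-rendering step with NumPy; the `field` function is a toy stand-in for a trained network (here, a solid orange sphere), and all names and values are illustrative, not Magnopus's or any production implementation.

```python
import numpy as np

def field(points):
    """Toy stand-in for a trained NeRF network: returns (rgb, density) per point.

    Density is high inside a unit sphere at the origin, zero elsewhere.
    """
    dist = np.linalg.norm(points, axis=-1)
    density = np.where(dist < 1.0, 5.0, 0.0)
    rgb = np.broadcast_to([1.0, 0.5, 0.2], points.shape).copy()  # constant orange
    return rgb, density

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=64):
    """Composite samples along one camera ray (classic NeRF-style quadrature)."""
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction            # sample points along the ray
    rgb, sigma = field(points)
    delta = np.diff(t, append=t[-1] + (t[1] - t[0]))    # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)                # opacity of each segment
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # transmittance
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)         # composited pixel color

# A ray aimed straight at the sphere composites to (nearly) the surface color.
color = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
print(color)
```

A full NeRF repeats this for every pixel, with the network's weights fitted so that rendered rays reproduce the captured photographs from their known camera positions.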

“LED stages for use in virtual production have become arguably the fastest-growing area of visual effects and production technology,” said Rogers. “Magnopus has been involved with LED stages from the very start, and we continue to advise and consult on multiple fronts when building out stages for the major film studios.”

“NeRF technology has the potential to reduce costs and streamline the filmmaking process,” said Rogers in an email interview.

For example, the company says the technology could eliminate expensive on-location shoots because highly realistic virtual environments can be created and displayed on the LED stage.

“Additionally, the highly realistic images produced by NeRF technology may require less manual tweaking and editing in post-production, which could save time and money,” said Rogers. “But it’s just one component of the overall filmmaking process and may not have a significant impact on a film’s budget by itself.”

“NeRF technology also enables studios to create digital or virtual backlots, which is a cost-effective and time-saving alternative to building physical sets,” said Rogers. “This is especially true for large-scale productions that require multiple locations or entirely new environments that may not be possible to build in the physical world.”

Rogers says NeRF technology is a game-changer for filmmakers. “With its ability to render highly realistic 3D scenes in real-time, it has the potential to revolutionize the way virtual production and post-production processes are approached,” said Rogers.

“Imagine being able to capture and store an infinite number of virtual environments in a digital back lot, accessible at a moment’s notice, that look and feel as real as a physical set, but with complete creative control and flexibility,” added Rogers.

“That’s the power of NeRF technology. Filmmakers can easily modify and adapt the virtual environment to meet their needs without worrying about the limitations of physical sets or the costs and challenges of on-location shoots,” said Rogers.

Rogers says many new technologies are emerging that are blurring the lines between the physical and digital worlds, including AI, IoT, robotics, machine learning, big data, 5G, and extended reality, which are all part of this new filmmaking era.

“The sheer magnitude of the metaverse requires that AI be considered an integral part of its development,” said Rogers. “The metaverse promises to supplant almost every aspect of the human-computer interface and technology-enhanced human interaction.”

“AI will offer support in not only constructing the metaverse but also operating it,” said Rogers.

According to Rogers, the metaverse is, for filmmaking, simply the next internet. “It will be accessed through VR headsets, AR, mobile phones and computers or laptops,” said Rogers. “It’s very different from the current web (and even web3) – it’s spatial.”

“Up until now, we’ve interacted with computers via typing or touch and a screen, and the user interface (UI) for spatial computing will be completely different,” said Rogers. “Think eye-controlled interactions, body or hand gestures, and voice controls. Hardware will be invisible.”

“The concept of fixed computers and staring at a flat screen will be looked back on by our descendants as absurd,” said Rogers.

“We already think and speak the language of 3D; we spend our lives moving through 3D spaces and interacting with 3D objects,” said Rogers. “Now that technology can enable this new era of computing, the transition away from 2D is happening.”

New ways to tell stories with technology

Rogers says that this shift means a new form of storytelling for films.

“Rather than just passively watching a film, audiences can participate in the world of the film and interact with characters and objects in real-time,” said Rogers. “That makes it something they’ll remember on a deeper level than just something they watched.”

VR technologies continue to grow industry-wide. Statista forecasts the VR market will expand from $11.97 billion in 2022 to more than $24 billion by 2026. Museums, theme parks, concerts, and artists such as Coldplay, Björk and U2 have incorporated AR/VR into their performances.

Rogers says that in the past, big studios have used AR and VR experiences to help market films and TV shows.

“But now we’re seeing immersive experiences that enhance the viewer experience and extend the experience beyond the screen,” said Rogers. “For example, we created an AR companion for the short film Remembering starring Brie Larson; it connects directly in sync with the content on Disney+ to bring the ‘World of Imagination’ featured in the film into the viewer’s living room.”

Rogers says it provides an early look at the potential of AR experiences to enhance movie storytelling.

Rogers says Magnopus’s use of AR/VR depends on what needs to be done, but most of their work occurs at the front end when they help directors work out how to capture their movies. “We advise on what tech to use and employ the right technology to make their vision come to life,” said Rogers.

“We may also be engaged for a particular part of a movie or TV show. For example, we built a virtual environment for use in an LED volume while shooting parts of Westworld Season 4 Episodes 7 and 8,” said Rogers. “We worked closely with the Westworld Art Department to build a Times Square environment set 30 years in the future.”

Magnopus also designed a virtual production system for the remake of The Lion King. “It is a multiplayer real-time collaborative platform that uses VR to put filmmakers inside their computer-generated films,” said Rogers. “By connecting interfaces to their traditional live-action cinematography equipment, we could give them and the crew an experience similar to live-action filmmaking.”

Artificial intelligence

“ChatGPT and Midjourney have demonstrated the capability of generative AI to revolutionize the pace at which creative images and text resources can be generated,” said Rogers. “We’re starting to see this capability transforming short-form videos too. For example, Gen-1 and Gen-2 by Runway allow you to synthesize new videos by applying the composition and style of an image or text prompt to the structure of your source video.”

Rogers believes the building blocks for longer-form content are also there. “It’s only a matter of time before audiences will be enjoying a new form of entertainment,” adds Rogers.

“These emerging AI technologies will create a new format of 100% ‘Synthetic Media,’” said Rogers. “Content will be created completely by the AI, from initial inception – based on your likes and dislikes – through to its creation, production and delivery to you in whatever format suits you best.”

“Instead of Netflix creating one movie for 100 million fans, it will automatically create 100 million synthetic movies for individual fans,” said Rogers.

Rogers says that AI is a new set of tools that will act as powerful force multipliers in the hands of creative individuals, developers and artists.

“Content creators that leverage these tools will create better content more rapidly than they could without them,” said Rogers. “Those that do not adopt these tools will be left behind. These tools won’t supplant jobs; they will make those that adopt them better at what they do.”

“The film industry is increasingly using AI in various aspects of the production process,” said Rogers. “One example is scriptwriting, where AI algorithms such as ScriptBook can analyze existing scripts and generate new ones based on patterns and trends in the data.”

Rogers says that casting directors can also use AI to analyze an actor’s previous performances and characteristics to predict how well they will fit a particular role, which makes casting decisions more informed.

“In post-production, AI systems like NVIDIA’s GauGAN can generate realistic landscapes and textures for films,” adds Rogers.

Rogers says that emerging Deepfake technology, which uses deep learning to create a realistic replica of someone by mapping their face and body features onto another person, will also impact filmmaking.

“Imagine if your favorite series never had to end or your idol never died,” said Rogers. “This is a real possibility because Deepfake AI programs could be used to create an ultra-realistic replica of celebrities, and since Deepfake is not only limited to videos, realistic audio of the actor’s voice can also be cloned.”

“This is already happening in comedy shows like Deep Fake Neighbour Wars, which premiered on ITVX in January 2023,” said Rogers. “This is a big deal because it shows that Deepfake technology is becoming more accepted and popular in mainstream culture.”
