Oscar-winning ‘Dune: Part Two’ VFX Supervisor Paul Lambert turned to Foundry’s CopyCat machine learning toolset within Nuke to complete 40% of the 1,000 Fremen eye shots with no additional touch-ups, saving thousands of hours of roto work.
Since the 1920s, when silent film featured artificial intelligence (AI) on the big screen, the technology has driven narratives across the entertainment industry. Nearly a century later, AI - more specifically, generative AI - has moved from sci-fi concept to reality, with many AI assistants integrated into devices we use daily.
While AI is actively enhancing our personal lives and many industries, its arrival hasn’t come without skepticism, especially in the creative realm. Today, AI can generate a semi-passable script or mostly convincing visuals, but when it comes to storytelling, it lacks the emotional capacity and imagination to tell great stories. However, non-generative machine learning (ML) can deliver significant efficiencies without raising ethics and copyright concerns.
While generative AI produces new content influenced by training data scraped from the internet, machine learning models are often designed to recognize and replicate existing patterns using much smaller datasets. Visual effects (VFX) processes can be laborious, but with the right ML tools and training, artists can reduce time-consuming repetitive tasks and instead focus on refining standout elements of their work. Artists can apply their unique skills and knowledge to help train the ML system, rather than rely on generic pre-trained tools, to get a head start on more routine tasks.
Their creativity and insight remain essential for achieving the best shot outcome, but smart use of ML can help free up time for iteration and exploration.

An eye on ML

In the fictional Dune franchise, it’s hard not to notice the piercing blue eyes of the Fremen people who inhabit the planet Arrakis. What most people don’t know is that depicting these eyes on screen for the 2021 feature film required VFX artists to manually rotoscope each actor’s eyes, including different matte layers for the pupil, iris, sclera, and reflections.
It was a painstaking process further complicated by clothing occlusions and motion blur. That attention to detail helped win Best VFX at the 2022 BAFTAs and Academy Awards.

When preparing for Dune: Part Two, which features almost four times as many Fremen, including many in large crowd scenes, VFX Supervisor Paul Lambert turned to Foundry, which had recently introduced CopyCat, a machine learning toolset within Nuke, the industry-standard compositing software. While many machine learning tools are trained using data from undisclosed sources, Nuke’s CopyCat is trained on artists’ own material, resulting in more reliable output with no issues around data provenance.
When using Nuke’s CopyCat, artists enter a small number of frames they’ve crafted themselves, run the training process, and leave CopyCat to learn from their own source material. This results in a machine learning model with predictable and targeted characteristics that can be deployed across many different shots, saving artists time on laborious tasks. Artists can further refine their CopyCat model with more of their own data, for example to adapt it to the specific look of different sequences or shows.
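For pipeline developers who script Nuke in Python, the shape of that workflow can be sketched roughly as below. This is an illustrative outline only, not Foundry’s documented setup: the node class name, input order, file paths, and the knob name used here are assumptions that should be verified against the CopyCat node in your own Nuke version.

    # Illustrative sketch only -- node class, input order, and knob names are
    # assumptions to be checked against the CopyCat node in your Nuke version.
    import nuke

    # Read the handful of frames the artist has prepared: the original plates and
    # the hand-crafted target output (for example, painted eye mattes for the same frames).
    plates = nuke.nodes.Read(file="/shots/seq010/plates/plate.####.exr", first=1, last=12)
    targets = nuke.nodes.Read(file="/shots/seq010/roto/eye_matte.####.exr", first=1, last=12)

    # Create the CopyCat node and connect the source frames and the artist-made
    # ground truth it should learn to reproduce (input order assumed here).
    copycat = nuke.createNode("CopyCat")
    copycat.setInput(0, plates)
    copycat.setInput(1, targets)

    # Point training output (checkpoints, progress images) at a writable location.
    copycat.knob("dataDirectory").setValue("/shots/seq010/copycat_training")  # knob name assumed

In interactive use the same graph is usually assembled by hand in the node graph; the script form simply makes the “small set of artist frames in, trained model out” structure of the workflow explicit.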
Lambert trained his own model for eye replacements using data from Dune and updates to the CopyCat toolset in Foundry’s Nuke compositing software. In a test of CopyCat’s scalability, the production team took 280 existing Fremen shots and their corresponding mattes, then cropped and augmented the images to create a dataset comprising 30,000 eyes. CopyCat had never previously been deployed at this scale.
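The crop-and-augment step is conceptually straightforward. The sketch below shows, in plain Python with Pillow, one way a small set of eye crops could be multiplied using mirroring and crop jitter; the paths, factors, and helper function are hypothetical rather than the production pipeline used on the film, and in a real matte-training setup each plate and its matte would need to receive identical transforms so the pairs stay aligned.

    # Hypothetical illustration: expand a small set of eye crops with simple augmentations.
    # In a real pairing workflow the plate and its matte must get identical transforms.
    import random
    from pathlib import Path
    from PIL import Image

    SRC = Path("dataset/eye_crops")   # cropped source images (illustrative path)
    DST = Path("dataset/augmented")
    DST.mkdir(parents=True, exist_ok=True)

    def jitter_crop(img, scale=0.9):
        """Re-crop to `scale` of the original size at a random offset, then resize back."""
        w, h = img.size
        cw, ch = int(w * scale), int(h * scale)
        x, y = random.randint(0, w - cw), random.randint(0, h - ch)
        return img.crop((x, y, x + cw, y + ch)).resize((w, h))

    for path in sorted(SRC.glob("*.png")):
        img = Image.open(path)
        for i in range(8):                         # several variants per source crop
            aug = jitter_crop(img)
            if random.random() < 0.5:              # mirror roughly half of the variants
                aug = aug.transpose(Image.FLIP_LEFT_RIGHT)
            aug.save(DST / f"{path.stem}_aug{i}.png")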
After training the model, they applied it to shots for Dune: Part Two. Any shots that required manual correction were fed back into the dataset to improve subsequent outputs. Forty percent of the 1,000 eye shots completed for the film required no additional touch-ups, saving artists from thousands of hours of roto work.
What’s more, the production team used CopyCat to edit out an actor’s tattoos in the black-and-white Giedi Prime scenes. These scenes were shot with traditional and infrared cameras to achieve a very distinct look, but a screen test revealed the tattoos were still visible – despite being covered with makeup – when viewed through the infrared camera. After training a model, the team digitally removed the tattoos without manual 3D object tracking, saving considerable time and effort.
Repeating the first film’s achievement, Dune: Part Two earned the Best VFX BAFTA Award and the Best VFX Academy Award at the 2025 ceremonies.

Simplifying workflows, securely

VFX artists are often challenged to make a small change throughout a chunk of footage that, while important, is incredibly tedious to execute. This could be something like removing a tattoo, mustache, or bruise from an actor’s face, or applying digital makeup.
Artists who use their own data to train models in CopyCat can use ML to reduce the work on these types of tasks, without worrying about copyright infringements or having their data end up in a larger training pool. The first step involves feeding the secure ML framework before and after images. These train the neural network to generate the desired output. As with any ML application, the better the training set, the better the result.
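Under the hood this is ordinary supervised image-to-image learning: the network is repeatedly nudged so that its output for a “before” frame moves closer to the artist’s “after” frame. The PyTorch sketch below is a generic illustration of that idea only, not CopyCat’s internals; the tiny network, loss choice, and the randomly generated stand-in pairs are all assumptions for demonstration.

    # Generic before/after supervised training sketch -- not CopyCat's internals.
    import torch
    import torch.nn as nn

    # Tiny stand-in network mapping a "before" RGB frame to an "after" RGB frame.
    model = nn.Sequential(
        nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 3, 3, padding=1),
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.L1Loss()

    # In practice these pairs would come from the artist's own frames (a plate and the
    # hand-finished version of the same plate); random tensors stand in for them here.
    pairs = [(torch.rand(1, 3, 256, 256), torch.rand(1, 3, 256, 256)) for _ in range(8)]

    for step in range(1000):
        before, after = pairs[step % len(pairs)]
        prediction = model(before)
        loss = loss_fn(prediction, after)   # distance from the artist-made result
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()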
Implemented strategically, ML has the potential to drastically accelerate the more cumbersome aspects of VFX workflows, and not just for big-budget projects. Some films, like those in the superhero or fantasy genres, clearly have an assist from VFX. No matter how exceptional the digital artistry is, the audience knows the physical limitations of humans, even if they suspend their disbelief.
Many period dramas also use VFX, but for things like transforming modern locations instead of creating visual spectacle. Digital artistry was essential for depicting post-war New York City in an Italian period drama released in late 2024. VFX Supervisor Victor Perez once again leaned on ML, this time to replace the floor drains in a key storytelling sequence.
Achieving the swap manually would have taken three artists about a month, requiring significant time and resources that could otherwise be allocated to more creative tasks like recreating the 1949 New York skyline. By using CopyCat, Perez and his team were able to preserve the shot and align it with the director’s vision, while keeping the production budget and timeline on track.

Future considerations

AI is a new reality, but its impact on the creative industry remains to be seen.
What’s become increasingly clear though is that ML is already benefitting the VFX industry in new and surprising ways. AI isn’t and shouldn’t ever be a replacement for human creativity, but its value in enhancing the creative process is certainly something worth exploring. Ultimately, the underlying principle of ML is artist empowerment.
They can use their own skills, knowledge, and existing work for training, as opposed to reliance on tools pre-trained with broad, generic datasets built to cater for all use-cases. It’s this approach that CopyCat was built on, and which gives artists the high-quality outputs they need. Undoubtedly, ML capabilities will become increasingly common in modern digital content creation toolsets, and artists who embrace them will stay ahead of the curve.
Adam Cherbetji is Principal Product Manager - Machine Learning, Foundry.