The Academy might require future Oscar submissions to disclose their use of artificial intelligence tools, IndieWire has learned — but what does that mean when AI is, well, everywhere?
Variety reported that eight-time Oscar nominee “A Complete Unknown” used AI in its post-production process. The disclosure came, oddly enough, via Australia-based Rising Sun Pictures’ submission of its work on “Furiosa: A Mad Max Saga” for the Emerging Technology Award at the 2025 Visual Effects Society Awards. When the company cited Revize, its proprietary machine-learning character toolset, in its application, it added that Revize was also used on James Mangold’s Bob Dylan biopic.
Also nominated for the VES Emerging Technology Award is “Dune: Part Two,” which used Nuke’s machine-learning tool CopyCat to automate adding a blue tint to the eyes of all the actors playing Fremen, the native people of Arrakis.
Together with “Emilia Pérez” and “The Brutalist,” that brings the number of current Best Picture Oscar nominees with confirmed use of AI tools to four. At the moment, disclosing AI use on an Oscar submission is optional.
Sources tell IndieWire that, as the Academy prepares its annual review of its rules this spring, the current AI scandals are likely to push the issue of mandatory AI disclosure to the forefront. However, trying to draw a line around AI may be meaningless when the term itself is so all-encompassing.
Would you consider the selfie you took this morning to be made with AI? If it was shot on an iPhone from the last three years, it’s likely more AI-generated than any movie with an Oscar nomination. AI is built into the Neural Engine on Apple’s chips, allowing amateur photos to be more pleasantly and evenly exposed.
Does machine learning, which brings efficiency to a repetitive, time-consuming process, count as AI? In the “Dune” franchise, spice makes characters’ eyes turn blue. Should the VFX team that worked on the sequel not use software that could better identify and encircle the eyes, making it far easier to convert their color?
What about the sound software used on “Avatar: The Way of Water”? It used AI to differentiate sounds (a character’s voice, wind, footsteps, etc.), saving James Cameron’s sound team weeks of work eliminating unwanted water noise from the production track.

Revize, meanwhile, was first used on Baz Luhrmann’s 2022 “Elvis,” another Oscar-nominated music biopic, to place star Austin Butler into old footage.
As with our iPhones, AI is incorporated into sophisticated software that movie productions use daily. The Academy can’t ask thousands of crew members to vet and approve every program and plug-in they use.
Respeecher, the voice-cloning tool used on “Emilia Pérez” and “The Brutalist,” is AI, but it’s not far afield from the common practice of Automated Dialogue Replacement (ADR), which improves or clarifies dialogue in post-production.
Creatives are more uncomfortable with generative AI: tools trained on data sets of existing works that then produce new material. “The Brutalist” director Brady Corbet denied that an architecture consultant working with production designer Judy Becker used MidJourney to draw buildings that inspired the ones in the film’s final sequence. “Judy Becker and her team did not use AI to create or render any of the buildings,” Corbet said in a statement.
Yes, and it’s also true that “The Brutalist” used GenAI to create two concept images made to look like 1980s digital renderings, which were then redrawn by hand. If GenAI can’t be used in the concept phase, must every department head’s mood board hew to 100-percent organic, non-AI material?