DGA members have until June 23 to vote on the tentative deal with the studios approved earlier this month. Ratifying the deal is no slam-dunk, and one line on AI could bring the whole thing down.
“Employers may not use (Generative AI) in connection with creative elements without consulting the director or other employees covered by the DGA,” the provisional agreement states. The word “consulting” hangs in the air.
“The Matrix” director Lilly Wachowski said the DGA-AMPTP legalese “reeks of ambiguity.” “Law & Order: SVU” showrunner Warren Leight said on Twitter that he’s been around long enough to know that taking the studios’ word on that whole “consultation” thing doesn’t mean much.
The problem is that there is no single, consistent definition of what such a “consultation” actually means. So we asked the only people who could make sense of something like this: entertainment lawyers.
“It’s more than a notification, but much less than an endorsement,” Simon Pulman, a partner and co-chair of the media and entertainment group at Pryor Cashman, told IndieWire. “The studio or producer should tell the director about their plan to use generative AI and should give them an opportunity to express their thoughts and feelings about it. But there is no requirement to actually act on that feedback.”
So the studio or producer can’t just text or email the director about the planned use of AI and call it a day. They must engage in good faith. But that doesn’t mean the director has the final say — or veto power — on the matter.
It is understandable that some directors, writers, and especially director-writers want more. Ivy Kagan Bierman, a partner at Loeb & Loeb and chair of its Entertainment Labor Group, sees “consultation” as meaningfully stronger than mere notification, a distinction she called “very significant.”
“If the DGA deal stipulated that companies simply had to notify directors that they would be using AI in relation to creative elements, that would be a much bigger problem, because directors would have no involvement in that decision-making,” she told IndieWire. “The purpose of that consultation is to be meaningful: to consider the director’s point of view, the director’s notes and comments, and then make decisions based in part on that consultation with the director.”
Kagan Bierman, Pulman, and a DGA-WGA member who spoke to IndieWire on condition of anonymity all agree on one thing: Thank goodness the agreement defines AI as not being a person.
“I was really pleasantly relieved to see them willing to say on the record that AI is not a person and that people are needed for these jobs,” the DGA-WGA insider told IndieWire. “I know that sounds like a really stupid thing. But that was a real existential dread everyone felt: that there would be no accountability to the writer, no obligation to engage them on such matters.”
Pulman may not be as optimistic as our DGA-WGA member, but he offered another silver lining that hinges on Hollywood being a relationship business: if a studio wants to keep working with an A-list director, it had better not finish that director’s movie with Midjourney. The bad news? Not all directors have that status.
Even worse? Even some AI advocates think the language of the DGA agreement is soft. Edward Saatchi of The CultureDAO, which represents a collective of filmmakers telling stories with artificial intelligence (without the aid of studios), believes directors deserve more autonomy than the pending agreement grants them.
“This is the bare minimum, and we can all go much further with a type of technology that is very disturbing to people,” Saatchi told IndieWire. “Hopefully it’s a director-driven process: rather than the producer checking in with the director, the director is empowered to figure out how to use generative AI, as those making entirely AI-based films already do. It’s probably too weak, and artists should have more control.”
For its part, here’s what a DGA representative had to say: the AI provision builds on previously codified language around consultation on creative elements such as the director’s cut, laid out in Section 7-202 of the 2020 DGA Basic Agreement, which requires employers to consider the director’s advice and suggestions in good faith. The spokesperson declined to comment further.
We’ll find out in a week’s time whether the language was good enough for a majority of filmmakers. Regardless, the writers see reason to keep fighting: AI threatens to impact writing credits more than it threatens to hurt directors. The WGA argues that its fight is not the same as the DGA’s, or SAG-AFTRA’s, for that matter. Pulman says he gets it, though he believes the DGA’s language will still be “useful” for the WGA to draw on at the negotiating table.