The long game between writers and AI

With help from Derek Robertson

If you’re watching the quick march of generative AI across the landscape, and wondering what might slow it down, the Writers Guild of America would like a minute of your time.

The Hollywood writers waging a high-profile strike right now have a notable clause among their contract demands: “AI can’t write or rewrite literary material; can’t be used as source material,” and writing covered by the contract “can’t be used to train AI.”

The WGA clause has already been grabbing attention as one of the first mentions of generative AI in a union negotiation — and a clear expression of the white-collar anxiety about this wave of automation.

I sat down with POLITICO labor reporter Nick Niedzwiadek, who recently wrote about AI’s growing footprint in labor negotiations, to figure out where generative AI fits into the boiling labor tensions in the entertainment industry — and the power of the union to hit the brakes on industry-wide adoption of this new technology.

To be clear, AI isn’t the reason the writers went on strike, said Nick, “but it underscores their bigger picture concerns, which is that on multiple fronts, they’re kind of being squeezed by studios.”

In fact, one of the most significant reasons for the strike is a previous technological revolution that upended the way writers make money: the shift from releasing content on network and cable TV and in movie theaters to releasing it on streaming services.

It used to be that writers could rely on residual payments from older projects to bolster their livelihood, as network TV re-ran popular shows. The rise of internet-based streaming services like Netflix fundamentally changed the distribution formula that determined how much a writer would make from a project. “So you’re no longer getting that longer term payout from the work you did,” said Nick. And a creative professional who could once rely on a growing body of work to pay the bills now needed to find new jobs constantly just to stay afloat.

So when the WGA comes to the bargaining table to discuss AI, it’s with the knowledge that the members it represents have seen studios profit off of their intellectual property without seeing the benefits themselves. “You can read the AI thing in that way — basically, ‘we want rules, because we don’t trust that if we don’t have something on paper that’s enforceable, that you’re not going to use it against us,’” Nick said.

So are chatbots really a threat to writers? Definitely. ChatGPT and its ilk might have trouble with facts and reliability, but they were trained on millions of text sources from the internet (without permission from content creators) and can be fine-tuned for specific use cases like screenwriting. So while they might not be able to produce a perfectly polished script, they do excel at spitting out plausible premises for new shows modeled on older, successful projects — which is a big part of how creative writers earn their living.

“Right now, there’s people who have to come up with those episode prompts or premises. And if you’re just using a chatbot, then that’s fewer writing jobs,” Nick explained. The union’s opening demand: zero chatbot content in scripts at all — unionized writers say they don’t want to be underpaid to clean up AI content.

In its most public statement to date, the Alliance of Motion Picture and Television Producers said AI is not eligible for writing credits, but stopped short of saying it would avoid the technology. Those credits determine the payout writers get from a project. And while AI not getting writing credits could be yet another way to cut the cost of hiring an actual writer, to our knowledge, no studio has attempted to use AI to replace a writer — which makes sense, given how much of a hot-button issue AI use currently is. (Then there’s the added complication for studios that AI-written material currently cannot be copyrighted.)

So if this is an attempt to get ahead of technology for once, what are the chances that the WGA could set rules of the road for how the entertainment industry uses AI going forward? Well, unlike its 2007 strike, this time the WGA has “more buy-in from other parts of the industry,” Nick said. “More parts of Hollywood — even if their issues are slightly different — they’re still kind of all in the same boat, feeling this squeeze from the studios and companies.”

Laura Blum-Smith, WGA’s director of research and public policy, said the union is OK with writers using AI as a “research tool,” but does not want “an executive having AI generate some material and then a writer is brought in and paid a small amount to polish it up.”

So now what?

Unions hold a lot of power in Hollywood. SAG-AFTRA, which represents actors across the industry, is also focused on the AI issue: it recently released a statement on using AI to simulate actors’ voices, likenesses and performances (it said any use of digital simulations must be bargained with the union). “We don’t have a problem with AI technology being used as augmentation,” said Duncan Crabtree-Ireland, SAG-AFTRA’s national executive director and chief negotiator. But he agrees with the WGA that there should be rules — like “informed consent and proper compensation to performers.”

And watch for any mention of AI as the Directors Guild of America begins its next contract negotiations today.

crypto on the hill

Amid House Republicans’ big push to move legislation on crypto, POLITICO’s Morning Money newsletter caught up with the congressman leading that effort.

Rep. French Hill (R-Ark.) told POLITICO’s Zachary Warmbrodt that the Securities and Exchange Commission’s approach to crypto has been “too sweeping,” and that the new bill will define which crypto products should be classified as securities, commodities, or something else, a long-running debate in the world of crypto regulation.

Hill said a bill could appear in “a few weeks,” but it’s still unclear whether it’ll carry bipartisan support. POLITICO’s Eleanor Mueller notes that Republicans on the House Financial Services Committee are still building consensus among themselves. (A long-awaited bill regulating stablecoins could also be on the table, but buy-in from the Treasury Department and the Federal Reserve is far from a sure thing.) — Derek Robertson

an 'is it real?' project gathers steam

The Content Authenticity Initiative, the Adobe-led coalition of companies working on online information authentication — in other words, how can you be sure you’re looking at “real” and not AI-generated content? — has some interesting new members, including Stability AI and Universal Music Group.

The organizations have agreed to give users the option to add content credentials based on a certification standard called C2PA, which encode AI-generated media with metadata, like which model was used to create an image or a video. Stability AI is now testing these credentials for users of its API, calling them a stepping stone toward greater transparency around AI use. — Mohar Chatterjee
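For a rough sense of what that metadata can carry, here is a minimal sketch of a simplified, C2PA-style provenance record. The field names and values below (claim_generator, ai_generative_info, the model name) are illustrative stand-ins, not the official C2PA manifest schema, and a real credential would be cryptographically signed and bound to the media file itself.

```python
import json
from datetime import datetime, timezone

# Illustrative only: a simplified provenance record for an AI-generated image.
# The keys are stand-ins for the kinds of fields a C2PA-style credential records,
# not the actual C2PA manifest format.
provenance_record = {
    "title": "generated-image.png",
    "claim_generator": "ExampleApp/1.0",        # software that produced the claim (hypothetical)
    "assertions": [
        {
            "label": "ai_generative_info",      # hypothetical assertion label
            "data": {
                "model": "example-diffusion-model-v1",  # which model made the image
                "prompt_disclosed": False,
                "created": datetime.now(timezone.utc).isoformat(),
            },
        }
    ],
}

# Print the record as readable JSON, the way provenance metadata is typically inspected.
print(json.dumps(provenance_record, indent=2))
```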


Stay in touch with the whole team: Ben Schreckinger ([email protected]); Derek Robertson ([email protected]); Mohar Chatterjee ([email protected]); Steve Heuser ([email protected]); and Benton Ives ([email protected]). Follow us @DigitalFuture on Twitter.

If you’ve had this newsletter forwarded to you, you can sign up and read our mission statement at the links provided.