I’ve been in the entertainment industry since I was nine. I joined the Screen Actors Guild (SAG) in 1977, when I was 11; the Writers Guild of America (WGA) when I was 22; and the Directors Guild of America (DGA) the following year. I got my start as a child actor on Broadway, studied film at NYU, then went on to act in movies like The Lost Boys and the Bill & Ted franchise while writing and directing my own narrative work. I’ve lived through several labor crises and strikes, but none like our current work shutdown, which began last spring when all three unions’ contracts were simultaneously due for renegotiation and the Alliance of Motion Picture and Television Producers (AMPTP) refused their terms.
The unifying stress point for labor is the devaluing of the worker, which reached a boiling point with the rapid advancement of highly sophisticated and ubiquitous machine-learning tools. Actors have been replaced by AI replications of their likenesses, or their voices have been stolen outright. Writers have seen their work plagiarized by ChatGPT, directors’ styles have been scraped and replicated by Midjourney, and crew members in every department are ripe for exploitation by studios and Big Tech. All of this laid the groundwork for AI to become a major flashpoint in this year’s strikes. Last summer, the DGA reached an agreement with the AMPTP, and on Tuesday the WGA struck its own important deal. Both include terms the unions hope will meaningfully protect their labor from being exploited by machine-learning technology. But these deals, while a determined start, seem unlikely to offer expansive enough protections for artists given how much studios have already invested in this technology.
The DGA’s contract insists that AI is not a person and can’t replace duties performed by members. The WGA’s language, while more detailed, is fundamentally similar, stating that “AI can’t write or rewrite literary material, and AI-generated material will not be considered source material” and demanding that studios “must disclose to the writer if any materials given to the writer have been generated by AI or incorporate AI-generated material.” Their contract also adds that the union “reserves the right to assert that exploitation of writers’ material to train AI is prohibited.”
But studios are already busy developing myriad uses for machine-learning tools, both creative and administrative. Will they halt that development when they know their own copyrighted product is in jeopardy from machine-learning tools they don’t control, and that the Big Tech monopolies, any of which could eat the film and TV industry whole, will not halt their own AI development? Can the government get Big Tech to rein it in when those companies know that China and other global players will continue advancing these technologies? All of which leads to the question of proof.
It’s hard to imagine that the studios will tell artists the truth when asked to dismantle their AI initiatives, and attribution is all but impossible to prove with machine-learning outputs. Likewise, it’s difficult to see how to prevent these tools from learning on whatever data the studios want. It’s already standard practice for corporations to act first and beg forgiveness later, and one should assume they will continue to scrape and ingest all the data they can access, which is all the data. The studios will grant some protections to highly regarded top earners. But these artists are predominantly white and male, a fraction of the union membership. There will be little to no protection for women, people of color, LGBTQIA+ people, and other marginalized groups, as in all areas of the labor force. I don’t mean to disparage the work the DGA and WGA put into crafting these terms, even if they may not adequately address the scope of the technology. But we can go further, and SAG has the opportunity to do so in its ongoing negotiations.
SAG is still very much on strike, with plans to meet with the AMPTP next on Monday. In their meeting, I hope they can raise the bar another notch with even more specific and protective language.
It would be good to see terminology that accepts that AI will be used by the studios, regardless of any terms thrown at them. This agreement should also reflect an understanding that studios are as threatened by the voracious appetites of Big Tech as the artists, that the unions and the AMPTP are sitting on opposite sides of the same life raft. To that end, contractual language that recognizes mutual needs will serve everyone’s interest, with agreements between AI users and those impacted by its use on all sides of our industry. It would also be helpful to see language that addresses how AI’s inherent biases, which reflect society’s inherent biases, could be an issue. We must all make a pact to use these technologies with those realities and concerns in mind.
Mostly, I hope everyone involved takes the time to learn how these technologies work, what they can and cannot do, and gets involved in an industrial revolution that, like anything created by humans, can provide tremendous benefit as well as enormous harm. The term Luddite is often used incorrectly to describe an exhausted and embittered populace that wants technology to go away. But the actual Luddites were highly engaged with technology and skilled at using it in their work in the textile industry. They weren’t an anti-tech movement but a pro-labor movement, fighting to prevent the exploitation and devaluation of their work by rapacious company overlords. If you want to know how to fix the problems we face from AI and other technology, become genuinely and deeply involved. Become a Luddite.
WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here. Submit an op-ed at ideas@wired.com.