For bookworms, reading a headline like “School District Uses ChatGPT to Help Remove Library Books” can be blood-boiling. As Vulture put it earlier this week, it creates the sense that the artificial intelligence tool is once again “[taking] out its No. 1 enemy: original work.” And it is. Using ChatGPT’s guidance, the Mason City Community School District removed 19 titles—including Margaret Atwood’s The Handmaid’s Tale and Toni Morrison’s Beloved—from its library shelves. But there is another truth: Educators who must comply with vague laws about “age-appropriate” books with “descriptions or visual depictions of a sex act” have only so many options.
Signed into law by Governor Kim Reynolds in May, Iowa’s SF 496 is one of those “parental rights” bills that have become popular with Republican lawmakers of late and seek to limit discussion of sexuality and gender identity in schools. (Some have likened Iowa’s bill to Florida’s “Don’t Say Gay” legislation.) Its stipulations are a sweeping attempt at eradicating any discussion of sex or sexuality, and as Mason City School District’s assistant superintendent Bridgette Exman explained in a statement to the Mason City Globe Gazette, “it is simply not feasible to read every book and filter for these new requirements.”
Beneath the surface of this story is a unique conundrum. Broad bans on sexual content that use vague language like “age-appropriate” already leave too much room for interpretation. It doesn’t matter if what’s in the book is the equivalent of softcore slashfic or a harrowing account of childhood molestation. Now, in Iowa, there’s a case of AI—which doesn’t always fully comprehend nuance in written language—being asked to interpret a law that already lacks nuance.
The result, then, is districts like Mason City asking ChatGPT, “Does [insert book here] contain a description or depiction of a sex act?” If the answer was yes, the book was removed from the district’s libraries and stored. But what about when the answer was neither yes nor no? The Bible, for example, “does contain passages that address sexual topics and relationships, but it generally avoids explicit descriptions of sexual acts,” according to ChatGPT. The Bible isn’t on the list of 19 books that got banned, but you can see how quickly this can get confusing. (David going to bed with Bathsheba isn’t a description of a sex act? Uh, OK.)
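For the curious, here is a minimal sketch of what that screening loop could look like if it were automated against the OpenAI API. This is an illustration, not the district’s actual workflow: the reporting doesn’t say whether queries were typed into the chat interface by hand or scripted, and the model name, the yes/no parsing, and the screen_titles helper below are all assumptions.

```python
# Hypothetical sketch of the screening described above: ask ChatGPT the article's
# yes/no question for each title and flag anything that isn't a clear "no".
# Assumes the OpenAI Python SDK (v1.x) and an OPENAI_API_KEY in the environment;
# the district's actual process and model are not specified in the article.
from openai import OpenAI

client = OpenAI()

PROMPT = "Does {title} contain a description or depiction of a sex act?"

def screen_titles(titles: list[str]) -> dict[str, str]:
    """Map each title to the model's raw answer text."""
    results = {}
    for title in titles:
        response = client.chat.completions.create(
            model="gpt-4",  # assumption; the article doesn't say which model was used
            messages=[{"role": "user", "content": PROMPT.format(title=title)}],
        )
        results[title] = (response.choices[0].message.content or "").strip()
    return results

if __name__ == "__main__":
    answers = screen_titles(["The Handmaid's Tale", "Beloved", "Sold"])
    for title, answer in answers.items():
        # Anything other than an unqualified "no" -- including a "yes, but" -- gets flagged.
        flagged = not answer.lower().startswith("no")
        print(f"{title}: {'flag for removal' if flagged else 'keep'} ({answer[:60]}...)")
```

Even in this toy version, the problem the district ran into is visible: a hedged, contextual answer from the model gets collapsed into a binary flag, because the law leaves no room for anything else.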
When I relate this story to Exman, she says she got similar answers, where ChatGPT would say a particular book had sexual depictions but then give context. The example she gives is Patricia McCormick’s Sold, about a young girl who gets sold into prostitution. “ChatGPT did give me what I would characterize as a ‘Yes, but’ answer,” Exman says, but “the law doesn’t have a ‘yes, but.’” Ergo, McCormick’s book is one of the 19 on her district’s list.