Extract from ABC News
Australian authors including Peter Carey, Helen Garner, Tim Winton, Jane Harper and Miles Franklin have been swept up in a Silicon-Valley-based AI scandal.
Analysis of a dataset of pirated ebooks, known as Books3, has revealed works by some of the world's most successful authors, including John Grisham, Colleen Hoover and Stephen King, have been used to train generative AI.
"[We know] AI systems are trained by ingesting vast amounts of text … scraped from the internet," says Olivia Lanchester, CEO of the Australian Society of Authors (ASA).
"But the lack of transparency over what has been used to train generative AI means that authors haven't known whether their works have been used."
Now, many do know — and they are furious.
"We have been receiving phone calls and emails from Australian authors who've been dismayed and outraged to learn that their works have been appropriated without their permission," Lanchester says.
Literary fiction by Margaret Atwood, Zadie Smith and Barbara Kingsolver, and books by Australian authors Geraldine Brooks, Liane Moriarty, Markus Zusak and Robbie Arnott also feature in the collection.
What is Books3 and what happened?
In September, The Atlantic published a tool to search Books3 after reporter Alex Reisner identified author information for 183,000 of the 191,000 ISBNs he extracted from the dataset, which has been used to train Meta's LLaMA, Bloomberg's BloombergGPT and EleutherAI's GPT-J.
The Books3 creator, Shawn Presser, told The Atlantic he developed the dataset as a high-quality training resource for other independent developers to allow them to compete with tech giants such as OpenAI, which is believed to have trained ChatGPT using mystery datasets known as Books1 and Books2.
The Silicon Valley way
Irish-born crime writer Dervla McTiernan was incensed to learn her bestselling crime novel, The Ruin (2018), was one of the thousands in the collection.
"It's outright theft. People who stole all of these books … did it for the purposes of making money," she says.
McTiernan, who lives in Perth, believes the companies knew the dataset contained stolen material.
"They knew they were using pirated books, and they did so with gross indifference, and I think that's characteristic of the mentality of people who work in this industry," she says.
"It was Facebook that had the motto … 'move fast and break things'. Well, we're the things that are being broken, and we're not very happy about it."
Weren't there other options?
Professor Toby Walsh, Chief Scientist at UNSW's AI Institute, is one of Australia's leading AI experts.
He was disappointed to find one of his books, Machines Behaving Badly: The Morality of AI (2022), in the Books3 dataset.
"If you just want to train a chatbot to speak English, there are tens of thousands of books that are out of copyright that they could have used," he says.
"It's typical of the cavalier way that people in Silicon Valley treat people's intellectual property."
Lanchester agrees.
"AI developers could have trained their systems on works that are in the public domain, or they could have sought a license from the copyright owners, but instead, they've ingested copyright works without seeking permission," she says.
"This has been done disregarding copyright laws, and it's treating authors' works as though they're a public commodity, which ignores the fact that there's a real cost of creation and that licensing is how authors earn a living."
Lanchester says the ASA is not "anti-technology" but argues that tech companies should acknowledge the value of the authors' work in the development of generative AI tools.
"They couldn't have built the AI systems they've built without high-quality inputs, and there was a real opportunity for authors' works to be licensed and for the creative and intellectual labour to be recognised. Instead, their works have just been appropriated," she says.
Legal ramifications for AI overreach
Tech giants, including Meta and OpenAI, face several lawsuits in the US over their alleged use of authors' work without their permission.
Two authors, Mona Awad and Paul Tremblay, filed a lawsuit against OpenAI in July, claiming the company used their copyrighted books to train ChatGPT without their consent. They say the AI tool generates "very accurate summaries" of their work.
Other lawsuits have been filed against the tech companies by comedian Sarah Silverman and Michael Chabon, who won the 2001 Pulitzer Prize for Fiction for his novel The Amazing Adventures of Kavalier & Clay.
In September, a group of high-profile authors, including George RR Martin, Jodi Picoult, John Grisham and Roxane Gay, filed a joint suit with the Authors Guild against OpenAI.
While the suit accuses the company of "systemic theft" and "mass copyright infringement", OpenAI and Meta have claimed their use of authors' work amounts to "fair use".
"This is a new way of using people's copyrighted material, and the courts have yet to decide whether it's fair use, whether it's within the bounds of the law or not," says Walsh.
A meaningful threat
Lanchester says generative AI poses a direct threat to authors' livelihoods.
"We're concerned that the market is going to be flooded with inferior AI-generated content.
"A market crowded with AI material doesn't serve anyone. It makes discoverability harder for professional writers and lowers quality for consumers."
McTiernan also questions the viability of a literary career in a marketplace dominated by AI.
McTiernan says ChatGPT might be a "bad writer" now, but she worries that in three to five years, generative AI tools will be able to produce works that mimic the style of contemporary authors and will be offered in direct competition with their original novels.
"If [tech companies like Amazon, Apple and Meta] are able to generate their own books and stories, why would they sell anything else? Why would they give prominence in the market to any sort of competing efforts? I don't see why they would."
What will Australian authors do next?
Writing is already a precarious occupation in Australia, where the average annual income for an author is $18,200.
While McTiernan says the writers she has spoken to "would love to have the opportunity to take action", for many, it is not an option.
"[Given] what authors are earning as a general rule in Australia, how many authors are in a position to fund or even contribute to the costs of litigation? Very few," McTiernan says.
"This is, again, one of the situations we find these days where these massive wealthy companies can take this sort of action, almost with impunity, because they know the people that they're damaging most are not in a position to take action against them."
Lanchester says the ASA intends to continue to advocate for authors.
"We are advocating to the government and hoping there will be a pathway forward to negotiate licensing solutions with the tech sector," she says.
"We're watching the US litigation brought by the Authors Guild with real interest and absolutely cheering those authors on, and … we're trying to get a handle on the number of Australian authors affected. We think that there will be many.
"It's incredibly difficult because there are complex legal and jurisdictional issues to grapple with."
While Professor Walsh concedes the risks posed by AI may require specific regulation, he says existing laws governing intellectual property and privacy could be applied "more forcibly to the tech space" to rein in unlawful conduct.
"I'm not sure that now the tech industry is such … a large component of our lives whether it can be left to be the free-for-all, to be the Wild West that it has been for the last 20 years."