From An Experiment in Lust, Regret and Kissing (gift link!) in the Times by novelist Curtis Sittenfeld:
My editor fed ChatGPT the same prompts I was writing from and asked it to write a story of the same length “in the style of Curtis Sittenfeld.” (I’m one of the many fiction writers whose novels were used, without my permission and without my being compensated, to train ChatGPT. Groups of fiction writers, including people I’m friends with, have sued OpenAI, which developed ChatGPT, for copyright infringement. The New York Times has sued Microsoft and OpenAI over the use of copyrighted work.)
The essay describes a contest between the bot and the human novelist, who also employed suggestions from readers. I do not see how an AI “writer” being programmed with another author’s work is any more of a copyright violation than a human writer reading a book or story for inspiration. Herman Melville wrote “Moby-Dick” after immersing himself in the works of William Shakespeare. Nor is imitating another author’s style unethical. All art involves borrowing, adopting, adapting and following the cues and lessons of those who came before. In “Follies,” Stephen Sondheim deliberately wrote songs that evoked the styles of specific earlier songwriters. He couldn’t have done this as effectively as he did without “programming” himself with their works.

Sittenfeld, after presenting the 1,000-word short story she wrote and the bot’s 1,000-word entry, ends her essay thus…
Here are some of the things I did to write a summer-themed short story that I’m pretty sure ChatGPT didn’t do:
started writing more than a week before I knew what the reader-selected prompts would be, because I was worried that if I tried to write too quickly, the results would be sloppy.
drove from my house in Minneapolis to the park in St. Paul where I’d decided to set the story to see what it looked and felt like.
included in the story a real Twin Cities-based David Bowie cover band, called The Band That Fell to Earth, whom I’d seen perform.
asked a biking enthusiast friend if Brian’s dating app handle MtnBiker1971 was so obvious that no self-respecting biker would use it.
asked the same friend what trail Brian would recently have ridden on, but then didn’t name any of the places he suggested because of space.
got feedback on my first draft from several family members and friends, including a friend who, like Cassie, lives in New York, is in her early 50s and intermittently uses dating apps.
changed the location of where Brian grew up from Mankato to Duluth because Kamala Harris selected Tim Walz as her running mate during the weeks I was working on the story, and Walz’s ties to Mankato made the mention of it feel distracting.
interrupted the writing process to look at the menu of a Thai restaurant I’ll be eating at soon and to check flight times to Cincinnati in October (and no, there’s absolutely nothing in my story that’s relevant to Thai food or Cincinnati — this was recreational).
wrote a first draft that was almost two times too long.
cut that draft but still asked my editor if I could exceed the agreed-upon 1,000-word length by about 200 words.
This is something I did that ChatGPT may have done:
chose character names by looking at popular baby names from the 1970s, when the characters would have been born, on the Social Security Administration website.
Something I definitely didn’t do that ChatGPT did:
wrote its story in 17 seconds.
That last detail feels the most, well, robotic and the most unsettling. Contrast that with the specificity and longing of readers’ suggestions — the margaritas, the linen caftans, the redemption. What I loved about them was that they were a mix of colorful, ridiculous, funny and poignant, which is also what, at our best, we as humans are — and what, as of this writing, it still seems to me that A.I. isn’t.
In human vs. machine, at least for the time being, I’m still betting on people.
She’s an optimist. I’m not sure that’s a smart bet.

Melville also based “Moby-Dick” on the story of the Essex, so there is that as well. Fiction, but based on some true events. Another mark against their complaint.
This story made me remember that, about a week ago, I heard a new song on the radio. I don’t think it was a remake, but it was unmistakably in the style of the Charlie Daniels Band. My thought wasn’t ‘They are ripping off Charlie Daniels’; it was ‘that is neat, to write a song in his style with instrumentals that sound like the Charlie Daniels Band’. I doubt Charlie Daniels would have sued him for it (if he were still alive); he probably would have been flattered.
I am not sure what this writer’s complaint is. If a human writer bought his books and decided to write in his style, I don’t think he would be suing the person. If the person didn’t even buy the books, but read them from a public library, he wouldn’t either. I don’t see how this is different from what the AI is doing.
The music industry has been forever rife with imitation and pretty blatant ripoff, often without proper (or any) attribution. Led Zeppelin and others lifted blues songs and changed them into rock, sometimes getting away with it, sometimes getting sued. Zeppelin, though, kindly left Rush alone in their attempt at being a sad, talent-starved, Canadian imitation (as one regular commenter here has often noted 😉 ). They probably felt sorry for them.
Rush? A “. . . sad, talent-starved Canadian imitation”?! Sacrilege! Heresy! Blasphemy! Heathen! Oh, would someone please rid me of this meddlesome poster?!
Sir, you have gone a bridge too far. I take umbrage at your offense. I challenge you to a duel: six-shooters at 20 paces; meet at dawn on the morrow!
jvb
Ahhhh, revolvers are so 19th century! I propose weapons of choice, closing from 800 yards. (Now let me see where I put my Tesla rail gun… It’s here somewhere; the cat keeps moving things…)
Done. Let the fray begin!
jvb
The issue is that it is entirely, wholly, and profoundly different.
Copyright exists specifically to balance human creativity and spreading of ideas with commercial viability. Inspiration from one person to the next is natural and indeed the purpose of art. AI, however, has no inspiration. It is an entirely mechanical synthesis.
When a person reads a story, they connect it to their prior experiences and interpret it in that light. AI, however, predicts the statistical likelihood of one word or sentence following the next.
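To make “predicting the statistical likelihood of the next word” concrete, here is a deliberately crude bigram-counting sketch. (This is only an illustration of the idea; real language models use neural networks over subword tokens, not raw word counts, and the corpus here is invented.)

```python
from collections import Counter, defaultdict

# Tiny invented corpus standing in for "training data".
corpus = "the whale surfaced and the whale dove and the ship followed".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    """Return the word that most frequently followed `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "whale" followed "the" twice, "ship" only once
```

The model has no experiences to interpret anything against: ask it what follows “the” and it simply replays whichever continuation was most frequent in its training text.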
A person who is inspired to write a similar story will approach it the same way humans have been telling stories for the past million years. The new ideas from the source material will mix with his own in novel ways and, hopefully, produce something that inspires others. If, however, the reader hews too closely to the original source material, he may well violate copyright.
AI emulates the statistical patterns of the human author. The statistical patterns of successful authors are inherently more valuable than those of the average person. They are also more relevant than older works that have fallen out of copyright because they reflect modern language.
Training the AI models on copyrighted works means that the output will be mechanically similar to the original authors’ works. Developers often try to mask this by using huge volumes of work to dilute the effect of any one author. However, a prompt by an end user to mimic the style of a particular author means the model will lean more heavily on that author’s raw data. I do not believe AI output can be copyrighted, which leaves regurgitated copyrighted material in a sort of limbo.
There is no inspiration, only rote emulation. AI uses valuable data from human authors without necessarily any mechanism to protect or compensate the author. Developers gain commercial advantage by people using their AI model, but the model would be worthless without the training data from human authors who produced quality content.
A very interesting piece that did not go in the direction I expected. I fully anticipated coming down on the side of the plaintiffs in those lawsuits. But the arguments you made are compelling. When I think over the writing I’ve done – a couple hundred thousand words of completely unpublished material – I can easily pick out places where my writing has been influenced by a varied array of writers, from J.R.R. Tolkien to Dale Brown. In fact, a yet-to-be-written scene involving a shootout will use an element I remember from a Tom Clancy novel I read years ago.
Plagiarism will always be a no-no, but this line – “All art involves borrowing, adopting, adapting and following the cues and lessons of those who came before.” – is very true.
Maybe writers are the products of their imaginations…summed with their reading experiences. If so, the suits for copyright infringement in this case are frivolous.
I don’t think having an AI build language models based on written works is copyright infringement, because it’s a transformation of the work into a format which is not recognizable as the original. I think it’s possible to make laws against using books as raw data for neural nets, but I don’t know how practical such a law would be to enforce.
From an ethical standpoint, there is a difference between a human learning to write in an author’s style from reading their books and an AI doing the same thing: The AI is artificially created to be millions of times faster than a human. It is similar to asking human athletes to compete against robots, except that in athletic competition it is the process of competition that is the product being delivered, whereas with many creative arts the consumer (usually) values the product more than the process of creating it.
In this sense, authors condemn AI writers for the same reason that human laborers don’t want robots taking their jobs, but I would argue that authors have a more justified case. Unlike an automated assembly line, there is no way to program an AI writer to create the product unless someone else has first done years of creative work.
I think the most important aspect of the problem may be the incentives created by AI. The outcome that people fear is that people will be replaced by artificial superhuman authors, idiot-savant homunculi whose owners do very little work while reaping the rewards of holding copyrights on popular works. Meanwhile, the authors who did the hard work of pioneering those writing styles get crowded out of the market, until nobody has an incentive to come up with new ideas or put any effort into them.
Based on what I’ve seen so far, I’m not yet concerned about the fate of the field of writing. An AI has little enough physical understanding of the world that a human writer is still necessary to make sure that what the AI outputs makes any sense, so it’s a tool in the same vein as pulling random quotes from books for inspiration, if more advanced. I think human authors still have the critical edge for the moment.
What I’m worried about is connecting a sufficiently motivated AI with enough physical and written data to form a model of physical reality. That’s something I think would be worth outlawing and strictly monitoring. When an AI gets good enough to write compelling literature, it’s more likely to be able to figure out how to develop the capability of wiping out humanity.
I suspect this is more ick than anything, knowing your work can be replicated and quite possibly replaced almost instantaneously. As a minister, I wonder about it in my own profession, but I think I have a little bit more job security. Sure, a computer can write and preach, but a large part of my job requires that human factor that AI will never be able to replicate.
But since this is about artists, I suspect they all see the writing on the wall. A while ago, I saw an artist scoff at being paid more than $4,000 to develop a tarot deck, given free rein. The artist said she deserved something like $50,000 a year for three years, which included the research behind developing a tarot deck and what typically goes on each card (so not her own idea). As far as I know, she wasn’t famous, and even the famous ones don’t seem to take that long on a project. My point being: why would I pay someone when AI can just do the job itself for basically free?
Maybe Solomon’s words apply here:
What has been is what will be,
and what has been done is what will be done,
and there is nothing new under the sun.
Is there a thing of which it is said,
“See, this is new”?
It has been already
in the ages before us.
There is no remembrance of former things,
nor will there be any remembrance
of later things yet to be
among those who come after.
Why might training an AI model on copyright-protected works be copyright infringement? Because there is likely at least an unauthorized copying of a substantial portion of the protected work in the training of the AI model.
Copyright does not protect an idea. It doesn’t protect a style or voice, whether it is a literary work, a musical work, a cinematographic work or any other protected type. It protects specific works of (human) authorship – tangibly fixed in a medium of expression. In a literary work, like a short story or a piece of software code, it is the specific text that is protected. It is not the idea of man bites dog or a copy and paste function that is given protection.
The originality bar is relatively low and the ideas expressed in any work are free for anyone to use, reuse, or ignore – whether they are Led Zeppelin or Rush, Cole Porter or Stephen Sondheim.
It seems the commenters here are likening the AI copyright argument to an assertion of some form of trademark protection, where the style of an author is reserved to that author because the style indicates the source of the work, and are then taking a contrarian view of that position. In other words, the assertion is that a writing style is a unique signature of the first author/user, hence they should have some rights to it. There is a recognition that that’s not what happens when the imitating author is a human.
Copyright protection doesn’t extend that far, and neither do trademark rights. An area of law that is closer is the tort of appropriation of likeness or personality. That has not been used for authors in this context, that I am aware of, and I don’t think it would work. While it usually relates to use of the personality’s appearance (image) or speaking/singing voice, or use of their name for commercial endorsement, it would be interesting to consider whether appropriating a writer’s signature style could be similarly protected. I think it is highly unlikely, given that imitation is the greatest form of flattery and has been for millennia.
BUT, copyright is a much more viable underlying right in the AI context because the mechanism to train requires a copy of the work. The computer by necessity takes an electronic copy for processing. The copy made is not likely an authorized copy, like one you might make for a backup or to use personally on a different device or at a different time like when using a PVR to record a broadcast. So this is classic copyright infringement assertion.
So while in some circumstances it can be legal to have an AI model emulate the style of someone, an AI model trained using illegal means is like fruit of the poisonous tree.
Here is an Instagram post regarding AI scams:
https://www.instagram.com/reel/C_30_m3uqLp/?utm_source=ig_web_copy_link
jvb