I was surprised when, in my "other life" as the managing editor of an academic journal, the subject of ChatGPT came up on the agenda of an editorial board meeting. Before that moment, it hadn't occurred to me that AI's ability to generate content had progressed to the point where it was a problem, or something we needed to make a policy about.
I knew, of course, that AI had been in development for some time, and I knew about the many jobs already lost to automation, but I had
felt “safe” from its intrusion into my work. While my phone has been able to answer basic questions for some years, it doesn't come up with its own ideas. I assumed we were far from AI doing creative work well.
Can AI tell
a story? What if I gave it only a few guidelines for writing a picture book – just the
commonly recommended word limit of 500 words, the requirement that it be appropriate for a
child to hear, and a subject? I gave it the subject of a dragon, because my favorite of my current
picture book manuscripts is about a dragon. What would it come up with? Could it already do my work better than I can?
It produced a surprisingly coherent
story, with all the basic parts and a nice arc. It worked quickly - more quickly than I can write - and with a clearer plot than many first drafts I've written. If anything, it
felt a little old-fashioned, beginning with “Once upon a time” and ending with “The
End.”
Could it revise, though? I thought about what I would
have done differently if I had written this story, and it occurred to me that I
would probably have used some dialogue. So, I asked it to retell the story using
dialogue. It kept the structure of the story and gave Daisy the
dragon and the people a few lines, but it didn’t really make the story
better or sound natural (although it’s not as if every revision attempt I make necessarily
improves a story either).
I was curious what kind of input it must have had to be able to
come up with a story like this. Was it very similar to a book it had "read"? Would it be able
to identify where it had gotten this story form? So, I asked it, “Which
books do you know that are similar to this story?”
It gave a response including five books and their authors and summaries. All the authors were real, but none of the books seemed to exist.
And that’s when I realized that AI sometimes just
lies.
When I asked if it was sure they existed, it apologized,
admitting that “it seems that some of the books I mentioned may not exist” and
listing five that do.
So, I guess the lesson is that while you might be able to use it to generate some comps for something you're writing, you should make sure that the books the AI finds actually exist...
What if it didn’t have the topic of a dragon? What if
I asked it to write a picture book manuscript instead?
This time it separated the story into pages and wrote
about a seed that dreamed of growing into a strong tree and eventually
did. It was kind of sweet.
So, what does this mean for me? For us as writers? In short, I don’t know. I don’t like the feeling that I could be replaced by a machine, and for now I don’t think I can be. After reading the stories ChatGPT generated, my guess is that it can write in various "styles" but that it is missing something when it comes to the elusive concept of "voice."
What do you think? How close is ChatGPT (or other AI software) to being able to do what we do? How can it help us or how can it hurt us? Please share your thoughts in the comments!
I also just noticed that this topic will be discussed at an upcoming virtual Shop Talk for those of us who are interested.