To AI or not to AI...

Jon Goff  
AI is here, and it's not going away. There's a tremendous amount of anxiety among creatives, from artists to writers to musicians, who are afraid it will harm their industry and craft. With 35 years' background in computers, 15 years as a professional columnist for that industry, and ten years in the self-publishing business, I have a different take on AI. It's probably controversial, but the reality is that AI is here and it's not going away, so now what?

Artificial intelligence is here. The genie, as they say, is out of the bottle. And while there is a lot of discussion, even outrage, over what some consider to be the “theft” involved in artificial intelligence, is it any different from how humans learn? We may not learn as fast as a computer, but we learn by looking at what others have done and by trying to do the same thing ourselves, over and over, learning from our mistakes. This is essentially how large language models “learn” to do things. And AI is not going away. The genie IS out of the bottle. He’s not going back in, so the question isn’t whether we should use artificial intelligence, but how we should use it.

Let's begin with a simple scenario: taking a piece of art you own and upscaling it. I have aphantasia, which simply means I don't see things in my mind's eye. If you say, "think of an apple," I don't see an apple. No image forms in my mind. I think of an apple. I think of its color and its shape, but I don't see or even imagine these aspects visually; I know them only as concepts. Conceptually, I know what an apple looks like, but I don't see one. So when I'm trying to visualize what a character or place looks like, it helps me to draw it. Then, and only then, can I see it.

Most people can see images in their mind, and when I learned this, I was floored. I'd always thought the phrase "the mind's eye" was figurative. It never occurred to me that people actually saw things in their mind's eye. It turns out they do, and that still seems strange to me. Regardless, in order to visualize a place or a person, I have to draw it.


Here is a drawing I made of the lighthouse at the Port of Jarick in The Rune that Binds, and next to it is the same image after I uploaded my drawing into an AI tool and gave it a prompt. I think most people would be okay with using AI for something like this. You can clearly see how similar the two pictures are; the AI simply cleaned up my drawing.

In this instance, I used AI rather like a spell check for art. It fixed my lines, corrected my perspective, and gave me, in minutes, something that would otherwise have taken me a very long time to produce.

Now, you might say, well, that's all well and good for art, but what about writing?

Let’s begin by looking at the pros and cons of using AI for writing:

• It’s efficient and quick. The most obvious advantage of using AI for writing is the speed and efficiency with which it can generate content. AI can process vast amounts of data quickly and produce coherent and contextually relevant text, saving writers time and effort, and allowing them to be more productive.

• It is an antidote to writer’s block. AI can help generate ideas, brainstorm concepts, and explore different narrative possibilities. It can provide prompts, suggestions, and creative insights, acting as a tool to spark inspiration.

• AI can more easily maintain consistency in writing style, tone, and language throughout a piece of work. It does this while maintaining a high level of grammatical and spelling accuracy, as well as a coherent structure. As such, it can help writers keep a consistent voice while adhering to specific guidelines or requirements.

• It excels in creating content that is accessible and inclusive. While some writers may balk at the current trend of DEI, others embrace it, and AI-powered writing tools can help writers meet their accessibility and inclusivity goals.

• And because many AI tools allow the user to rephrase and rewrite their suggestions to address gaps in the original query, editing and fixing content becomes fast and easy. AI can fine-tune an article or piece of fiction with consistency and precision, in a fraction of the time. It can also be customized to suit individual preferences and requirements, allowing writers to tailor the tool to their specific needs. Whether it’s adjusting the writing style, incorporating specific themes, or adapting to different genres, AI can provide personalized support to writers.

But it’s not all roses and rainbows. There are issues with AI writing tools that must be acknowledged. While AI may be able to write a 500-word essay, it is, at best, a rough draft. Perhaps more polished than some writers will produce on their first try, and certainly not as quickly as AI, but it is far from perfect. Here are some of its failings.

• One of the primary concerns regarding AI-generated content is what many consider its lack of authenticity and originality. AI can mimic human writing styles and patterns, but it struggles to produce genuinely unique and innovative ideas. Because of how it is trained, it can imitate, but it lacks originality, both of thought and expression.

• There is a danger for writers who become overly reliant on AI. That reliance can stifle creativity and even compromise a writer’s development. Because AI handles grammar, punctuation, and spelling easily and accurately, a writer who leans on that polish never develops those essential skills. Further, there are skills that AI merely mimics, such as pacing, tone, and setting, which a writer must learn but might not, because they have become overly dependent on AI tools. They lose the ability to think critically or to experiment with different writing techniques. In short, they never learn to hone their craft. Think of it like a cook who loves food but reaches for microwaveable dinners whenever they want to eat. The prepackaged, pre-seasoned food may fill the belly, but the aspiring chef never learns to do anything but push buttons.

• And because this is new technology, there are concerns about the ethical and legal issues of using AI in your writing, especially where copyright infringement and intellectual property rights are concerned. Is the AI truly writing, or is it taking snippets from all the things it has scanned and piecing them together like a patchwork quilt? Who owns those snippets? How do we even determine ownership and attribution for AI-generated content? All of this leads to a Gordian knot of potentially expensive legal disputes and complications over who owns what, and what is owed to whom.

• And because AI models are trained on existing data, much of which is in the public domain and was written in eras with different standards and biases, the content they produce may contain biases and stereotypes that are offensive to some. There is some concern that AI-generated content may inadvertently perpetuate or reinforce these biases.

• And lastly, as any writer can tell you, writing is a deeply personal experience. We weep, we laugh, and we mourn with our characters. Because generating content with AI doesn’t engage the parts of the brain where imagination and creativity live, it is not as fulfilling or rewarding as writing from our own experience. There is a kind of Zen, a focus, that comes when one opens a book to research a fact or interviews an expert, and it is absent when creating content with AI. I think it creates a barrier between writers and readers. When we write, we tap into and expose a part of ourselves to strangers. The use of AI in writing may diminish that human connection and the emotional resonance that comes from sharing authentic, heartfelt stories and experiences.

For example, 63% of the pros-and-cons section above (the text shown in blue in the original post) was written by AI, based on this prompt: "Write a 500-word essay on the pros and cons of using AI like ChatGPT to write stories or books."

ChatGPT produced 619 words, which I then went over and edited for clarity, adding about 300 words, mostly in the form of metaphors. While I edited it, applied some personal word-choice preferences, and provided the prompt, I didn't write the article. These aren't my ideas. I did no research. I just cleaned up the text ChatGPT generated, made it flow a little more smoothly, dropped in a colorful metaphor here and there, and that's it. I felt more like an editor than a writer, and in fact, that's what I was.

More importantly, I didn’t enjoy writing it, because I didn't write it; I edited it (say that 67,420 times fast). I produced a 987-word article in maybe 30 minutes, with no research or real effort on my part, and the bulk of it was created in less than a minute.

What I added made it better, but it still doesn’t feel like mine, like something I wrote, and I think that's because I didn't do the research, I didn't ponder how best to express my views, and I didn't even have to set my personal biases aside to create it.

Would I be comfortable submitting this for publication? Well, it is going up here on the CC blog, but since I created it specifically for the blog, as a bit of irony, yeah... but not if I were creating it for the sole purpose of passing it off as my own.

I can spot AI-written articles like this from a mile away. They're flat and uninteresting to read. Would I publish something like this? No. It’s not my article. It’s not my story. These are not my ideas. I didn’t think about any of this before writing it. I didn’t talk to a living person to gather their thoughts and feelings. I didn’t plant the seed of an idea in my brain and let it germinate. All I did was type a single sentence into a machine and get… something… back. Something that was admittedly legible, coherent, and grammatically sound, without a single misspelling or typo, which is better than I can do on my first try. It's serviceable, but it's also something that has no piece of me in it, even after my edits.

So... AI. It's here. It's not going away, and it WILL be abused. But it's also a tool. The original piece of art I drew of the lighthouse had to be coaxed out of a brain with no visual faculty; I labored over it long and hard, erasing and redrawing, and erasing again and again, until what I had on my tablet matched what I had described in the book. It came from me. I never saw it in my mind's eye, because I'm blind in that aspect, but when I finally finished the drawing, I recognized it. And it came from me. It's something I made. All the AI did was clean it up. I think that if we put ourselves into our work before we put our work into AI, instead of the other way around, we can make great things, things with soul and heart and humanity. Things readers can connect with.

My point? Don't be afraid of it. Don't hate it. Don't hate people who use it. But use it wisely and ethically, and put yourself into your work before you put your work into AI. In the end, what happens to AI and how it evolves is entirely up to us.

19+ Comments

Vidyut

Interesting subject and one that keeps popping up here often. And it will, since it is a tool that exists and has a lot of potential to help writers.

The misbelief that using AI is laziness persists, no matter the efforts to educate people about it. It couldn’t be further from the truth; there is quite a learning curve. You bring up two use cases, but really there are loads more.

Good at storytelling in general, but find some bits tedious? Delegate.

It can also cut down research time (and distractions) dramatically. If you need twenty kinds of butterflies that the dead victim of a mystery novel might have collected, somebody like me is going to go down a rabbit hole of butterflies for weeks and end up knowing a LOT about them if I do the research myself. And I really don’t want to. The man is already dead, and I just need some informative detail to turn into a subtle but intellectually interesting clue. I can tell ChatGPT to give me seven common species of butterflies found in this location and in this season, plus two really rare ones found only somewhere else, along with the detailed identifying characteristics an interested observer might notice. And it will. And in ten minutes (to allow for refinements and a coffee) I have a brilliant foreshadowing clue, without spending ages learning enough about butterflies to be confident about their geographic distribution. All I really wanted was for a character to spot that two butterflies aren’t local, but with the depth that makes readers who know butterflies feel special or go “oh shit,” of course.

For people who switch between first and third person, the fastest way to do it without leaving errors and inconsistencies could be to ask ChatGPT to do it for you. Write a detailed enough prompt, and you’ll be able to alter the whole perception, not just the pronoun and/or tense. “Father” might become “Name,” “sweating palms” might turn into a “nervous look,” and so on.

It could be asked to suggest rewrites whenever we fall into one of our usual bad writing habits, and that suggestion could be anything from “go write this again” to “let me fix this for you.”

It’s a tool. Fair to never use it, but it isn’t after our job or craft. If our work can be easily replaced by a tool, it might be more useful to consider how we could be more creative rather than resent the tool, because long before ChatGPT, a thousand mediocre writers already sounded the same.

There are a lot of questions and opportunities opened up for all sorts of artists every time a new tool comes along. Writing technically correct language was harder before browser plugins started back-seat correcting us and spellcheckers started putting squiggly lines under words. Sure, there are negatives: language itself has started to feel like it lives in a box and is less quirky, and finding one’s “own voice” is harder if our voice sounds like everybody else’s. AI is another change to navigate through.

Understanding how AI works will quickly matter to those wanting to do submissions. And it doesn’t have to be a bad thing. I wouldn’t be surprised if major publications already had AI trained to assess submissions based on specific things they wanted in stories, and unlike human editors, it would read your whole manuscript for the traits it was selecting for, instead of giving up on you if the opening didn’t hook. Every submission getting read would be astonishing to begin with. Even if a story wasn’t selected, it might be able to return specific suggestions to the author for what they wanted in a manuscript for future submissions.

But wait. AI could be trained to do submissions for you, finding publishers who’d be most likely to be interested, coming up with perfect comps, tailoring introductory emails to fit each publisher’s or agent’s profile. And it could track the submissions and do any follow ups needed, so you aren’t lost in a pile somewhere.

But can an AI do what an author can? Not yet, and it will be a while before it can. And if it can produce great books, we’d be fools not to read them just because the AI isn’t human. Can it churn out mediocre books to keep spammers in business on Amazon? Sure it can. There have also been instances of non-fiction books with incorrect information being published. But then humans were doing this long before AI showed up, and it is still humans using these tools to produce marginally better spam than they used to.

Would it be perfect? No. It would be different. It might open some opportunities and close others. We don’t know. But that’s the fun part of life: finding out what the future holds.

Jun-10 at 01:22

Luluo

lol, I dunno. It has outright lied to me multiple times. And when I catch it, it’s basically like, Whoops! Ya got me!

Considering how often I’ve caught it inventing falsehoods about dirigibles and 19th century trains, I’d personally double-check its butterfly facts! I still occasionally use it when I’m at a point in research where I’m stuck, or am unsure where to start, but I’d never take its word on anything.

Jun-10 at 01:35

Vidyut

Oh yes, totally. The (presumably well-trained) AI is only as good as the information it has and the context window. I do mention later that non-fiction AI books have had incorrect information. Such stuff could even be lethal - for example, getting the identification of mushrooms wrong.

Yeah, I’d definitely verify the information, but it is a quick shortcut to knowing specific things to verify.

Jun-10 at 01:46

Zznewell

I would love to use AI. But first the copyright issues need to be resolved. Training data for the AI should only be used with the permission of the copyright holder, and the art or writing produced should be eligible for copyright protection. Until these legal issues are resolved, I’ll steer clear of it.

Jun-10 at 02:33

Trevose

It’s not capable of lying. It can be wrong and present its misinformation in an assertive way, of course.

Jun-10 at 03:25

Lvocem

Replace the well-marketed name AI with My Cousin Guido. My cousin Guido can write a story really fast. But he has no ego. So you can tell him to write something for you, and he will do it and hand it to you. You read it and realize that the guy is pure cheese. It needs to be edited. So you tell my cousin Guido to make some changes, give it some emotion, dare a little. He does. The stuff he produces is just astounding.
So here’s where I draw the line. Let’s say I am promoting a business. You never see the name of the writer. It’s all benefits, ideas, and a product. So my cousin Guido’s work is awesome. I load it up. We make millions of dollars selling that product. Life is good.
Now let’s say I upload that material under my name. But I did not create it. My cousin Guido did. Yet I am taking the credit. Sure, I made some edits, gave him direction, so why not. In my book that is still called plagiarism. And I will fight plagiarism to the end.
Even in history, when a threat like the daguerreotype came along and nobody needed a perfect image anymore, all those neoclassical artists went out the window, replaced by the imperfection of the human spirit, and then you got the Impressionists, the Post-Impressionists, and the rest is history.
We will make history again. But using someone else’s work as yours? I still call it cheating.

Jun-10 at 03:40

Radprogirl

It’s serviceable, but it’s also something that has no piece of me in it, even after my edits.

This is why I find AI does not work for me for creating narrative. Even if I write something and put it into an AI program to fix up, it changes my vision way too much. Maybe AI will get to the point where it can co-author effectively, but you can never use AI without the result being a collaborative piece. Even with your example of the picture, that’s not fully your work anymore after the AI polished it up. If you had taken that picture to a human artist and they made the same changes, could you really say it was solely your creation anymore? Not really.

Is this a bad thing? That depends on the needs of the author and the way they use it. I see nothing ethically wrong with you putting that enhanced picture in your novel. I wouldn’t even have an issue with you not stating it was enhanced by AI. “The Rune that Binds” is not an art book. The art would be there to supplement the story, nothing more. Now, if you made an art book based on your series and filled it with pictures like that, it would be more of a gray area. One could argue creating such a book with such pictures is robbing artists of work. Then again, when we use a grammar checker to edit a story, are we robbing proofreaders of work (could be a bad example, but you get what I am saying)?

I do think there will be talented writers who will choose one day to co-author with AI, and that will work for them. For me, though, I’m being robbed of the journey when I try to use AI this way. So for narrative creation, as frustratingly slow as I am at getting the words on the page, I will not be using AI.

However, I do think AI is useful in the brainstorming stage. It can also be useful for early critique, as long as you don’t use it as a gauge of how well you are doing as a writer or how well the piece is coming across. AI criticism is much better at pointing out specific things you might be looking for in the text: voice consistency, overuse of words, plot consistency, etc. AI can’t like or dislike your work. It can’t give you a reaction to a character arc or a thrilling scene. It will say a scene is touching or exciting, but these are just the words the calculator is tossing up there because those words are used in critique. It cannot experience a story as a human does.

If AI ever does get to that point, then we have other problems to worry about. :robot:

I do wonder if it is ethical to use AI to help write something like a song for a novel, the kind Tolkien wove into LOTR. Modern authors do this too, but he is the easiest example that comes to mind. One of my past WIPs had a character who wrote songs, and his songwriting was a plot point in the book. I’m an okay poet, but song lyrics and poetry aren’t quite the same thing, so my expert songwriter always created mediocre songs. I wouldn’t put the songs I would use AI to enhance on Spotify; they would only be a few lines in a book. Nonetheless, would that be ethically wrong, even if I can’t afford a lyricist? I don’t know. It’s not an issue I have to worry about now. That story is currently languishing on one of my hard drives, and if I ever return to it I’ll likely do a complete rewrite. Still, I do wonder.

Excellent blog. It got me thinking.

Jun-10 at 05:20

Radprogirl

So, how do you feel about ghostwriting then? If your cousin Guido doesn’t want to claim ownership, or is incapable of doing so, is it really stealing? What about public domain stories? The book “Wicked,” soon to be a movie musical, was not plagiarism of “The Wizard of Oz,” nor was the Wicked Witch character from the ABC show “Once Upon a Time.”

I do see the ethically dark gray area you are pointing out in using AI to create narrative. However, I don’t think it is as simple as: idea not mine, therefore plagiarism. It is more nuanced than that.

Jun-10 at 05:50

Rxd01

I’ve found Bing is less likely to be incorrect than GPT, and Bing cites its sources, so it’s pretty good for research.

Jun-10 at 06:34

Sando

Aspiring AI users should also note - pretty much every site I’ve submitted short stories to lately has a little box saying basically “I didn’t use AI to write this”, and won’t accept AI texts.

The problem is that reading submissions and judging them is a laborious, human expert task. Most small publications already take months to reply to queries; how long will that take when people are wading through piles of AI stuff?

No doubt someone will now suggest using AI to read submissions :grin:.

I say this as a software developer who recently used two different trained AI models to accurately read and process hundreds of thousands of old fuel receipts for his company. The technology is useful and it’s not going away, but it will create harm as well as good, especially in an industry as fragile as publishing.

Jun-10 at 06:35

Li1991

There is a line in my very first self-published novel that I cringe at every time I come to it. It’s not necessarily a bad line, and I don’t think anyone else would find it out of place at all, but it’s paraphrased from a critter’s suggestion, and although I must have thought the suggestion was okay at the time, every time I look at it all I can think is that, left to my own devices, I would never have written that line. There’s no me in it.

If I’m that unhappy with a single line that came (mostly) from elsewhere, I can’t imagine me being satisfied with having anything AI wrote in a book of mine. As for just plotting/outlining/what happens next - that’s the stuff I want to do.

The only thing I really want to use AI for (aside from spell/grammar check) is book covers. They are either expensive or time-consuming (sometimes both!), and I hate that they can make or break a book’s success. With AI, I could change my covers quickly and cheaply if they weren’t working. Right now, I’m not doing it, because I’m not sure about the legal stuff, but if it is deemed an acceptable practice, I’ll be getting into it.

Jun-10 at 08:44

Marisaw

This would be my issue, too.

I recall someone suggesting that you could use AI to write your rough first draft, then revise it to produce the final version. However, I think once you have some words on the page, it’s restricting; you’re likely to be influenced by what’s already there.

That’s why some people write their first draft, then put it aside and write the whole novel again from scratch – so that they’re not influenced by what they’ve already written.

Jun-10 at 09:03

Goldspoon

…did not know her name, but he knew that she worked in the Fiction Department. Presumably – since he had sometimes seen her with oily hands and carrying a spanner – she had some mechanical job on one of the novel-writing machines.
George Orwell - 1984

Jun-10 at 09:31

Luluo

Thanks! I’ll look into that.

Jun-10 at 11:30

Andy_jacob

There’s talk that even now agents and editors use AI to sift through the slush piles. It’s easy, and no one would be the wiser, actually. Once we send a submission to an agent, there’s no telling what happens to it. It’s only a rumor, but why wouldn’t they?

On that note, giving any sort of leeway to the book industry in terms of AI will be the slow and painful death of paid creativity. The book industry will slowly shift from looking for interesting voices (a search already getting very thin in the process of appeasing shareholders and finding more popular mush to sell) to producing AI crap en masse. If money is the only driving force (and we can assume it is, since I’m more than sure “smile of a shareholder” is the main concern at every corporate HQ), we can expect publishing companies to start cutting authors out for bigger profits. Even if they paid billions for AI, they would quickly turn a profit.

Jun-10 at 11:32

Rxd01

Why do that when you can license the model? If you can train a generative AI on your authors’ work to the point it can produce something people would want to read, then you sell it as a subscription where people can produce their own custom stories.
AKA “I want a kid’s bedtime story about Jane and Bobby saving the world from evil Superman in the style of Stephen King.”

Of course, the publishers would need to have it in their contracts that authors’ work can be used to train their AI models. Or they would bring in new “authors” to write copy to train the models on.

Jun-10 at 11:41

Glitterpen

Such a can of worms. I can’t ignore this blog…

is it any different from how humans learn?

From what I’ve read, AI software can hold massive amounts of information, far more than the human mind can. I don’t believe it’s the same as a human mind; it’s the memory that gives it such an edge.

And AI is not going away. The genie IS out of the bottle.

I think this is something huge companies like to say so that they can pay their employees even less for fixing something AI made, or so they can replace humans altogether. It’s the workers who suffer while a small group of people get rich. It may not go away, but regulations could be put in place if people started to care more about human beings than about AI. Organizations like the Concept Art Association (I think that was the name) are trying to lobby for artists’ rights. As an artist, I pray they can do something.

I have aphantasia, which simply means, I don’t see things in my mind’s eye.

This doesn’t change the fact that the AI was trained on a huge database (5 billion scraped artworks and photographs). Some of the artists who created those images eked out an honest living despite health issues like schizophrenia, depression, arthritis, chronic pain, and other difficulties.

And there’s such a thing as a perspective grid that you can set up on a page with simple lines (mind’s eye not needed, so I don’t feel sorry for you).

None of the people who made the images that were trained on were compensated, as far as I know. Some of the AI companies even used names of artists working today as ‘style selections.’ I find that horrible.

AI. It’s here. It’s not going away, and it WILL be abused.

Yes. Human artists now have to share the Internet with ‘pretenders’ who can flood art sites with easy-to-make derivatives (AI can create an image within minutes). Does anyone know what that feels like?

When I was on Twitter (early last year), someone asked me what AI software I used to make my art (which I made by hand, spending up to thirty hours on some of my artworks). So the work of human artists is now called into question every time anyone sees an artwork (though some AI images are easier to spot than others). I find this so sad.

I never saw it in my mind’s eye, because I’m blind in that aspect, but when I finally finished the drawing, I recognized it. And it came from me.

Again, you could have set up perspective lines, in a layer behind the sketch layer.

Jun-10 at 11:56

Lvocem

Excellent points. If it’s already public domain, that’s one thing. As for ghostwriting, usually the book will say “with” So-and-so, so there’s some type of credit given.

I find this deplorable, and we are already victims of this problem. When you apply for a job, most HR departments run the resumes through a program, so in the end you become the sum of your keywords. If you phrased them the way they wanted, it’s a match. If you did not, bye-bye. I find that to be a sad state of affairs. So, for instance, let’s say you called yourself a designer: bye-bye, because now they want a UX/UI expert. And if your terms don’t match what they call things, you are shit out of luck. The same will apply to the stories that pass through the filter.

Jun-10 at 11:58

Radprogirl

Actually, I had a friend who was a ghostwriter for several different weekly columnists. These writers would use the articles he wrote for them to fill in the gaps between their own work. He never got credit. At the time, it apparently was a very common practice for columnists to use ghostwriters to keep up with their schedules. He made quite a bit of money, enjoyed the work, and didn’t mind not getting credit. I do wonder if all that work dried up for him when ChatGPT came around. I hope not, but I don’t know, because I lost contact with him years ago.

Then again, column writing is different from novel writing. You have a point too. :wink:

Jun-10 at 13:06