You might think that I’m against using artificial intelligence (AI) or machine learning to write a book because it threatens my work as a professional ghostwriter, but you’d be wrong.
Right now, AI is not impacting demand for my work, though smart clients have added a clause to our agreement that prohibits me from using it. Since I don’t use it, I have no issue agreeing to that stipulation.
However, the number of times I’ve heard someone tell me, “I’m just going to use AI to write my book” is rising quickly as we all become more familiar with it.
When I caution authors against it, I get the distinct impression they think it’s a negotiation tactic designed to persuade them to hire me.
It’s not.
I caution them against it because there is very little upside beyond speed. And for nonfiction authors, there are plenty of reasons they shouldn’t consider it.
Sure, using AI may seem like an efficient way to make serious progress on your book in mere minutes, but there are huge downsides you need to be aware of before you attempt to do that. Granted, I am not an AI expert — at all — yet I know the current problems with its use include:
Lack of copyright protection
One of the big advantages of publishing your ideas in the form of a book is copyright protection, which attaches the moment your original expression is fixed in writing. If anyone tries to claim your words as their own, you can sue them for damages. When you hold the copyright, you control how your content is used and who has permission.
When you use AI to generate your book, you have no copyright protection because AI-generated content is not copyrightable.
As of March 16, 2023, according to the U.S. Copyright Office, “If a work’s traditional elements of authorship were produced by a machine, the work lacks human authorship and the Office will not register it.”
That should stop you right there. However, there are more reasons to consider on top of the lack of copyright protection.
Potential for plagiarism
As part of training AI platforms like ChatGPT, Bard, and Claude, existing books, articles, blog posts, speeches, videos, music, and other human-generated content were fed into the systems as training data. The software then learned to make connections within that material. So, when you ask it questions, it is effectively regurgitating information based on what it was trained on.
It is rehashing other works.
In some cases, that regurgitation is plagiarism. If the AI-generated output reproduces copyrighted material verbatim, or if the AI tool has rephrased it without attribution, much like an article spinner, either can be considered a form of plagiarism.
To avoid plagiarism claims, you must cite the original source. That is difficult, however, when AI does not identify its sources.
Using material that was generated based on existing published works opens you up to expensive lawsuits. If your work infringes a copyright, you can be liable for statutory damages, which under U.S. law range from $750 to $30,000 per work, and up to $150,000 if the infringement is willful. That can get expensive fast. And since you have no way of knowing whether AI has plagiarized, it's best to steer clear of it for content generation.
Hallucinations
I’ve heard that more recent updates to AI platforms have reduced the incidence of hallucination, which occurs when AI produces content that is completely fictional, but it’s still important to be aware that it happens.
Recently a classmate of mine used AI to do some background research on their dissertation topic, as a final check to be sure they had found all relevant journal articles. Imagine their astonishment after months of research when ChatGPT provided a list of 11 new journal articles that were exactly on point for their topic. In addition to being surprised, they were also excited to have come across new research they could cite.
However, when they went to read the alleged journal articles, it became clear they did not exist. ChatGPT had made them up.
Fortunately, my classmate took that extra step of checking each and every citation and quickly discovered the research was fake. However, this is a risk you take if you do not double-check everything AI generates. It’s possible quotes provided or statistics cited may be completely made up.
Inaccurate information
Beyond making information up and presenting it as fact, AI can also simply be wrong. If the information it was trained on is incorrect, it will present that information back to you as if it were true, and you may not even be aware of the error.
This could happen because the platform interpreted satire or sarcasm as truth when it was meant as a joke; just look at The Onion. Or because the website it relied on for information was, pure and simple, wrong.
This could happen with user-generated content that is not reviewed by an outside expert, such as on Quora, Reddit, or Wikipedia. Much of the information is useful – even fascinating – but it is not always right. And AI can pull from those sources just as easily as it draws from an encyclopedia or dictionary.
No new ideas
Finally, if you rely on AI to spot trends or emerging concepts, rather than developing your own ideas, you are using AI as a crutch. It will be more difficult for you to position yourself as a thought leader or expert when you can’t or don’t conceive of your own original ideas.
Similarly, since AI can only make connections within what it was trained on, it is always behind the times. You have access to information as of this second, but AI models are trained on data with a cutoff date; they do not learn in real time.
Yes, AI can be a useful tool for tasks such as organizing information, but be careful if you begin to rely on it too much for idea generation or writing. The content it produces for you may replicate content it has produced for others on the same topic, which, at a minimum, will be flagged as duplicate content by Google. This is bad for your search rank and, worst-case, can open you up to plagiarism claims.
Instead, if you hire a professional ghostwriter who can work alongside you to crystallize and express your unique ideas and perspective, you can be sure your content does not already exist elsewhere, and the chance of plagiarism claims is negligible.