
Five reasons why you shouldn’t rely on ChatGPT to write business copy

Here are five reasons why it might be dangerous for your business to rely on ChatGPT as your new copywriter.
Anthony Caruana

ChatGPT is AI’s version of the Wright Brothers’ first flight. It has completely changed our perception of what technology can do. Suddenly, it’s possible to ask a computer complex and often esoteric questions and receive a response that strongly resembles the work of another person. But it’s an imperfect tool, and businesses should not rely on it to write website or blog copy, thought leadership articles or other content.

Machine learning systems work in broadly the same way: they are ‘trained’ on a sample of data, from which they learn to recognise patterns. ChatGPT has been trained on a massive dataset and has about 175 billion parameters. It differs from other AI models in that it also uses human feedback during training, which reduces, though does not eliminate, the risk of harmful, false and biased outputs. This makes it more accurate than many other AI models, but it’s not perfect.

Here are five reasons why it might be dangerous for your business to rely on ChatGPT as your new copywriter:

1. ChatGPT lacks style

While this all sounds great, and it is a massive step forward, we need to remember what ChatGPT can’t do. Because it has been trained on a specific set of data, the answers it gives are a reflection of that data. So, you could ask ChatGPT to write a sonnet in the style of Shakespeare, but it cannot create a style of its own. If you want your words to sound like they come from your business and stay consistent with your brand messaging, tone and identity, you’ll need to write them yourself.

2. It lacks contextual awareness

When you’re writing copy for your website, a blog article or for publication in the media, specific context matters. While ChatGPT has probably “read” more data than any one person could in a lifetime, it doesn’t understand the context of those words, where they have been used or where they may be used in the future. Its sense of context is based on how frequently words occur near each other, not on real situational awareness. ChatGPT is also not cognisant of shifts in social expectations, and it may offend sections of the community, embroiling your business in a controversy that damages your reputation.

3. It makes mistakes

ChatGPT may seem all-knowing but it’s not infallible. The Stack Overflow website, which is used by coders to answer questions about programming, has banned answers from ChatGPT because they are often wrong. The big problem, according to the site moderators, is “that while the answers which ChatGPT produces have a high rate of being incorrect, they typically look like they might be good”. While ChatGPT may be useful for initial research, it is not 100% trustworthy. There have been many examples of ChatGPT making up facts and citing fabricated research. ChatGPT’s “facts” have to be independently cross-referenced to ensure you are not releasing content that turns out to be fake, wrong or misleading. That can make your business look incompetent or, worse, untrustworthy and a purveyor of fake news, and it could even lead to legal issues.

4. Jack of all trades…

The saying “Jack of all trades, master of none” applies to ChatGPT. While ChatGPT has an approximate knowledge of many things, it is not a subject matter expert. The people in your business are the experts in their specific fields. When writing content to support your business, you will lean on that understanding, which often goes beyond facts. Great writing is a reflection of experience as well as information. ChatGPT can give you facts, with varying degrees of accuracy, but it doesn’t have your experience.

5. Unknown sources and legal risks

ChatGPT is based on another AI tool called GPT-3, which was trained on millions of books as well as data from internet databases and other sources. According to its creator, OpenAI, it learned by reading about 300 billion words. When ChatGPT returns an answer, you have no way of knowing what sources it used and, unless you do some follow-up, whether its responses are actually original. This could lead to legal issues like those faced by DALL-E, the image-generation equivalent of ChatGPT.

ChatGPT can be a helpful tool. Like Wikipedia and other online resources, it can be a useful starting point for research or to get ideas. But it can’t replace the creativity, awareness and experience that human writers bring.

Anthony Caruana and Kathryn Van Kuyk are co-CEOs of Media-Wize.