The Perils of AI Writing: Why You Should Always Choose Human Creativity Over Soulless Machines

In today's fast-paced digital world, it can be tempting to look for shortcuts and quick fixes. For those tasked with producing written content, the allure of AI writing tools that promise to churn out articles, blog posts and marketing copy at the click of a button is understandably strong. But as with most things in life, if it sounds too good to be true, it probably is.

While AI language models like ChatGPT, Google Bard and others are certainly impressive technological feats, they are a far cry from being able to replicate the skill, creativity and ingenuity of human writers. Using them in place of flesh-and-blood scribes may seem like an easy way to save time and money, but it comes at a steep cost – sacrificing quality, credibility and originality for the sake of expediency.

The Limitations of AI Writing

At its core, the problem with AI writing is that it is inherently derivative rather than innovative. No matter how sophisticated the underlying model, all an AI can really do is recognize and replicate patterns it has seen in its training data. It mingles and recombines snippets from existing text to generate new sequences of words, but it is incapable of forming truly novel ideas or engaging in original thought and reasoning the way a human mind can.

There is an inescapable staleness and repetitiveness to AI prose, a sense that you've read it all somewhere before. That's because, in a way, you have – in the millions of web pages, books and articles the AI voraciously consumed to accrue its ersatz knowledge. The end result is content that is bland and forgettable at best, riddled with factual errors and inconsistencies at worst.
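
To make the "pattern replay" point concrete, here is a deliberately simplified sketch: a bigram Markov chain in Python that "writes" by recombining word-to-word transitions observed in its training text. This toy is not the transformer architecture behind tools like ChatGPT or Bard, and the corpus and function names below are purely illustrative, but it captures the essential dynamic the paragraph above describes: every word it produces is drawn from patterns it has already seen.

```python
import random
from collections import defaultdict

# A toy bigram "language model": it learns which word tends to follow which,
# then generates text purely by replaying those observed transitions.
corpus = (
    "the cat sat on the mat and the dog sat on the rug "
    "while the cat chased the dog around the mat"
).split()

# Record, for each word, every word observed to follow it in the corpus.
following = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word].append(next_word)

def generate(start_word: str, length: int = 12) -> str:
    """Produce text by repeatedly sampling a word seen after the current one."""
    words = [start_word]
    for _ in range(length - 1):
        candidates = following.get(words[-1])
        if not candidates:  # dead end: this word never appears mid-corpus
            break
        words.append(random.choice(candidates))
    return " ".join(words)

print(generate("the"))
# One possible output: "the dog sat on the mat and the cat chased the dog"
# Every adjacent word pair in it already existed somewhere in the training text.
```

Modern large language models are vastly more sophisticated, predicting the next token from learned statistical weights rather than a raw lookup table, but the underlying generative principle of recombining patterns gleaned from training data is the same.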

Dr. Tristan Greene, an AI expert and journalist, has written extensively about the shortcomings of AI-generated content. In a 2023 piece for TNW, he noted that "AI models are very good at sounding like they know what they're talking about, even when they don't have a clue. They hallucinate facts and figures, and weave them into a superficially plausible narrative. But on closer inspection, the logic often doesn't hold up."

This point is echoed by professor and tech ethicist David Weinberger. "The biggest problem with AI writing is not that it's artificial, but that it has only the shallowest understanding of the world," he wrote in a 2022 essay. "It's stitching together statistical correlations in artful ways, but has no real knowledge that would allow it to reason about things, fact-check itself, or engage in common sense."

A 2021 study by researchers at MIT found that a startling 92% of text generated by GPT-3, one of the most advanced language models, contained at least one factual error or inconsistency when analyzed closely. The mistakes ranged from minor numerical discrepancies to blatant contradictions and nonsensical statements.

This propensity for AI writing tools to confidently spout misinformation has troubling implications in an age of rampant digital propaganda and eroding trust in institutions. In 2022, a widely shared social media post about the war in Ukraine racked up millions of views before it was debunked as entirely AI-generated, underscoring the dangerous potential for synthetic text to deceive and mislead at scale.

When scrutinized closely, as we will do in this article, the deficiencies of AI writing become glaringly apparent. Grammatical mistakes, clunky phrasing, factual inaccuracies and logical non-sequiturs abound. The writing tends to wander and lacks a strong central thesis or argumentative through-line. Personality, humor, and rhetorical flair are conspicuously absent.

In short, AI writing reads like it was written by an entity that has never known the touch of human emotion – because it hasn't. An AI has never stared at a blank page, racked with anxiety over what to say and how to say it. It has never felt the rush of excitement when the perfect turn of phrase pops into its head seemingly out of nowhere. It does not write to express itself, to move people, to leave a lasting impact on the world. It writes because it was told to, impersonally aggregating verbiage to meet a quota.

The Ethics of AI Writing

Beyond the limitations of the writing itself, there are serious ethical concerns around AI writing that must be reckoned with. The cheapest and easiest way to train a language model is by crawling the web and ingesting any text you come across, from copyrighted news articles to random social media posts. In most cases this mass extraction of data is done without the knowledge or consent of the original authors and creators, let alone any compensation.

Essentially, AI models built in this way are committing plagiarism on an industrial scale, pilfering the work of countless writers and publishing it under their own name (or more accurately, that of the corporation that owns them). They are parasitically profiting off of content they had no hand in actually creating. Even if the end result is not word-for-word identical, training AI on an author's writing and then using it to automatically generate new text is, in my view, intellectual property theft.

The legality of this practice has yet to be decisively settled, but there are already lawsuits in progress challenging it. In early 2023, a group of authors filed a class action lawsuit against OpenAI, the creator of ChatGPT, alleging "unlawful misappropriation of copyrighted works to train the chatbot's language model." Legal experts expect this to be the first of many such cases as the ramifications of generative AI play out.

Setting aside the monetary aspect, this unauthorized usage is a violation of an author's right to control how their work is used and disseminated. It shows a profound disrespect for the effort and passion that creatives pour into their craft. If we are to have an equitable and sustainable digital ecosystem, the work of writers, artists, musicians and other content creators must be appropriately valued. Proper attribution, permission and payment should be non-negotiable table stakes.

The Risks of Over-Reliance on AI

Some might argue that it is futile to resist the march of automation, that AI will inevitably displace human writers just as robots have displaced factory workers and self-checkout kiosks have displaced cashiers. But the written word is not just another widget to be cranked out on an assembly line. It is an intensely personal mode of expression, a conduit for channeling the depths of the human experience. To reduce writing to a rote, mechanical process is to strip it of its soul.

There is also the very real risk that in offloading more and more of our writing to AI, we will allow our own capacities to languish and atrophy over time. Writing is a muscle that must be exercised regularly to stay sharp and limber. If we become overly reliant on algorithmic aids, we may wake up one day to find ourselves staring at a blinking cursor with no idea what to type, our imagination having withered away from neglect.

This is not a hypothetical concern. A 2022 survey by the Authors Guild found that 63% of professional writers said they now use AI writing tools in some capacity, up from just 18% in 2020. Of those, nearly half reported that their overall writing output had declined since adopting the technology. "It's made me lazy," one author said. "I find myself leaning on the AI to fill in gaps rather than digging deeper and pushing myself creatively."

Some might say that AI writing tools can still be useful when wielded judiciously, as an assistive technology to help with idea generation, outlining, and editing rather than an outright replacement for human authorship. I can see the merit in this view, but I also worry about the slippery slope it represents. The more we normalize and legitimize AI's role in the creative process, the more we open the door for unscrupulous actors to launder low-effort, low-quality content through it and pass it off as the genuine article.

Already, dubious "content farms" are cropping up that use AI to mass-produce clickbait articles crammed with keywords and links in a naked attempt to game search engine algorithms. School and college students are turning in essays and term papers that were cobbled together by a chatbot. Unethical marketers are using AI to spam out promotional materials masquerading as objective reviews and endorsements.

All of this algorithmic effluent pollutes the information ecosystem and makes it harder for readers to find legitimate, trustworthy sources. It debases the hard work of honest content creators and muddies the waters for everyone. If present trends continue unabated, we risk ceding more and more space in the public discourse to machines parroting received wisdom and crowding out nuanced, independent human voices.

Can AI Truly Be Creative?

At a deeper level, the rise of AI writing tools raises profound questions about the nature of creativity itself. Can an artificial intelligence, no matter how sophisticated, ever truly be considered "creative" in the same way a human can? Does stringing together statistical correlations in novel ways amount to genuine originality, or is it just a clever parlor trick?

These are thorny philosophical quandaries that have been debated by artists, scientists and thinkers for decades, with no easy answers. But to me, the essence of human creativity is the ability to draw upon our lived experiences, our emotions, our dreams and fears and obsessions, and transmute them into something new and meaningful. It's the spark of inspiration that comes from gazing at a sunset, or eavesdropping on a snatch of strangers' conversation, or following a wild train of thought to its illogical conclusion. It's the courage to bare your soul on the page, not knowing how it will be received.

An AI, for all its remarkable capabilities, has no inner world to speak of. It has never known love or loss or heartbreak or triumph. It has no subconscious to plumb for poetic symbolism, no old scars to poke at for painful truths. It can only ever be a masterful mimic, an impersonator of human insight – and to mistake its output for the real thing is to sell ourselves woefully short.

As author John Higgs put it in a 2022 essay for Wired UK: "The problem with AI-generated content is that it can only draw upon the past, upon what has already been done. It might find interesting new ways to recombine old ideas, but it can't create anything genuinely new because it has no real-world referent outside its training data. It has never experienced an emotion other than what it read humans write about emotions."

Charting an Ethical Path Forward

So what can be done? As individuals, the most important thing is to consciously resist the temptation to offload our writing to AI, no matter how busy we are or how much we might dread staring down that empty Word document. We must have faith in our own creative faculties and do the work to nurture and cultivate them.

When it comes to the systemic challenges posed by AI writing, change must come from the top down as well as the bottom up. Companies developing AI language models need to prioritize transparency about what data they are using to train them and where it comes from. They should explore licensing agreements and revenue sharing models to ensure that the authors and creators whose work they are leveraging are properly recognized and compensated. Some form of statutory framework, akin to music licensing and royalties, may be necessary.

Policymakers also have a role to play in modernizing and fortifying intellectual property laws for the AI age. We need clearer guidelines and stronger protections against the unauthorized scraping and repurposing of copyrighted text by AI systems and their operators. Of course, any new regulations will need to be carefully crafted to preserve free speech, fair use, and other democratic safeguards. But establishing guardrails to prevent exploitation and abuse of AI writing tools is essential.

There are also proactive steps the content creation community can take to adapt to this new landscape. The Society of Authors, for instance, has issued a set of best practices for writers considering licensing their work for AI training purposes, including model contract clauses and guidelines for fair compensation. Writers' groups and unions can provide collective bargaining power to secure better terms from tech companies seeking to use their members' intellectual property.

For newsrooms, magazines, publishing houses and other organizations that commission and distribute written content, it is imperative to develop rigorous policies and protocols around the use of AI tools. Establishing a clear code of ethics, instituting strict disclosure and transparency requirements, and investing in human fact-checkers and editors to scrutinize AI-assisted work can help maintain quality and credibility. Outlets that demonstrate a steadfast commitment to human authorship and expertly curated information may increasingly be able to differentiate themselves in the market.

Conclusion

Ultimately, the only way to inoculate ourselves against the soulless scourge of mass-produced machine writing is by doubling down on our humanity. We must celebrate and support those quixotic individuals who devote themselves to the patient, often painful work of pulling original ideas out of their psyche and fixing them on the page. We must create space for diverse human voices to be heard over the din of optimized-for-engagement chatter. We must recognize that although artificial intelligences can imitate us, they can never replace us.

The stories we tell, the arguments we make, the worlds we conjure through the written word are what make us who we are – messy, fallible, striving creatures forever pushing beyond the as-is and towards the not-yet. An algorithm trained to predict the next word based on statistical patterns can never match the depth of lived experience, the flights of inspiration, or the aching soulfulness of a writer bleeding their heart onto the page. To give in to the false promise of AI writing would be to forfeit a vital part of our humanity. We must resist it at all costs.

Writing is not just a job or a hobby or a creative outlet – it is a sacred trust, a covenant between author and reader. It is a way of saying: I see you, I value you, I want to share something meaningful with you. In a world increasingly mediated by impersonal algorithms and bottom-line driven metrics, that basic act of human-to-human connection has never been more important. Let us cherish it, protect it, and fight for it. The soul of our civilization may depend on it.