The current debate about AI in content creation is missing the point. We’ve become fixated on surface-level questions: is AI-generated content technically plagiarism? Should we disclose it? (Full disclosure: I wrote this with AI.) What happens if it hallucinates? (It did, plenty.) These are safe debates: legal, qualitative. They are more about the process and the writer (me) than about the ethics. For me, the real concerns run deeper: the environmental impact of AI, the erosion of human creativity, and the way AI rewires how we think.

AI content tools aren’t inherently unethical. But the scale and speed at which they’re being adopted — without proper oversight — is where the problem lies.

Take the environmental cost. Large language models powering content platforms demand staggering amounts of electricity to train, run, and update. As MIT says (source): “The computational power required to train generative AI models [...] can demand a staggering amount of electricity, which leads to increased carbon dioxide emissions and pressures on the electric grid.” Generative AI may feel frictionless, but it’s not. The friction is simply hidden — outsourced to the planet.

And that’s just the environment.

The labour impact is harder to quantify but just as real. AI isn’t killing writers — it’s displacing their role. The middle of the process, where ideas are sharpened, arguments tested, and voice refined — that’s where AI has been positioned as a shortcut. But that messy middle is where writing lives. Writers aren’t obsolete; they’re essential. We need them now as Stewards of Tone and Truth — shaping, challenging, questioning what the machine returns.

This blog proves it. One draft claimed it had 427 words; it had 327. Another followed the SEO brief but decided the keyword was “all the birds,” then gave me incoherent links... and on and on. Like the worst apprentice... except an apprentice who, for all that, would also produce the rest well, to instruction, at speed.

And I do believe AI has a place. A powerful one. When used with intention, it becomes a kind of dialectic. I no longer argue with myself. I outsource the other side. Which means I can reflect faster, iterate sharper, and write better. Think about it — when you play chess with yourself, don’t you always end up compromising? In addition, AI makes it possible for one writer to operate at the scale of a team. It turns ambition into execution.

And we shouldn’t pretend the alternative to AI is carbon-free either. Just as AI hides its friction, so too does the traditional process: buried in drafts, meetings, waste, and hours staring at screens. And unlike AI, it doesn’t scale insight, or anything else for that matter.

Remember when all travel was one horsepower?

The opportunity isn’t to reject AI on environmental grounds — though of course, that must be monitored and managed. It’s to demand better. Smarter prompts. Cleaner models. Slower thinking.

We’re still early enough in the story to ask the right question: can we design a version of this that serves both ambition and the planet?

Because in the end, AI slop can only be countered by human regard, human intention.

But none of that matters if we don’t use it well. If we don’t think.

AI lets us accelerate our ambition, but it doesn't remove our need to inform and entertain — we are still the agents of that. It offers reach — but only if we bring rigour. So we need to use it like it matters. Because it does.

