Digital Economy Dispatch #256 -- How Much of That Did You Write?

I’ve picked up an annoying new habit lately. Every time I read a new report, article, or news item, I find my mind drifting to the same question: how much of that did you actually write? It’s a question born of experience. Today, I simply assume AI is involved in every piece of content I encounter. In the meetings and workshops I attend, it has become second nature to pause and ask whether what I’m hearing is original work, a clever remix, or the output of a well-tuned language model. And lately, I’m not sure I can tell the difference.

Yet the more I dwell on this, the more I realise that I may well be asking the wrong question entirely.

From Authorship to Application: What Really Matters?

These days, AI is everywhere. It is shaping newsletters, policy documents, press releases, and even those “personal” updates we get from industry luminaries and corporate leaders. The genie is out of the bottle, and if you’re expecting a handwritten, human-only narrative, you’re in the wrong era.

There’s a good chance everything you see has been touched by AI, so asking whether AI was used at all is meaningless. The real challenge today isn’t figuring out whether AI wrote something, but understanding how well it has been used, and what that means for the value and credibility of what we’re reading.

Now, when I reflect on any piece of content I’m reading, my concerns shift. Was AI used thoughtfully to summarise, synthesise, and clarify? Was human judgement applied to curate sources, verify facts, and ensure the final piece offers insight or a meaningful perspective? Did the process simply automate the bland, superficial, lowest-common-denominator view, or did it add something new, unusual, or unexpected?

Provenance, Accuracy, and Context: The Three Pillars

This realisation leads me to three guiding principles that I now always keep front-of-mind when reading or reviewing content: provenance, accuracy, and context.

  • Provenance: Who stands behind the writing, and what’s their reason for creating it? What methods, sources, and tools were used, and why? Transparency of authorship now counts more than ever, whether it’s human, hybrid, or wholly machine-driven.

  • Accuracy: How has the content been validated? Are the references solid, the claims substantiated, the figures real? In the AI era, it’s easy for plausible nonsense to slip into even authoritative-looking work, so verification rises to the top of my checklist.

  • Context: How does this piece fit into the wider picture of what’s happening? Does it echo established research, contribute new value, or simply repeat the latest trend? With so much regurgitated material, judgement means going beyond the words themselves to assess the motivations, viewpoints, and environment that produced them.

Lessons for Leaders and Decision Makers

These shifts carry real implications for anyone responsible for strategy, governance, or digital transformation. In this AI era, you now bear additional responsibilities for every piece of content you consume, produce, or refer to in your work. Here are a few thoughts on how to make sure that you’re up to the task.

1. Don’t Fixate on the Tool: Assess the Output
It doesn’t matter whether a human, robot, or committee wrote what you’re reading. What matters is its clarity, relevance, and reliability. Make your judgements based on substance, not origin.

2. Demand Transparent Sourcing
Push your teams to be explicit about how content is created and where information comes from. Ask for clear distinctions between AI-generated material, human analysis, and authoritative reference.

3. Expect New Forms of Peer Review
Consider how you might build new layers of review and validation, from AI-curated bibliographies to collaborative fact-checking. Accuracy is no longer a given; it’s the result of a systematic process.

4. Context Is King, But Challenge the Perspective
When engaging with reports or strategic recommendations, force yourself and your teams to reflect: Does this fit with what we know? Is it supported by real events, verifiable data, actionable insight, and tangible expertise?

5. Upgrade Digital Literacy
Equip your organisation not only to use AI, but to read and critique it. Encourage curiosity about techniques, models, and limits. Make this more mature approach to digital literacy part of your leadership style.

So, Did You Use AI For This?

The question “how much of that did you write?” belongs to a simpler era that has gone forever. Today, the more relevant challenge is to interrogate what you’re reading for provenance, accuracy, and context, regardless of the blend of human and AI effort involved. Our responsibility is to move from blanket scepticism towards discernment, learning to read critically and constructively in an age where intelligence is synthetic, collaborative, and ambiguous by design.

So, next time you scan an email, report, newsletter, or strategic plan, pause for a moment. Not to speculate on the authorship, but to judge the substance. That’s the leadership skill we need most in the AI era.

And yes, you should assume that I used AI tools to help me with this article. Your task is to decide how well you think I used it.