How AI can help, or hurt, authenticity in content
Careful use of AI can certainly enhance a creator's voice, but using AI the lazy way will sacrifice authenticity.
A few weeks ago, I tried to define what makes content authentic. I identified five pillars: content must be human and honest, have character, and be original and transparent. But now comes the question: Does using AI in the content process risk authenticity? Or enhance it? The answer might be both.
AI can add efficiency in creation and breadth in the brainstorming and research phase. But it can destroy originality and transparency if not kept in check. Here’s how I see AI aligning with each pillar of authenticity.
1. Authentic content is human (AI is not)
In my last post, I discussed how authentic content comes from an initial curiosity or observation that sets everything in motion. It's born of the creator, so adding AI to this stage should be done delicately.
Where AI helps: Think of AI as your sounding board. When developing an idea, talk it through with AI, perhaps in an audio chat. I think of it as an enhancement to your inner monologue. Ask the chatbot to pose questions about your idea to develop its layers. Ask it to find supporting context. It's quicker than staring at a blank page and more interactive than traditional research.
Where AI hurts: Don’t lose your agency. AI chatbots will help you explore your idea, but they tend to tell you how wonderful it is. Having an AI yes-man might feel good. There’s a boost when someone reinforces your ideas. But don’t expect it to call you out. That means you might not know when your idea is going down a predictable road.
For now, ask the tough questions yourself. You can lead the AI by asking questions like, “Isn’t this starting to sound cliché?” In those cases, I’ve gotten helpful feedback from the machine that led me to discover a better, more novel approach to the project.
Of course, if you’re just asking AI for ideas and not making them yours, you’re outsourcing your creativity to the bots.
2. Authentic content is honest (AI can keep you honest)
Honest content is true to the idea, the intent, and, above all, its creator. It's not surface-level. Honest content has been put through its paces.
Where AI helps: Creators stay honest by questioning their content, and AI excels at that. Use it for fact-checking and identifying inconsistencies. Feed it your draft and ask: "What arguments are weak? Where am I contradicting myself?"
AI can flag unintentional bias: places where your perspective might limit the analysis. Used this way, AI can strengthen honesty instead of undermining it. The important factor is that you're directing the AI, not the other way around.
Where AI hurts: If you just ask an AI to write a generic post with no source material, then publish it under your name, that’s dishonest. There’s very little of the creator in that. That’s AI slop.
Content created by handing full creative control to AI may address a topic, but it lacks genuine perspective from the creator. You could argue the model is trained on millions of human insights and perhaps aligns with how a human might approach that topic. But good content is more than a work that plausibly reflects the ideas of some unknown cohort of humanity. It represents its creator's thinking in a way that adds something unique. And even if you argue that novel ideas are harder to find, there's authenticity in a creator presenting something that's new to them.
3. Authentic content has character (which AI can learn)
Character gives content its voice, the combination of viewpoint, style choices, and personality that makes a creator’s content recognizable.
Where AI helps: Don't assume everything you produce is perfect from the start. Few first drafts or raw cuts are. Creating a recognizable character and voice comes from an editing and refinement process that turns those drafts into polished works.
AI is pretty good at that. Training models on your style, voice, and visual identity isn't difficult if you write good prompts, and it can speed up the refinement process. You still create the qualities that give work character; the AI serves as a guardrail.
For example, I've trained a Claude Project on how I structure these Substack posts and written a style prompt covering everything that makes up my idea of how my writing should sound. Then, I record myself talking through the ideas for an article. The AI excels at taking that transcript and molding it into a first draft, starting me further along in the editing process.
Where AI hurts: Even if your style is unique, repeating it will eventually bore an audience. Outsourcing creation to an AI, even one trained on your approach, can get stale. A creator's approach changes over time, so program those changes into your tools as well.
4. Authentic content is original (AIs are trained on existing data)
Original content approaches topics in fresh ways, even if the subject isn't new.
Where AI helps: AI can quickly generate output based on a wild idea. For example, prompt your favorite LLM: “Give me a 10-episode synopsis of a podcast series about a human who marries an alien and moves across the universe to live on that alien’s planet.” You’ll get what you ask for, and some ideas might sound interesting.
If that were your project, getting that quick output might spark your own original ideas for the series. You might borrow and amend some ideas and reject most. The point is that efficient delivery of quick options fires up your thinking about the premise.
Is pitching this idea to a room of brilliant human writers, who may still suggest played-out clichés, that different? The risks of using AI for ideation are similar to the risks of using human sounding boards.
Where AI hurts: AI is trained on past data, so it's likely to regurgitate older ideas as new ones. These models aren't really “thinking.” Language and probability still determine AI output.
Use AI to develop, tease, and challenge your ideas, but be skeptical of its output. Make it yours before presenting it to an audience.
5. Authentic content has transparency (It’s a human decision)
Transparent content makes its origins, influences, and intentions clear to the audience.
Where AI helps: AI tools can track sources, identify conflicts of interest, and ensure proper credit for inspirations.
I often drop my writing into ChatGPT and ask it to run Deep Research looking for sources that support or challenge the ideas in the work. I use it to find context to support conclusions, and it may also surface opportunities to credit those with similar ideas.
Where AI hurts: The transparency challenge with AI isn't technical; it's ethical. What requires transparency? If your content is AI-generated with minimal input, be transparent. However, that content shouldn't be classified as authentic. It may serve a purpose: definitions, instructions, and other informational or transactional content can be helpful to an audience, even if generated automatically. But that's not what we are talking about here.
The question is this: Assuming you’ve met all other content authenticity pillars, what transparency is required? You don’t tell a reader that you searched Google or talked to a team of writers to flesh out an idea unless there’s something to attribute.
But it’s different with AI, because unlike other research and brainstorming methods, AI can create the work for you. Here’s how I’d suggest handling it based on the level of AI automation used:
Automated creation: Always disclose AI use. This is not authentic content.
Partially automated creation: Disclosure at your discretion. AI was used after being trained on your voice, style, and character. You likely provided source material and refined and finalized the work yourself. I'd disclose, but only if the final work falls short of what you might create without AI.
AI for research and ideation or simple copy editing: No disclosure required.
The path forward
AI isn't going away, and neither is our need for authentic content. The question isn't whether to use AI, but how to use it while maintaining the human elements that make content worth creating and consuming.
Use AI as a tool, not a replacement. Set boundaries. Challenge it to challenge you. Take what's useful, discard what's generic. Most importantly, remember that the initial human curiosity can't be outsourced.
The tools will improve, and the temptation to let them do more will grow. But authentic content will always require a human willing to think, create, and take responsibility for their ideas.
What's your experience? How are you using AI while keeping your content authentic? Let me know in the comments or shoot me a note.
Music notes
This week, I'm sharing a song by SOPHIA ISELLA, an incredibly authentic artist. Her music reminds me of Nine Inch Nails, PJ Harvey, and Mazzy Star, but only in terms of influence; in fact, she lists Trent Reznor among hers.
I must have triggered something on Instagram because one day my feed was filled with clips of her singing “Doll People,” which I’m sharing below. Her fame is rising. She even opened for Taylor Swift at one of the pop star’s London shows last summer. She just announced her first tour and recently played Bonnaroo.
Here's “Doll People.” It's haunting, piercing, and so very authentic.