Last October, I almost lost a client because I got lazy. I was working with a Series B cybersecurity firm—let’s call them Vespera because I still want to be able to get a drink with their CTO without it being awkward—and I had to pull together a deep-dive on SOC 2 compliance for their blog. I was tired. I had three other projects hitting deadlines at the same time. So, I opened ChatGPT, fed it some basic notes, and asked it to write the technical requirements section.

It looked perfect. It sounded authoritative. It used all the right acronyms. But it also suggested a ‘SOC-3 Type Alpha’ certification as a prerequisite. If you know anything about compliance, you know that doesn’t exist. It’s a complete hallucination. I didn’t catch it. The CTO did. He didn’t just correct me; he looked at me like I was a charlatan who didn’t understand the very industry I was being paid to represent.

I felt like a total fraud. It took three months of pro bono work just to get back into his good graces. That’s the price of ‘efficiency’ when the stakes are actually high.
The part nobody talks about when it comes to ‘authority’
Everyone says AI content is ‘good enough’ for SEO. Maybe it is if you’re selling rubber spatulas or writing about how to tie a tie. But for high-stakes content strategy—the kind where a single whitepaper is supposed to justify a $500,000 enterprise software purchase—‘good enough’ is actually a massive liability. AI is essentially a high-speed mimic. It’s like a cheap suit; it looks okay from across the room, but the moment you get close, you see the frayed stitching and the plastic buttons. What I mean is—actually, let me put it differently. AI provides the average of what already exists. If your strategy is to be exactly as smart as the average of the internet, you’ve already lost. You’re just adding to the noise.
Real subject matter experts (SMEs) don’t just know facts. They know the scars. They know the three times a specific implementation failed and why the official documentation is lying to you. AI can’t do that because it hasn’t lived it. It hasn’t had a server room melt down at 3 AM on a holiday weekend.
I know people will disagree with this, but I honestly think most ‘Content Strategists’ today are just expensive librarians. They organize information, but they don’t actually understand the mechanics of the business they’re in. They’re too busy looking at SEMrush keywords to realize that the keyword they’re chasing is being searched by people who have zero budget. It’s a waste of time.
I tracked the numbers and they’re depressing

I’m a bit of a nerd about tracking performance, so I ran an experiment over the first six months of this year. I managed two different content tracks for a fintech project. Track A was 80% AI-generated with human editing (about 15 minutes of polish per piece). Track B was 100% human-led, involving hour-long interviews with actual tax attorneys. We published 14 articles in each track.
- Track A (AI-assisted): 12,400 visitors, 0.02% conversion rate to demo requests.
- Track B (Human SME): 8,200 visitors, 4.1% conversion rate to demo requests.
- The Cost: Track B cost 4x more per article.
- The Result: Track B generated $112,000 in attributed pipeline. Track A generated zero.
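To show why I say Track A generated nothing, here’s the back-of-the-envelope math on those conversion rates. This is just a quick Python sketch using the figures above; the per-demo cost at the end is my own arithmetic, not a number from any dashboard.

```python
# Back-of-the-envelope math on the two content tracks.
# All input figures come from the experiment described above.

def demos(visitors: int, conversion_rate: float) -> int:
    """Demo requests implied by traffic x conversion rate."""
    return round(visitors * conversion_rate)

track_a = demos(12_400, 0.0002)  # Track A: 0.02% conversion
track_b = demos(8_200, 0.041)    # Track B: 4.1% conversion

print(track_a)  # 2 demo requests from 12,400 visitors
print(track_b)  # 336 demo requests from 8,200 visitors

# Track B's $112,000 in attributed pipeline, spread across those demos:
print(round(112_000 / track_b))  # roughly $333 of pipeline per demo request
```

Two demo requests across fourteen AI-assisted articles isn’t a funnel. It’s a rounding error.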
Total waste of money. People can smell the lack of soul in AI writing. They might click on a catchy headline, but the moment they realize they’re reading a rehashed Wikipedia entry, they bounce. They don’t trust you. And in high-stakes B2B, trust is the only currency that actually matters. If you aren’t building trust, you’re just spending money to annoy people.
The ‘Prompt Engineering’ lie
I used to think that maybe we just weren’t prompting well enough. I spent weeks buying ‘masterclasses’ and learning how to tell the AI to ‘act as a senior systems architect.’ I was completely wrong. You can’t prompt your way into lived experience. It’s like trying to describe the taste of a peach to someone who has never eaten fruit. You can get close with adjectives, but you’ll never capture the sticky juice running down your chin. (Anyway, I digress. I spent three hours yesterday looking at the specific weight of different espresso tampers because I’m convinced my 58mm steel one is too light, which is exactly the kind of obsessive rabbit hole an AI would never understand.)
AI is a calculator for words. It can give you the sum, but it can’t tell you if the math was worth doing in the first place.
I refuse to work with anyone who uses HubSpot’s default AI templates for their strategy. I’m serious. I actively tell my friends to avoid agencies that brag about their ‘AI-driven workflow.’ It shows a fundamental lack of taste. It tells me you care more about your margins than my results. It’s lazy, and frankly, it’s insulting to the audience. I’ve bought the same $140 fountain pen three times because I like the way the nib feels on specific paper; that kind of irrational, specific preference is what makes human content interesting. AI is incapable of being irrational. It’s always ‘logical,’ which makes it incredibly boring.
Where humans actually win (and why it’s hard)
Human experts are messy. They’re hard to schedule. They use jargon that you have to ask them to explain three times. They have hot takes that might offend half your audience. But that’s exactly why they work. A real expert is like a jazz musician playing a wrong note on purpose—it creates tension and interest. AI only knows how to play the notes exactly as they’re written on the sheet music. It’s technically correct and emotionally dead.
I remember interviewing a logistics director for a piece on supply chain resilience. He told me that the most important part of his job wasn’t the software—it was knowing which port authority officials liked which brand of scotch. You will never, ever get that insight from a large language model. That’s the ‘unspoken’ layer of business. If your content strategy doesn’t tap into that, you’re just writing for robots. And robots don’t have credit cards.
It’s hard to find these people. It’s even harder to get them to sit down and talk to you. But if you aren’t doing the work to extract that unique knowledge, you aren’t doing content strategy. You’re just doing data entry.
That’s the whole trick.
I’m still not sure if I’ll ever fully trust my own workflow again after that Vespera incident. Every time I see a perfectly formatted paragraph now, I get a little twitch in my eye. I wonder if I’m becoming obsolete, or if I’m just becoming more aware of how much garbage we’ve all been willing to accept as ‘content.’ Maybe the future isn’t about being faster. Maybe it’s just about being more human, even if that means being slower and more expensive. Is that a sustainable business model? I honestly don’t know.
Stop trying to scale things that shouldn’t be scaled.
