Every marketing team I talk to uses AI. That's no longer interesting. What's interesting is how differently they use it.
For some, AI means "summarise this report" or "write me five headlines." For others, it means stress-testing a go-to-market strategy before committing a budget, simulating their toughest board questions, or building live intelligence from thousands of customer data points. And because AI capabilities are developing so fast, that gap compounds. The people already using it well keep finding new ways to pull ahead. The people using it as a typing assistant keep falling further behind. Most teams don't realise which group they're in.
Why this gap is so hard to spot
Two people in your standup both say "I used AI for that." One asked it to tidy up a draft; the other used it to pressure-test the entire brief before presenting it. Both look like they're keeping up. Both would tell you they use AI daily. Traditional skills gaps are visible: if someone can't use the CRM, you know; if someone doesn't understand attribution, it shows in the numbers. AI fluency is different. You can't see it unless you know what to look for.
Why AI training doesn't work for marketing teams
We tried workshops, prompt libraries, lunch-and-learns. The curious people got better. Everyone else went back to "summarise this document." The people who got better had one thing in common: they were willing to be bad at it first. They broke things on their own time, tried stupid ideas, and failed before anything clicked. You can't train curiosity. What you can do is build systems that make the gap visible and reward the people already closing it on their own.
AI for marketers beyond writing copy
Search "how to use AI in marketing" and you get the same list: write posts, generate headlines, draft emails. Useful, but that's where most people stop.
What changes how teams work
Stress-test your strategy before you spend the budget
I started feeding our campaign plans to AI and telling it to destroy them. Where's the weakest assumption? Where does this break? What would a competitor do with this? We caught problems we would have spent money discovering.
Practise your hardest meeting
Before a board presentation, I have AI play CFO. It challenges my numbers. Pokes holes in my growth assumptions. If I can't hold up against that, I'm not ready for the real room.
But the real shift happens when you stop giving AI one task and start combining layers. Feed it your attribution data alongside your campaign briefs and your localisation performance. Ask it where your best-performing creative in Germany is underperforming in France and why. It won't always be right, but it will find patterns across datasets that no one on your team has the time to cross-reference manually.
- Turn your customer feedback into a live strategy document. Feed AI every customer review, support ticket, NPS response, and social mention from the last six months. Ask it to cluster by theme, track sentiment shifts, and surface the opportunities your team is missing. Update it monthly. Most teams are sitting on thousands of data points they've never properly read.
- Connect it to your actual business. ChatGPT in a browser is fine for general questions. AI that can see your actual CRM, your campaign data, your real numbers? That's a different conversation.
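The feedback-clustering idea above doesn't have to stay abstract. Here's a minimal, stdlib-only Python sketch of the shape of output you'd ask the model for. The feedback records and keyword lists are entirely hypothetical; in a real pipeline you'd let the model do the clustering on free text rather than matching keywords:

```python
from collections import Counter, defaultdict

# Hypothetical feedback records -- in practice, export these from your
# support desk, NPS tool, and review platforms.
feedback = [
    {"source": "nps", "text": "Pricing is confusing and too high"},
    {"source": "review", "text": "Love the product, onboarding was smooth"},
    {"source": "ticket", "text": "Billing page is confusing"},
    {"source": "review", "text": "Support was slow to respond"},
]

# Crude keyword themes -- a stand-in for the clusters a model would find.
themes = {
    "pricing": ["pricing", "price", "billing", "cost"],
    "onboarding": ["onboarding", "setup", "getting started"],
    "support": ["support", "response", "slow"],
}

def tag_themes(records, theme_keywords):
    """Group each feedback record under every theme whose keywords it mentions."""
    grouped = defaultdict(list)
    for rec in records:
        text = rec["text"].lower()
        for theme, keywords in theme_keywords.items():
            if any(k in text for k in keywords):
                grouped[theme].append(rec)
    return grouped

grouped = tag_themes(feedback, themes)
counts = Counter({theme: len(recs) for theme, recs in grouped.items()})
for theme, n in counts.most_common():
    print(f"{theme}: {n} mentions")
```

Run this monthly over fresh exports and the `counts` ranking becomes the "live strategy document": the themes climbing the list are the opportunities your team is missing.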
The AI blind spot for marketing leaders
If you're leading a marketing team and you're not AI-fluent yourself, you have a problem you can't see. You can't tell the difference between someone using AI well and someone producing polished mediocrity at twice the speed. The output looks good either way. That's the whole point: when your team's output changes, so does what you need to be able to evaluate. You can't teach someone to be curious about AI, but you can build an environment where fluency surfaces naturally.
Here's what actually helps
Set a rule
Nothing gets presented that hasn't been pressure-tested by AI first: not the strategy, not the brief, not the board deck. If someone walks into a review without having run their thinking through AI as a sceptical CFO, a competitor, and a difficult customer, send them back. It takes ten minutes. There's no excuse for walking in with untested assumptions anymore.
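One way to make the rule stick is a shared template so every review uses the same three passes. A sketch in Python; the brief and the exact wording are hypothetical starting points, not a canonical prompt:

```python
# Hypothetical brief -- paste the real one in before each review.
brief = "Q3 launch plan: paid social push in DACH, 40% of budget on video."

personas = ["a sceptical CFO", "your sharpest competitor", "a difficult customer"]

# One pressure-test prompt per persona, ready to paste into any AI tool.
prompts = [
    f"Act as {persona}. Read the brief below. List the three weakest "
    f"assumptions and the exact question you would ask me in the room.\n\n"
    f"BRIEF:\n{brief}"
    for persona in personas
]

for prompt in prompts:
    print(prompt.splitlines()[0])
```

The template matters less than the consistency: if everyone runs the same three personas, "did you pressure-test this?" becomes a yes/no question in the review.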
Make the process visible, not just the output
When someone presents work that's clearly stronger, don't just say "nice deck"; ask them to show the team how they got there. What did they feed into AI? What prompts did they use? What did they change after? The fastest way to close the fluency gap is letting your best people show their working. One five-minute demo in a team meeting does more than a training programme.
Screen for it when you hire
Everyone says they use AI. Give them a live task in the interview: a real brief, 30 minutes, any AI tool they want. You'll see immediately who knows how to think with it and who just knows how to type into it.
And if you're a leader, go first. You can't assess what you don't understand.
Where to start this week
If you've mostly used AI for writing, here's what I'd open today:
- For strategy stress-testing: Claude or ChatGPT Pro. Give it your full campaign brief and say "act as my toughest board member." Both are strong here. Claude tends to be more structured in its pushback. ChatGPT is faster for quick back-and-forth.
- For cross-data analysis: ChatGPT with Advanced Data Analysis (upload your CSVs directly) or Claude with file uploads. Feed it your campaign performance data next to your customer feedback. Ask it to find what you're missing. No code needed.
- For competitive intelligence: Perplexity. It searches live sources and cites them. Ask it what your competitors launched this quarter and how the market responded. Better than a Google rabbit hole.
- For connecting AI to your actual systems: Look at Zapier AI or n8n to connect your CRM, ad platforms, and support tools to an AI layer. This is where it stops being a chatbot and starts being infrastructure.
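Before you hand the cross-data question to a model, it helps to know what a good answer looks like. A stdlib-only Python sketch of the pattern behind the Germany-vs-France example above; the creatives and CTR numbers are made up:

```python
# Hypothetical CTR (%) for the same creatives in two markets.
germany = {"spring_launch": 4.2, "summer_sale": 3.1, "brand_film": 2.0}
france = {"spring_launch": 1.8, "summer_sale": 2.9, "brand_film": 2.1}

def market_gaps(market_a, market_b, threshold=1.0):
    """Flag creatives where one market outperforms the other by more
    than `threshold` percentage points -- the 'why?' candidates."""
    return {
        creative: round(market_a[creative] - market_b[creative], 2)
        for creative in market_a
        if creative in market_b
        and market_a[creative] - market_b[creative] > threshold
    }

print(market_gaps(germany, france))
```

The model's job is the next step: taking each flagged creative and cross-referencing briefs, localisation notes, and feedback to suggest why the gap exists. The flagging itself is trivial; the cross-referencing is where the time savings are.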
These recommendations will change in three months. The point isn't the specific tools; it's building the habit of reaching for AI before you reach for a meeting.
The AI fluency gap between the team stress-testing strategy and the team writing headlines doesn't close on its own. It compounds quietly until it shows up in the only place that matters: the numbers. Same market, same budget, different speed. You now know what the gap looks like and where to find it. The only question is whether you'll go look.