Do Meta Descriptions Still Matter? SEOs Debate Google’s Role in Search Snippets
A bold SEO test claims that removing meta descriptions boosts traffic, while critics dismiss such tests as performative theater.
Highlights
SEO test claims 3% traffic boost on pages with no meta descriptions.
Google rewrites most meta descriptions anyway, says test creator.
Critics argue such tests ignore real SEO priorities like content and UX.

Image Source: Designed by Martech Scholars using Canva Pro – Visualizing the SEO debate over meta description relevance.
A surprising claim from a well-known SEO expert has reignited debate around a long-held pillar of search optimization: the humble meta description.
Mark Williams-Cook, SEO veteran and founder of SEOTesting, published results from a live test suggesting that pages with no meta descriptions actually saw increased traffic compared to those with custom-written descriptions. While the increase was modest—around 3%—the implication is provocative: could writing meta descriptions actually be a waste of time?
But not everyone agrees. In the same week, another respected SEO, Jono Alderson, called SEO tests like this one “performative theater” that distracts marketers from meaningful improvements.
This clash of perspectives highlights a bigger issue in the world of SEO: what changes truly matter—and which ones just feel like progress?
The SEO Test That Sparked the Debate
In a LinkedIn post, Mark Williams-Cook broke down findings from a series of A/B tests comparing traffic on pages with and without meta descriptions.
“We have consistently found a small, usually around 3%, but statistically significant uplift to organic traffic on groups of pages with no meta descriptions vs test groups of pages with meta descriptions,” he wrote.
The test, conducted using his SEOTesting platform, reportedly involved live websites and control groups. His takeaway?
“If you’re writing meta descriptions manually, you’re wasting time. If you’re using AI to do it, you’re probably wasting a small amount of time.”
According to Williams-Cook, Google rewrites around 80% of meta descriptions anyway. In his view, letting Google generate its own query-specific snippets can actually improve click-through rates—because they’re dynamically tailored to each search intent.
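To make the statistical claim concrete, a simplified version of that kind of comparison might look like the sketch below. This is not SEOTesting's actual methodology; the click counts, the seven-day window, and the use of Welch's t-test are all illustrative assumptions.

```python
# Minimal sketch of a split test on organic clicks -- NOT SEOTesting's actual
# methodology. Assumes daily click counts per page group (e.g., exported from
# Google Search Console) over the same date range.
from scipy import stats

# Hypothetical daily organic clicks.
control_clicks = [412, 398, 441, 387, 420, 405, 433]   # pages WITH meta descriptions
variant_clicks = [428, 410, 452, 399, 431, 418, 446]   # pages WITHOUT meta descriptions

# Welch's t-test: does the variant group's mean differ from the control's?
t_stat, p_value = stats.ttest_ind(variant_clicks, control_clicks, equal_var=False)

lift = (sum(variant_clicks) / sum(control_clicks)) - 1
print(f"Observed lift: {lift:.1%}, p-value: {p_value:.3f}")

# A p-value below 0.05 is commonly read as "statistically significant", but as
# the next section argues, it cannot rule out confounders such as seasonality,
# algorithm updates, or unequal page quality between the two groups.
```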
SEO Tests = “Performative Theater”?
Coincidentally, Jono Alderson, an experienced technical SEO and strategist, released a sharp critique of SEO testing just days earlier. In his essay titled “Stop Testing. Start Shipping.”, he argues that many SEO experiments—especially those trying to isolate tiny on-page elements like meta descriptions—are built on flawed assumptions.
“SEO isn’t a closed system,” he writes. “Trying to isolate the impact of a single change in that chaos isn’t scientific. It’s theatre.”
Alderson’s argument is grounded in reality: websites don’t operate in a lab. They’re affected by seasonality, competition, UX updates, Google core algorithm changes, and even “the weather,” as he jokes. You can’t control for all variables, so trying to pinpoint the exact cause of a traffic bump (or drop) is misleading.
He also warns that obsessing over testable tweaks like meta tags distracts SEOs from more meaningful work, like improving user experience, content quality, and site structure—elements that are harder to test but far more impactful long-term.
So, Who’s Right?
Both Williams-Cook and Alderson make compelling points, and the truth may lie somewhere in between.
On the one hand, it’s reasonable to question the value of spending hours crafting meta descriptions—especially when Google often rewrites them. If that time could be better spent improving content, maybe omitting them is a smart trade-off.
On the other hand, Alderson’s criticism of SEO testing as a discipline isn’t without merit. Most A/B tests in SEO are run in open systems with numerous uncontrollable variables. Even if results are “statistically significant,” it’s difficult to establish clear causality.
For instance, was the 3% lift due to Google snippets, or simply because the test group contained better-performing pages? Were there seasonal changes or competitive shifts in play? These are hard questions to answer definitively.
What Google Actually Does With Meta Descriptions
According to Google’s own documentation, meta descriptions are not a ranking factor—but they do influence click-through rates by serving as a preview of the content.
Industry studies published around 2020 found that Google rewrites snippets well over half the time—and that share has likely increased with advancements in AI and understanding of search intent.
So, Williams-Cook’s argument that Google can generate better, more relevant snippets than a static meta tag is grounded in Google’s documented behavior. His test seems to confirm that Google might even prefer doing it itself.
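Before acting on any of this, it helps to know where your own site stands: which pages declare a description at all, and which are long enough that truncation or rewriting is likely. The sketch below is a hypothetical audit helper, assuming requests and beautifulsoup4 are installed; the URL list is a placeholder, and the 160-character threshold is a common rule of thumb rather than a documented Google limit.

```python
# Hypothetical audit: check which URLs declare a meta description and flag
# ones likely to be truncated or rewritten in the SERP.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/",
    "https://example.com/blog/some-post",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    description = tag.get("content", "").strip() if tag else ""

    if not description:
        print(f"{url}: no meta description (Google will generate its own snippet)")
    elif len(description) > 160:
        print(f"{url}: {len(description)} chars -- likely truncated or rewritten")
    else:
        print(f"{url}: {len(description)} chars")
```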
The Subjectivity of SEO
Ultimately, the decision to use meta descriptions—like most things in SEO—depends on your goals, site structure, and team resources.
- Enterprise Sites may benefit from automated meta descriptions using dynamic content or structured data (a templated sketch follows this list).
- Small Blogs or Niche Sites might benefit from optimized descriptions that highlight key value props.
- News Sites, where freshness and intent vary daily, might lean on Google to craft query-specific snippets.
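For the enterprise case above, a templated approach is one plausible way to generate descriptions at scale. The sketch below assumes each page is already represented as a dict of structured fields; the field names, the template copy, and the ~160-character trim are all hypothetical.

```python
# Minimal sketch of templated meta description generation for a large catalog.
# Field names and template wording are illustrative, not a recommended standard.
products = [
    {"name": "Trail Runner 3", "category": "running shoes", "price": "$129", "count_in_stock": 14},
    {"name": "Peak Jacket", "category": "outerwear", "price": "$199", "count_in_stock": 0},
]

TEMPLATE = "Shop the {name} in {category} from {price}. {availability}"

def build_description(item: dict) -> str:
    availability = "In stock and ready to ship." if item["count_in_stock"] > 0 else "Check availability online."
    text = TEMPLATE.format(name=item["name"], category=item["category"],
                           price=item["price"], availability=availability)
    # Trim at a word boundary if the rendered text overshoots ~160 characters.
    return text if len(text) <= 160 else text[:157].rsplit(" ", 1)[0] + "..."

for product in products:
    print(build_description(product))
```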
Williams-Cook’s approach may work best in environments where scale and efficiency matter more than micro-optimizations.
What Should Marketers Focus On Instead?
If the debate proves anything, it’s that micro-optimizations shouldn’t overshadow macro strategy. Instead of worrying about whether a meta description boosts CTR by 1–3%, SEOs might consider focusing on:
- Improving page speed and Core Web Vitals
- Developing E-E-A-T-compliant content (Experience, Expertise, Authoritativeness, Trust)
- Enhancing topical authority with internal linking
- Matching search intent with user-friendly design
That doesn’t mean testing is useless—but it should serve a bigger picture, not just check a box.
Final Thought: Intent Over Optimization
SEO isn’t a lab experiment—it’s a living ecosystem. That makes clean tests difficult and universal answers rare.
Williams-Cook may be right that Google is better at crafting snippets than most marketers. Alderson may be right that testing single elements won’t get you far in a dynamic search world.
Both perspectives reveal an important truth: intent and usefulness matter more than perfect metadata.
So before you delete every meta description on your site—or obsess over A/B tests—ask yourself:
Is this decision helping users, or just making me feel productive?
Because that answer might be more valuable than any click-through rate percentage.