We cancelled Ahrefs last week. The decision took about ten minutes once we walked through what the tool actually does versus what we had been paying for it to do.
This is not a complaint about Ahrefs specifically. It is a structural argument about where SEO data comes from and why that source is about to matter more than it ever has.
What the Tool Was Actually Doing
Ahrefs, like every third-party SEO platform, does not have direct access to Google's data. It aggregates signals from users who have connected their Google Search Console accounts to the platform. Keyword rankings, clicks, and impressions all come from a pooled sample of other people's sites. The tool estimates your visibility based on what it can infer from adjacent sites in adjacent industries.
That is not a secret. It is how all of them work. The problem is that most users treat the output as if it were precise. We did this too, for longer than we should admit.
The navigation is also hostile to the kind of thinking modern SEO requires. Finding which keywords a page ranks for takes several clicks. Determining whether those keywords match your brand voice requires judgment the tool cannot provide. Identifying what content to create from a gap analysis is still manual work for every single keyword in the list. Every page load depletes your subscription quota.
At €1,000 per year, the math stopped making sense.
The Signal That Matters Is the One You Own
Google Search Console tells you exactly what your site ranks for: clicks, impressions, positions, click-through rates. No intermediary. It is the primary source. The third-party tools are downstream of it.
Third-Party Tools
- Aggregate signals from other people's GSC accounts
- Sampled estimates, not your actual data
- Quota-burning navigation to reach each insight
- Downstream of the primary source
Wire + Search Console Direct
- Your actual clicks, impressions, and positions
- Direct Google API read, no intermediary
- Local snapshot, zero quota, instant access
- Acts on what Google sees, not what it infers
Wire's data pipeline reads directly from Search Console through the Google API and stores the results locally:
python -m wire.chief data
The SEO automation pipeline pulls keyword-level data for every page, detects which pages are competing against each other for the same terms, scores gaps, and queues content decisions. No subscription. No sampled estimates. No quota burning on navigation.
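The cannibalization check described above can be sketched in a few lines. This is a minimal illustration, not Wire's actual implementation; the function name and row shape are assumptions.

```python
"""Hedged sketch: detect queries where two or more pages compete,
working from rows already stored in the local GSC snapshot."""
from collections import defaultdict

def find_cannibalization(rows):
    # rows: (page, query, impressions) tuples from the local snapshot.
    by_query = defaultdict(list)
    for page, query, impressions in rows:
        by_query[query].append((page, impressions))
    # A query is cannibalized when more than one page earns impressions
    # for it; rank the competing pages by impressions, strongest first.
    return {
        q: sorted(pages, key=lambda p: -p[1])
        for q, pages in by_query.items()
        if len(pages) > 1
    }
```

The output maps each contested query to its competing pages, which is enough to queue a consolidation decision per query rather than per keyword-by-hand.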
The database knows what Google sees. Wire acts on it. The workflow sequence is designed so that every subsequent command (audit, deduplicate, refine) works from that local snapshot.
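A direct Search Console read into a local snapshot can be sketched as follows, assuming google-api-python-client with an already-authorized service object. The table name, schema, and helper names here are illustrative, not Wire's actual code.

```python
"""Hedged sketch: pull keyword-level Search Console rows via the API
and store them in a local SQLite snapshot for later commands to read."""
import sqlite3

def build_query(start_date, end_date, row_limit=25000):
    # Request body for searchanalytics().query():
    # one row per (page, query) pair in the date range.
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["page", "query"],
        "rowLimit": row_limit,
    }

def snapshot(rows, db_path=":memory:"):
    # Persist API rows locally so audit/deduplicate/refine steps
    # read the snapshot, not the API, with zero quota cost.
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS gsc
           (page TEXT, query TEXT, clicks REAL, impressions REAL, position REAL)"""
    )
    conn.executemany(
        "INSERT INTO gsc VALUES (?, ?, ?, ?, ?)",
        [
            (r["keys"][0], r["keys"][1], r["clicks"], r["impressions"], r["position"])
            for r in rows
        ],
    )
    conn.commit()
    return conn

# Real usage would look roughly like (credentials setup omitted):
#   resp = service.searchanalytics().query(
#       siteUrl="sc-domain:example.com",
#       body=build_query("2026-01-01", "2026-01-31"),
#   ).execute()
#   conn = snapshot(resp.get("rows", []), "wire.db")
```

Once the rows are local, every subsequent query is a SQLite read: instant, offline, and free.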
Conversational Search Breaks the Keyword Model
The bigger issue is not the cost. It is that the model these tools were built for is changing.
Classic SEO was built on short, explicit keyword queries: "IDP software", "invoice automation", "document processing vendor". Optimize a page for the term, earn the ranking, capture the click. Tools like Ahrefs were designed to track this. The keyword-as-signal model made sense.
That model is eroding. Queries are getting longer and more contextual: "Which document processing platform handles German invoices with SAP integration in a mid-size manufacturing company?" The search system is no longer matching keywords to pages. It is inferring intent, organizational context, user history, and location. An individual searching in Wetzlar, Hessen for a Siemens appliance is not issuing the same query as someone in Berlin looking for the same product, even if the literal words are identical.
Google's own research on search intent describes this shift toward understanding context rather than matching strings. Third-party tools measure the old signal. They track keyword density and backlink counts against a model of search that was already partially obsolete in 2023 and is now genuinely misaligned with how queries resolve. The SEO market has not fully absorbed this yet, which is why tools that charge four figures annually are still selling confidently.
The Black Box Problem Nobody Wants to Name
There is a second problem, less discussed and more serious.
The numbers already make clear what direction this is heading. Ahrefs re-ran their AI Overview study in December 2025 and found that the presence of an AI Overview now correlates with a 58% lower average clickthrough rate for the top-ranking page. Their framing is direct: "For every 100 clicks you could historically earn for a top-ranking page, Google now 'keeps' 58." SparkToro's parallel research found that 60% of all Google searches already ended without a click in 2024, before AI Mode accelerated the trend further.
When a user submits a conversational query to an AI-powered search surface (Google's AI Overview, a Perplexity result, a Bing Copilot answer), the clicks and impressions that occur inside that conversation are invisible to any external tool. Only the platform itself knows what happened.
Search Console impressions now carry more insight than any external tool can provide.
- A growing share of searches end without a click.
- Third-party tools cannot see those missing signals.
- Own your data and refine your content around it.
This creates a structural blindspot. Your content may be influencing AI-generated answers at scale, driving awareness and intent, without a single click registering in your Search Console data. Or it may not be cited at all. You cannot tell from the outside.
The implication is that content quality and citation-worthiness are now more important than keyword density. An AI summarizing a topic will pull from pages that make factual, well-structured, clearly sourced claims. Not from pages optimized for a keyword that no longer functions as the unit of search.
Wire's quality gates are built around this. The fail-loud build system refuses pages that are too thin, too similar to existing pages, or too poorly structured to survive scrutiny. Not because those pages would fail a keyword audit. Because they would fail a model evaluating whether to cite them.
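A fail-loud quality gate of the kind described above can be sketched like this. The thresholds, the Jaccard similarity helper, and the function names are assumptions for illustration, not Wire's actual values.

```python
"""Hedged sketch: refuse pages that are too thin or too similar to
existing pages, raising instead of silently publishing."""

MIN_WORDS = 300          # assumed thinness threshold
MAX_SIMILARITY = 0.80    # assumed near-duplicate threshold

def jaccard(a: str, b: str) -> float:
    # Token-set overlap: a cheap proxy for "too similar to an existing page".
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def gate(draft: str, existing_pages: list[str]) -> None:
    # Fail loud: raise rather than let a weak page into the build.
    if len(draft.split()) < MIN_WORDS:
        raise ValueError(f"thin content: under {MIN_WORDS} words")
    for page in existing_pages:
        if jaccard(draft, page) > MAX_SIMILARITY:
            raise ValueError("near-duplicate of an existing page")
```

The point of raising, rather than logging a warning, is that a page which fails the gate never reaches the build output at all.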
What We Built Instead
The tool we replaced Ahrefs with is called Chief. It is a command-line interface that runs on your existing AI subscription, using minimal tokens per operation.
Chief is a rules-based content pipeline. It does not estimate. It does not infer from third-party aggregates. It reads your actual search data, applies documented thresholds, and produces decisions that are reproducible and auditable. A full audit takes under two minutes per site.
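A rules-based decision of the kind described above might look like the following. The thresholds and action names are assumptions for illustration, not Chief's documented values; the point is that the same row always produces the same decision.

```python
"""Hedged sketch: map each (position, impressions, clicks) row from the
local snapshot to a reproducible, auditable action."""

def audit(position: float, impressions: int, clicks: int) -> str:
    # Fixed thresholds, no estimates: same inputs, same decision, every run.
    if impressions < 50:
        return "ignore"          # too little demand to act on
    if position <= 5 and clicks == 0:
        return "fix-snippet"     # ranks well but earns no clicks
    if position <= 5:
        return "keep"            # ranking and converting: leave it alone
    if position <= 20:
        return "refine"          # striking distance: improve the page
    return "new-content"         # demand exists, page is far off: fill the gap
```

Because every threshold is explicit, an audit run can be replayed against the same snapshot and checked line by line, which is what makes the output auditable rather than a black-box score.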
The news intelligence pipeline monitors what is being written in your industry, integrates relevant developments into existing pages, and queues content that fills documented gaps. A branded, fact-checked article can go from queue to published in under fifteen minutes.
That is not a sales claim. It is the workflow we ran on our own sites before we offered it to clients.
The Year the Tools Fell Behind
2026 is the year it becomes obvious that the SEO tool category is structurally misaligned with the search environment it claims to measure.
The tools were built for a world where search was deterministic, keyword-driven, and fully logged through click data. That world is not gone. It still governs the majority of organic traffic today. But the margin is narrowing every quarter, and the tools have no credible roadmap for the conversational layer they cannot see.
Cancelling a subscription is a small decision. Recognizing what it represents is not.
The pipeline that replaced it does not track what the old world cared about. It builds for the one that is arriving.
What practitioners are saying
These are not Wire's words. These are direct quotes from SEO practitioners on Reddit, discussing the same problems we built Wire to solve.
I find having bad data is worse than having no data. Bad data can lead to the wrong conclusions, and those conclusions are confident. Meaning you think they're right, and that's worse than knowing that you might be wrong.
u/mafost-matt on r/SEO, 216 upvotes. The post "Bye Semrush. After 8 years, cutting the cord." drew 291 comments from SEOs abandoning their subscriptions.
Paying hundreds a month for data that's off by 40% is wild. At that price point you'd expect at least consistent enough numbers to make actual decisions.
u/Jsaldleaf, replying to the same thread.
They went public. It's no longer about being useful, it's about being profitable.
u/IgorAMG, about Semrush's trajectory after IPO.
My competitor said: "You're not my competitor, your website has only 150 organic visitors according to Semrush and Ahrefs" but my website has 10K organic visitors. Don't trust them.
u/togi1202 on r/SEO. Third-party estimates versus reality.
We used to make audits for prospective clients using semrush and ahrefs but the data flow is so bad now and we've had multiple prospects say this in the meetings as well that the numbers are wildly inaccurate. Stopped using both and now we ask for GA GSC instead right off the bat.
u/noxnox12 on r/SEO. Agencies dropping third-party tools for first-party data.
The one and only thing you should take from this is that your pricing is stupid high.
u/CaptianTumbleweed, replying directly to Ahrefs CEO Tim Soulo in his biannual r/bigseo feedback thread. 8 upvotes.
The shift is not coming. The shift already happened. The practitioners who manage real sites moved to first-party data. Wire reads your Google Search Console data and acts on it. No intermediary. No estimates. No subscription that profits from showing you problems without fixing them.