
Mueller: Use Domain-Level Disavow, Not URL-by-URL

Google's John Mueller recommends domain-level and even TLD-level disavow for spammy backlinks, adding that those links are unlikely to cause ranking issues.

Google’s John Mueller just told an SEO to disavow entire TLDs at once — and then questioned whether it even matters.

Google’s John Mueller told a site owner on Reddit to stop disavowing spammy backlinks one URL at a time. Use domain-level entries instead, he said — or even disavow an entire top-level domain in one line.

Then he added the part that matters more: he’d “be surprised” if those links were actually causing ranking problems.

The comment on r/bigseo is the latest in a pattern of Mueller steering practitioners away from link-blame as a default diagnosis. For anyone spending hours curating disavow files or paying for toxic-link audits, it’s worth reading closely.

The Full Quote and Why the Last Sentence Matters

The original poster asked about limits on removing spammy backlinks. Mueller’s reply was brief:

“I’d prioritize and use domain level dismemberment [sic]. You can even do it by top level domain if you see they’re all from the same TLDs. I would be surprised if they’re the cause of issues though.”

John Mueller, Google Search Relations (via Reddit)

Two things are happening here. The first is a practical workflow recommendation: skip URL-by-URL disavow and work at the domain or TLD level. The second is a direct challenge to the premise of the question. Mueller is saying the spammy links probably aren’t causing the ranking issues at all. (A note on the quote: “dismemberment” is reproduced verbatim; it is most likely an autocorrect of “disavowal.”)

How Domain-Level and TLD-Level Disavow Works

Google’s documentation for the disavow tool describes a <code>domain:</code> prefix syntax. Instead of listing dozens or hundreds of individual URLs from a single spam domain, you add one line: <code>domain:example.com</code>. That entry covers every URL on the domain.

TLD-level disavow (for example, <code>domain:.xyz</code>) is not explicitly documented in Google’s help pages, but Mueller has now recommended it on multiple occasions. In March 2026, he gave similar guidance to a Bluesky user worried about 50+ spam links per week from .xyz domains.

The practical difference is significant. A site dealing with thousands of spammy links from a handful of TLDs can reduce a sprawling disavow file to a few lines. That cuts maintenance time and removes the need to constantly update the file as new spam URLs appear from the same sources.
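Applied to a typical cleanup, a disavow file that once ran to hundreds of URL lines can collapse to something like the sketch below. The domains are placeholders, and the TLD-level line follows Mueller’s informal guidance rather than Google’s documented syntax:

```text
# Disavow file: one entry per line, comments start with #
domain:spam-network-a.example
domain:spam-network-b.example
# TLD-level entry per Mueller's informal guidance (not in Google's docs)
domain:.xyz
```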

A Pattern, Not a One-Off

Mueller has been delivering the same message through multiple channels in early 2026. The Reddit comment is consistent with at least two other recent statements.

In April, Search Engine Roundtable reported that Mueller reconfirmed Google may simply ignore outbound links from sites that violate spam policies. Separately, Mueller addressed whether outbound links pass “poor signals” at all.

The March 2026 spam update completed in under 20 hours and was widely perceived as muted. That speed, combined with Mueller’s repeated messaging, points in one direction: Google’s algorithmic handling of link spam has matured to the point where most spammy links are neutralized before they ever affect rankings.

If that’s the case, the disavow tool is a safety net for edge cases, not a primary ranking lever.

What This Changes for Disavow Workflows

  • Switch existing disavow files from URL-level entries to domain-level (<code>domain:example.com</code>) entries. Where spam clusters share a common TLD, use TLD-level disavow (<code>domain:.xyz</code>). This is the specific workflow Mueller is recommending, and it reduces file maintenance from an ongoing chore to a one-time setup.
  • If you’re diagnosing a ranking drop, don’t start with the backlink profile. Mueller has now said in multiple public forums that spammy inbound links are unlikely to be the cause. Start with Search Console’s manual actions report. If there’s no manual action, the links are almost certainly not the problem.
  • For agency account managers running quarterly business reviews: Mueller’s “I would be surprised” quote is a useful reference when clients arrive with third-party toxic-link reports. The quote is specific, recent, and from Google’s own search relations team.
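For teams migrating an existing file, the URL-to-domain conversion is mechanical enough to script. A minimal sketch (a hypothetical helper, not an official Google tool; verify the output against the disavow file syntax before uploading):

```python
# Sketch: collapse URL-level disavow entries into domain-level ones.
# Hypothetical helper, not an official Google tool.
from urllib.parse import urlparse

def to_domain_entries(lines):
    """Convert URL lines to domain: entries, keep existing domain:
    entries, skip comments and blanks, and dedupe in order."""
    seen, out = set(), []
    for raw in lines:
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # comments and blank lines carry no entry
        entry = line if line.startswith("domain:") else f"domain:{urlparse(line).netloc}"
        if entry not in seen:
            seen.add(entry)
            out.append(entry)
    return out

old_file = [
    "# legacy URL-level entries",
    "https://spam.example/page-1",
    "https://spam.example/page-2?ref=abc",
    "domain:junk.example",
]
print("\n".join(to_domain_entries(old_file)))
```

The dedupe step matters: dozens of URL entries from one spam domain collapse to a single `domain:` line, which is exactly the reduction Mueller describes.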

One Gap in the Documentation

TLD-level disavow is not documented in Google’s official help pages, even though Mueller has now recommended it publicly more than once. Whether Google updates the documentation to reflect this, or whether it remains informal guidance, is worth watching.

For now, practitioners who use it are relying on Mueller’s word rather than official documentation. That’s a distinction worth noting, especially for enterprise teams that need to justify tooling decisions to stakeholders.


AI-generated first-pass scaffolding. This draft was produced by Search Engine Journal’s newsroom automation as a starting point for a writer. Rewrite before publishing.


Research notes (review and remove before publishing)

The bot collected this context while writing. Skim, verify, then delete this whole section before publish.

Headline alternatives

  1. Mueller: Use Domain-Level Disavow, Not URL-by-URL
  2. Why Spammy Backlinks Probably Aren’t Hurting You
  3. Mueller’s Disavow Advice Challenges Link-Audit Industry

Practitioner pulse

No meaningful social discussion found yet — story is only ~2 hours old and originated as a casual Saturday Reddit reply. Exa LinkedIn search was degraded.

Background

Mueller has been on a sustained campaign throughout early 2026 to recalibrate practitioner expectations around the disavow tool. In March 2026, he told a Bluesky user worried about 50+ spam links per week from .xyz TLDs that they could disavow at the TLD level and that the links were unlikely to cause harm (searchenginejournal.com). That advice built on Google’s official disavow documentation, which already supports the `domain:` prefix syntax but does not explicitly document TLD-level disavow (support.google.com). The March 2026 spam update completed in under 20 hours and was widely perceived as muted, yet SEJ analysis suggested it may signal deeper algorithmic changes to how Google handles link spam at scale (searchenginejournal.com). Separately, SERoundtable reported in April 2026 that Mueller reconfirmed Google may simply ignore links from sites that violate spam policies, further diminishing the case for aggressive disavow workflows (seroundtable.com).

Open questions for follow-up coverage

  • Is TLD-level disavow (`domain:.xyz`) officially supported in the disavow file spec, or is it an undocumented behavior Mueller has confirmed informally? Google’s help docs don’t mention it.
  • Does SpamBrain’s current iteration effectively neutralize all negative-SEO-style link attacks, or are there edge cases (e.g., YMYL niches, thin-content sites) where spammy links still correlate with ranking drops?
  • Has Google updated the disavow tool’s processing limits recently? The original Reddit thread title references a ‘removal limit’ — worth checking if there’s a cap on disavow file size or entry count.
  • Would Mueller or Google’s Search Relations team provide a more formal statement on the diminishing relevance of the disavow tool for a follow-up piece?

Image search query

“person reviewing website backlinks on laptop”

Flags

dateline=fresh · degraded research: exa.linkedin, gemini.bg, preflight (wp_401), cse (no relevant LinkedIn/X practitioner posts returned)

Fact-check flags

  • ◐ MED — “domain level dismemberment” — The draft’s blockquote correctly reproduces Mueller’s word ‘dismemberment,’ but the surrounding prose paraphrases it as ‘disavow at the domain level’ — the original quote uses the unusual word ‘dismemberment’ (likely autocorrect for ‘disavowal’), and the draft should note this verbatim oddity rather than silently interpreting it. (source: https://www.reddit.com/r/bigseo/comments/1t11ss9/spammy_links_removal_limit_for_seo/ojj7ooh/)
  • ◐ MED — “John Mueller, Google Search Relations team lead” — Mueller’s exact current title is not stated in the Reddit comment or the source article; the research brief uses this title but it originates from the brief itself — verify Mueller’s current official title (he has previously been called Search Advocate, Senior Search Analyst, etc.).
  • ◐ MED — “In March 2026, he gave similar guidance to a Bluesky user worried about 50+ spam links per week from .xyz domains” — The ’50+ spam links per week’ detail and the ‘.xyz domains’ specifics come from the research brief’s background section but the linked SEJ article was not provided in full — verify these specifics against the actual SEJ article before publishing. (source: https://www.searchenginejournal.com/google-says-disavow-links/568928/)
  • ⚠️ HIGH — “The March 2026 spam update completed in under 20 hours” — The ‘under 20 hours’ completion time is stated in the research brief and attributed to SEJ analysis, but no source article text was provided to verify this specific number — this is a hallucination risk and must be confirmed against the actual SEJ article.
  • · low — “In April, Search Engine Roundtable reported that Mueller reconfirmed Google may simply ignore outbound links from sites that violate spam policies” — The research brief dates this to 2026-04-13 and the claim aligns with the brief’s summary, but the full SERoundtable article text was not provided — confirm the article says ‘outbound links’ specifically vs. links more generally. (source: https://www.seroundtable.com/google-may-ignore-links-from-sites-that-spam-41148.html)
  • · low — “TLD-level disavow (for example, domain:.xyz) is not explicitly documented in Google’s help pages” — The research brief corroborates this claim, but the full text of Google’s disavow documentation was not provided — verify against the live help page that TLD-level syntax is indeed absent. (source: https://support.google.com/webmasters/answer/2648487?hl=en)
  • · low — “Mueller has now recommended it publicly more than once” — The draft claims multiple public recommendations of TLD-level disavow; the brief references the Bluesky instance and this Reddit comment — confirm there are at least two documented instances to support ‘more than once.’

Drafter’s writer notes


  • Mueller’s Reddit comment uses the word “dismemberment” (likely autocorrect for “disavow”). The blockquote preserves his exact wording. Writer may want to add a [sic] or note this in the article if it reads oddly.
  • TLD-level disavow (domain:.xyz) is NOT documented in Google’s official disavow help page (support.google.com/webmasters/answer/2648487). Mueller has recommended it informally on Reddit and Bluesky. Worth flagging this gap explicitly, which the article does in the conclusion.
  • Research was partially degraded: LinkedIn search, Gemini background, preflight WP check, and CSE social search all returned limited or no results. No social pulse data was available. Story is ~2 hours old at time of research.
  • Unknown sources from the brief (12amagency.com, inbound-seo.uk, seoatlantic.com, techsavycrew.com) were not used in the article.
  • Potential follow-up: Ask Google’s Search Relations team whether TLD-level disavow is officially supported or an undocumented behavior. Also worth exploring whether the disavow tool has any file-size or entry-count limits, which was the original poster’s question.

Fact-check pass: The ‘under 20 hours’ spam update completion claim is the highest-risk flag — no source text was provided to verify this specific number; Mueller’s exact job title and several secondary details also need confirmation against the actual linked articles.



Editorial self-critique pass: Rewrote dek to hook rather than summarize. Restructured intro into three paragraphs with set-the-table cadence (news → key detail → angle). Renamed generic headings (‘What Mueller Said’ → ‘The Full Quote and Why the Last Sentence Matters’; ‘What To Do Now’ → ‘What This Changes for Disavow Workflows’; ‘The Bottom Line’ → ‘One Gap in the Documentation’). Cut two takeaway bullets that failed the announcement-specificity test (generic CWV/E-E-A-T advice, generic ROI-of-link-tools advice). Tightened remaining bullets to tie directly to Mueller’s statement. Removed double-linking of Reddit source (intro + blockquote). Softened Mueller’s title attribution to ‘Google Search Relations’ to avoid unverified ‘team lead’ claim. Split a run-on paragraph in section 3 into two.


Editorial review (applied). The following findings were addressed in this draft:

  • dek at `dek` — The dek is two ideas joined by a comma — it half-summarizes the article instead of hooking; the second clause (‘questions whether those spammy links are really the problem’) gives away the lede rather than planting curiosity.
  • intro at `intro_html` — The intro’s first paragraph stacks three facts (domain-level disavow, ‘be surprised’ quote, Reddit source) in one breath — it’s an information dump rather than a set-the-table cadence of news → telling detail → angle.
  • ⚠️ heading at `outline[0].h2` — ‘What Mueller Said’ is a generic heading that doesn’t sub-divide meaningfully — the entire article is about what Mueller said; framework says headings must promise something specific.
  • paragraph_coherence at `outline[0].paragraphs[2]` — ‘The last sentence is the one that matters most’ is editorializing that tells the reader what to think rather than letting the quote land; it also restates the dek’s second clause.
  • · heading at `outline[3].h2` — ‘What To Do Now’ is a generic template heading; framework says to avoid ‘What this means’ / ‘Key takeaways’ style headings without an attached object.
  • ⚠️ takeaway at `outline[3].bullets[1]` — Framework takeaway test: ‘check on-site factors first: content quality, Core Web Vitals, E-E-A-T signals’ is generic SEO advice that existed before this announcement — it fails the ‘would this advice exist if this announcement hadn’t happened?’ test.
  • ⚠️ takeaway at `outline[3].bullets[2]` — ‘Re-evaluate the ROI of third-party toxic link audit tools’ is general advice not specific to this Mueller comment — it fails the takeaway test.
  • takeaway at `outline[3].bullets[3]` — ‘Use Mueller’s direct quote to reset client expectations’ is closer to announcement-specific but is really advice about client management that pre-exists this comment; borderline — can be tightened to reference the specific new information (TLD-level disavow, ‘be surprised’ quote).
  • · heading at `outline[4].h2` — ‘The Bottom Line’ is a generic wrap-up heading; framework prefers specificity.
  • paragraph_coherence at `outline[2].paragraphs[2]` — This paragraph strings together the March spam update completion time, Mueller’s messaging, and an inference about algorithmic maturity — three ideas in one paragraph, a classic run-on string.
  • · ai_tell at `outline[2].paragraphs[0]` — ‘consistently recalibrating practitioner expectations around link-based ranking signals’ is jargon-heavy filler phrasing that reads as an AI tell.
  • linking at `outline[0].paragraphs[1] + intro_html` — The Reddit source URL is linked in the intro AND in the blockquote attribution — framework says pick one link per claim per quote-block.
  • · ai_tell at `outline[4].paragraphs[0]` — ‘consistent and getting harder to ignore’ is a mild hype phrase; ‘increasingly difficult to justify’ restates the same idea from the dek and intro — restatement with synonyms.
Category: SEO
Roger Montti (SEJ Staff), Owner, Martinibuster.com