Blocked due to other 4xx: The real causes and the fix checklist (GSC)

Key takeaways

  • A practical guide to 'Blocked due to other 4xx' in Google Search Console: what codes it usually hides (410/429/451/401/403), how to choose the right strategy, and how to validate fixes

What "Blocked due to other 4xx" usually means

Googlebot received a 4xx response that is not a plain 404.

GSC groups many things into this bucket, including:

  • 410 Gone (intentional removal)
  • 401 Unauthorized (auth required)
  • 403 Forbidden (access blocked)
  • 429 Too Many Requests (rate limiting)
  • 451 Unavailable For Legal Reasons
  • custom 4xx responses from middleware/WAF/CDN

The fix depends on the intent of the URL.
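
You can often recover the exact code with a direct request. Here is a minimal sketch using Python's requests library; the URL is a placeholder, and a request from your own machine may be treated differently than real Googlebot traffic (a WAF can key on IP), so server logs remain the ground truth:

  import requests

  def probe(url: str) -> None:
      # Fetch without following redirects so the raw status is visible.
      resp = requests.get(url, allow_redirects=False, timeout=10)
      print(url, "->", resp.status_code, resp.reason)
      # Retry-After matters for 429; Location would reveal a redirect.
      for header in ("Retry-After", "Location"):
          if header in resp.headers:
              print(" ", header + ":", resp.headers[header])

  probe("https://example.com/some-removed-page")  # hypothetical URL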

Step 0: classify the URL (keep / move / remove)

This classification decides whether the 4xx is a problem.

  • Keep: should be 200 and indexable
  • Move: should 301 to a true successor
  • Remove: should be 410 (or remain 404) and be removed from sitemap
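
As a rough sketch, this triage is just a tiny decision function. The inputs (should the URL be indexable, does it have a true successor) are editorial judgments made per URL; nothing below is inferred automatically:

  # Toy triage mirroring the keep / move / remove decision above.
  def classify(should_be_indexable: bool, successor: str | None) -> str:
      if should_be_indexable:
          return "keep: serve 200, keep in the sitemap"
      if successor:
          return f"move: 301 to {successor}"
      return "remove: serve 410 (or 404), drop from sitemap and internal links"

  print(classify(True, None))
  print(classify(False, "https://example.com/new-page"))  # hypothetical successor
  print(classify(False, None))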

Common causes (and what they really mean)

410 Gone (intentional)

A 410 is usually fine for intentionally removed legacy URLs, but those URLs should not stay in your sitemap.

If you see 410 in the report and the URL is not important:

  • remove from sitemap
  • remove internal links
  • keep the 410 consistent over time (see the sketch below)
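
One way to keep the 410 consistent at the application layer, sketched with Flask; the path list is hypothetical, and a server or CDN rule works just as well:

  # Minimal Flask sketch: serve a consistent 410 for removed URLs.
  from flask import Flask, abort, request

  app = Flask(__name__)

  # Hypothetical list; in practice this often lives in server/CDN config.
  GONE_PATHS = {"/old-campaign", "/discontinued-product"}

  @app.before_request
  def gone_for_removed_urls():
      if request.path in GONE_PATHS:
          abort(410)  # 410 Gone: deliberate, permanent removal

  @app.route("/")
  def home():
      return "OK"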

401/403 (access control)

These are usually infrastructure issues, not content issues. Typical sources:

  • WAF/CDN rules
  • geo restrictions
  • auth gates
  • bot protection

If the page should be indexed, 401/403 is a real problem.

If the page should not be indexed, remove it from sitemap and keep access consistent.
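
To make the 403 case concrete, here is a toy anti-example of the kind of rule that causes it. Bot protection that matches on user-agent substrings cannot tell real Googlebot from scrapers, so it blocks both:

  # Anti-example: naive user-agent "bot protection" that 403s Googlebot
  # along with scrapers, producing exactly this GSC bucket.
  from flask import Flask, abort, request

  app = Flask(__name__)

  @app.before_request
  def naive_bot_block():
      ua = request.headers.get("User-Agent", "").lower()
      if "bot" in ua or "crawler" in ua:
          abort(403)  # "Googlebot" contains "bot" -> blocked too

  @app.route("/")
  def page():
      return "content"

The fix is to verify crawlers by IP (see "How to confirm it's really Googlebot" below), not by user-agent substring.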

429 (rate limiting)

If Googlebot receives 429 repeatedly, it backs off and crawling of the site slows down.

Typical causes:

  • aggressive rate limits
  • bot protection rules
  • low resource limits under load

If you intentionally block certain content (rate limits, legal restrictions such as 451), keep the response consistent and do not include those URLs in sitemaps.
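
If you do rate-limit, a Retry-After header is the standard way to signal when to come back; crawlers are not guaranteed to honor it, but well-behaved ones back off. A toy in-memory limiter for illustration only (the limits are made up, and in production this logic usually lives in the CDN or WAF, with verified crawlers exempted or given higher limits):

  # Toy fixed-window rate limiter that returns 429 with Retry-After.
  import time
  from collections import defaultdict
  from flask import Flask, request

  app = Flask(__name__)
  WINDOW_SECONDS = 60
  MAX_REQUESTS = 100        # hypothetical limit
  hits = defaultdict(list)  # client IP -> recent request timestamps

  @app.before_request
  def rate_limit():
      now = time.time()
      recent = [t for t in hits[request.remote_addr] if now - t < WINDOW_SECONDS]
      recent.append(now)
      hits[request.remote_addr] = recent
      if len(recent) > MAX_REQUESTS:
          retry_after = int(WINDOW_SECONDS - (now - recent[0])) + 1
          return "Too Many Requests", 429, {"Retry-After": str(retry_after)}

  @app.route("/")
  def page():
      return "content"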

Fix checklist

  1. Find the exact 4xx code: logs are best (a parsing sketch follows this list), and URL Inspection often helps.
  2. Decide intent: should it be indexable?
  3. If indexable:
    • make it return 200 (for users and Googlebot alike)
    • remove WAF/geo/auth blocks
    • make the canonical URL stable
  4. If not indexable:
    • remove from sitemap
    • remove internal links
    • keep the chosen status consistent (often 410 for intentional removals)
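
A minimal log-parsing sketch for step 1. It assumes the common "combined" access log format; adjust the regex for your server:

  # Extract exact non-404 4xx codes served to a Googlebot user agent.
  # UA matching is fine for triage, never for access control.
  import re
  import sys
  from collections import Counter

  LINE = re.compile(
      r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
      r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
  )

  counts = Counter()
  with open(sys.argv[1]) as log:
      for line in log:
          m = LINE.search(line)
          if not m:
              continue
          status = int(m.group("status"))
          if "Googlebot" in m.group("ua") and 400 <= status < 500 and status != 404:
              counts[(status, m.group("path"))] += 1

  for (status, path), n in counts.most_common(20):
      print(f"{status}  x{n}  {path}")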

Validation

  • URL Inspection: confirm Googlebot can fetch (or confirm the intentional status).
  • Check that the sitemap contains only canonical URLs that return 200 OK (a checker sketch follows).
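
A minimal checker sketch; the sitemap URL is a placeholder, and sitemap index files would need recursion that this toy version skips:

  # Verify every URL in a sitemap returns 200 without redirecting.
  import requests
  import xml.etree.ElementTree as ET

  NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
  SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical

  root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
  for loc in root.findall(".//sm:loc", NS):
      url = loc.text.strip()
      status = requests.head(url, allow_redirects=False, timeout=10).status_code
      if status != 200:
          print(status, url)  # anything printed should be fixed or dropped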

FAQ

Is 410 better than 404?

410 is a clearer signal for intentional removals. But the bigger win is not the code; it is removing dead URLs from sitemap and internal links.

Should I redirect everything to the homepage?

No. Blanket homepage redirects are often treated as soft 404s. Use 301 only for true successor pages.

Quick decision table (by status code)

  • 410: usually fine for intentional removals; remove from sitemap
  • 401: if it should rank, remove auth gate; if not, remove from sitemap
  • 403: if it should rank, fix WAF/geo blocks; if not, keep consistent
  • 429: relax rate limits for verified Googlebot; reduce redirect chains and duplicate URL patterns that multiply crawl requests
  • 451: keep consistent and do not submit these URLs

What NOT to do

  • do not 301 unrelated URLs to the homepage (often becomes soft-404)
  • do not keep blocked URLs in the sitemap
  • do not mix signals (internal links to URLs you block)

How to confirm it's really Googlebot

If you operate a WAF, do not rely on user agent strings alone.

Use one of:

  • your provider's verified bot feature
  • reverse DNS verification for Googlebot

This prevents fake bots that merely spoof the Googlebot user agent from getting whitelisted.
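
A minimal sketch of the reverse-plus-forward DNS check that Google documents (the sample IP sits in a known Googlebot range, but ranges can change):

  # Verify a client IP really belongs to Googlebot.
  import socket

  def is_googlebot(ip: str) -> bool:
      try:
          host = socket.gethostbyaddr(ip)[0]  # reverse (PTR) lookup
      except OSError:
          return False
      if not host.endswith((".googlebot.com", ".google.com")):
          return False
      try:
          # Forward-confirm: the PTR hostname must resolve back to the
          # same IP, otherwise the PTR record could be spoofed.
          return ip in socket.gethostbyname_ex(host)[2]
      except OSError:
          return False

  print(is_googlebot("66.249.66.1"))

Google also publishes its crawler IP ranges as JSON lists, which many WAF providers can consume directly.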
