
Submitted URL blocked by robots.txt: What it means and what to do (GSC)

A practical guide to "Submitted URL blocked by robots.txt": how to decide if the URL should be indexed, how to unblock safely, and how to avoid keeping bad URLs stuck in the index.

Start with the main guide: GSC Indexing Statuses Explained: What They Mean and How to Fix Them (2026). A practical map of Google Search Console indexing statuses (Coverage): what each status means, the most common root causes (canonicals, duplicates, robots, redirects, soft 404s), and the fastest way to validate fixes.



What this status means

You submitted a URL (usually in a sitemap), but robots.txt blocks crawling.

Important nuance: robots.txt blocks crawling, not indexing. A blocked URL can still end up in the index if other pages link to it, and because Google cannot fetch the page, it also cannot see a noindex directive on it.
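As a concrete illustration (the paths are hypothetical), a sitemap that lists https://example.com/guides/page/ while robots.txt contains a rule like the following will trigger this status:

```
User-agent: *
Disallow: /guides/
```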

Pick a strategy

Strategy A: you want it indexed

  • remove the disallow rule for that path
  • ensure there is no noindex
  • request indexing only for core pages
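For Strategy A, the unblock is usually a one-line robots.txt change, sketched here with hypothetical paths: either delete the Disallow rule entirely, or add a more specific Allow rule for the path you want crawled (Google resolves Allow/Disallow conflicts in favor of the most specific matching rule):

```
User-agent: *
Disallow: /guides/
Allow: /guides/important-page/
```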

Strategy B: you do NOT want it indexed

Best practice:

  • allow crawling (temporarily remove the disallow for that path)
  • add noindex so Google can see the directive
  • remove the URL from the sitemap
  • once the URL has dropped out of the index, you can restore the disallow if you want to save crawl budget
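A minimal sketch of the noindex directive for an HTML page (the markup is illustrative); for non-HTML resources such as PDFs, send the equivalent X-Robots-Tag: noindex HTTP response header instead:

```html
<!-- in the <head> of the page you want removed from the index -->
<meta name="robots" content="noindex">
```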

Validation

  • Check the robots.txt report in GSC to confirm the live rules Google sees.
  • URL Inspection: run a live test and make sure Googlebot can fetch the page (crawl allowed, page fetch successful).
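The rule changes above can also be sanity-checked locally before deploying. A minimal sketch using Python's standard-library urllib.robotparser (the rules and URLs are hypothetical); note that Python's parser applies rules in file order, while Google resolves conflicts by the most specific (longest) matching rule, so the Allow rule is listed first here to keep the two in agreement:

```python
# Sketch: check whether a robots.txt ruleset blocks Googlebot from
# fetching given URLs, using only the standard library.
# Rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /private/landing-page/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for url in [
    "https://example.com/private/report.html",
    "https://example.com/private/landing-page/",
    "https://example.com/public/page.html",
]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```

This only approximates Googlebot's matching behavior, so treat it as a pre-deploy smoke test and still confirm with the robots.txt report and a live URL Inspection in GSC.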

