Submitted URL marked 'noindex': The fastest fix checklist (GSC)
What 'Submitted URL marked noindex' means in Google Search Console, the common causes (meta robots vs X-Robots-Tag), and how to validate the fix.
Key takeaways
- "Submitted URL marked 'noindex'" means a URL you submitted (usually via sitemap) carries a noindex directive, so Google will not index it.
- The directive can live in a meta robots tag or in an X-Robots-Tag HTTP header; check both.
- Fix it by removing noindex if the page should be indexed, or by removing the URL from the sitemap if it should not.
What this status means
Google found a noindex directive on a URL you submitted (usually via sitemap).
The status reflects a contradiction: your sitemap asks Google to index the page, while the page itself tells Google not to index it.
Common causes
- A `<meta name="robots" content="noindex">` tag in the page's HTML
- An `X-Robots-Tag: noindex` HTTP response header (easy to miss: it never appears in the page source)
- Template or CMS defaults that apply noindex to entire sections of the site
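Because the directive can arrive via either channel, it helps to check both in one pass. A minimal sketch (the function names are illustrative, not part of any GSC tooling) that inspects a response's headers and HTML body for noindex:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directives from <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower() for d in a.get("content", "").split(",")]


def noindex_sources(headers, html):
    """Return which noindex mechanisms are present in a fetched page.

    headers: dict of HTTP response headers; html: response body as a string.
    """
    found = []
    # X-Robots-Tag header: set at the server/CDN level, invisible in the HTML
    xrt = headers.get("X-Robots-Tag", "") or headers.get("x-robots-tag", "")
    if "noindex" in xrt.lower():
        found.append("X-Robots-Tag header")
    # meta robots tag in the document itself
    parser = RobotsMetaParser()
    parser.feed(html)
    if "noindex" in parser.directives:
        found.append("meta robots tag")
    return found
```

Fetch the URL with any HTTP client and pass the response headers and body in; an empty result means neither mechanism is blocking indexing for that response.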
Fix checklist
- Decide if the URL should be indexed.
- If yes: remove noindex (meta and headers) and make sure the page returns 200.
- If no: remove it from the sitemap and keep internal links pointing to canonical pages.
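If the URL should stay out of the index, pruning it from the sitemap can be scripted. A minimal sketch, assuming a standard sitemaps.org `<urlset>` and a set of URLs you have already decided to drop:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def clean_sitemap(sitemap_xml, noindexed_urls):
    """Return sitemap XML with <url> entries for noindexed_urls removed."""
    ET.register_namespace("", SITEMAP_NS)  # keep the default namespace on output
    root = ET.fromstring(sitemap_xml)
    for url in list(root):  # copy: we mutate root while iterating
        loc = url.find(f"{{{SITEMAP_NS}}}loc")
        if loc is not None and loc.text.strip() in noindexed_urls:
            root.remove(url)
    return ET.tostring(root, encoding="unicode")
```

Running this as a build step keeps the sitemap and the noindex directives from drifting apart again.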
Validation
- URL Inspection: confirm "Indexing allowed".
- Re-submit sitemap after cleanup.
More reading
- GSC Indexing Statuses Explained: What They Mean and How to Fix Them (2026)
- Page with redirect (Google Search Console): What it means and how to fix it
- Redirect loop: How to find it and fix it (SEO + GSC)
- GSC redirect error: The fastest fix checklist (chains, loops, and canonical URLs)
- Submitted URL blocked by robots.txt: What it means and what to do (GSC)
- robots.txt unreachable: Why it happens and how to fix it