
Soft 404 in Google Search Console: What it means and how to fix it


Key takeaways

  • A practical guide to "Soft 404" in Google Search Console: why Google treats pages that return 200 as not found, the most common causes (empty templates, missing data, thin pages), and how to validate your fixes

A soft 404 happens when your server returns 200, but the page looks like "nothing" to Google.

It's one of the fastest ways to create crawl debt and "not indexed" noise.

Start with the status map:

  • 200 + real content: fine, indexable
  • 200 + empty or "not found" template: soft 404 (this article)
  • 404: not found; Google drops the URL over time
  • 410: gone on purpose; typically dropped faster
  • 301: moved; the destination must be a relevant replacement

Common causes

1) Missing content rendered as an empty template

Examples:

  • dynamic route has no data, but still returns a "nice" empty UI
  • a "post not found" message on a 200 page

Fix: return a real 404 (or 410 if intentionally gone).
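
For a dynamic route, that means checking the data before rendering. A minimal sketch, assuming Express (the `posts` lookup is an illustrative stand-in for your data source):

```typescript
// Minimal sketch (Express assumed): return a real 404 when the data is missing,
// instead of a 200 response wrapping a styled "post not found" template.
import express from "express";

const app = express();

// Illustrative stand-in for your CMS or database.
const posts: Record<string, string> = {
  "hello-world": "<h1>Hello world</h1><p>Real content here.</p>",
};

app.get("/posts/:slug", (req, res) => {
  const html = posts[req.params.slug];
  if (!html) {
    // Real 404; use 410 instead if the post was removed on purpose.
    res.status(404).send("<h1>404 - Post not found</h1>");
    return;
  }
  res.status(200).send(html);
});

app.listen(3000);
```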

2) Thin pages that don't satisfy any intent

Tokenized pages like {keyword} in {city} without substance are common soft-404 candidates.

Fix: either add real content/value or don't publish the page.
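
One way to enforce this at generation time is a publish gate: only emit a programmatic page when it carries something the template alone can't provide. A sketch, with an illustrative data shape and threshold:

```typescript
// Sketch: gate programmatic "{keyword} in {city}" pages on real substance.
// The PageData shape and the threshold of 3 are illustrative assumptions.
interface PageData {
  keyword: string;
  city: string;
  listings: unknown[]; // e.g. actual local results, not boilerplate
  editorialNotes?: string; // e.g. hand-written guidance for this combination
}

function shouldPublish(page: PageData): boolean {
  // Publish only when the page answers the query better than a blank template.
  return page.listings.length >= 3 || Boolean(page.editorialNotes?.trim());
}

// Pages failing the gate should not exist as URLs at all (or should 404).
const candidate: PageData = { keyword: "plumber", city: "Springfield", listings: [] };
console.log(shouldPublish(candidate)); // false -> don't publish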

3) Wrong redirects that look like "bait and switch"

Redirecting old, unrelated URLs to a generic page (for example, sending everything to the homepage) can look like a soft 404 pattern.

If you pivoted, redirect each old URL to its closest relevant replacement instead, and return 410 where no replacement exists.
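
A sketch of that mapping as Express middleware (the framework and the map entries are assumptions):

```typescript
// Sketch (Express assumed): redirect each old URL to a relevant replacement,
// and answer 410 where no replacement exists, instead of funneling everything
// to the homepage. The map entries are illustrative.
import express from "express";

const app = express();

const redirects: Record<string, string> = {
  "/old-blog/soft-404-guide": "/seo/soft-404",
  "/old-blog/robots-basics": "/seo/robots-txt",
};

app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    res.redirect(301, target); // one hop, relevant destination
  } else if (req.path.startsWith("/old-blog/")) {
    res.status(410).send("Gone"); // no relevant replacement: say so honestly
  } else {
    next();
  }
});

app.listen(3000);
```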

Fix checklist

  1. For missing content: return 404/410 with a clear response.
  2. For thin pages: add real content (examples, steps, constraints) or remove.
  3. For template pages: ensure the main content exists in the HTML (SSR/SSG), not only after JS runs (see the sketch after this list).
  4. For redirects: avoid chains and irrelevant destinations.
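
For item 3, a sketch assuming the Next.js pages router (`fetchPost` is a hypothetical fetcher): the main content lands in the initial HTML, and missing data becomes a real 404 via `notFound`.

```tsx
// Sketch (Next.js pages router assumed), e.g. pages/posts/[slug].tsx.
import type { GetStaticPaths, GetStaticProps } from "next";

interface Props {
  title: string;
  body: string;
}

// Hypothetical fetcher standing in for your CMS or database.
async function fetchPost(slug: string): Promise<Props | null> {
  const posts: Record<string, Props> = {
    "hello-world": { title: "Hello world", body: "Real content here." },
  };
  return posts[slug] ?? null;
}

export const getStaticPaths: GetStaticPaths = async () => ({
  paths: [],
  fallback: "blocking", // unknown slugs are built on demand via getStaticProps
});

export const getStaticProps: GetStaticProps<Props> = async ({ params }) => {
  const post = await fetchPost(String(params?.slug));
  if (!post) {
    return { notFound: true }; // serves a real 404, not a 200 empty template
  }
  return { props: post };
};

export default function Post({ title, body }: Props) {
  // The main content is in the server-rendered HTML, not injected only after JS.
  return (
    <article>
      <h1>{title}</h1>
      <p>{body}</p>
    </article>
  );
}
```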

How to validate

In GSC URL Inspection:

  • check the rendered HTML (is the content real?)
  • check the screenshot (does it show an empty state?)
  • check status code and redirect chain
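
Outside GSC, you can spot-check status codes and redirect chains with a small script. A sketch assuming Node 18+ (for the global `fetch`); the start URL is a placeholder:

```typescript
// Sketch: print each hop of a URL's redirect chain with its status code.
// Assumes Node 18+ (global fetch). The start URL below is a placeholder.
async function traceRedirects(url: string, maxHops = 10): Promise<void> {
  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(url, { redirect: "manual" });
    console.log(`${res.status} ${url}`);
    const location = res.headers.get("location");
    if (res.status < 300 || res.status >= 400 || !location) return;
    url = new URL(location, url).toString(); // resolve relative Location headers
  }
  console.warn(`Stopped: redirect chain longer than ${maxHops} hops`);
}

traceRedirects("https://example.com/some-old-url");
```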

If you also see "Crawled - currently not indexed" or "Discovered - currently not indexed", clean up soft 404s first: they add noise that makes those reports harder to read.

FAQ

Is a soft 404 always a bug?

Not always. Sometimes it's Google telling you the page has no value. Treat it as a signal to either improve the page or remove it.

Should I block soft 404 pages in robots.txt?

Usually no. Blocking crawling hides the page's real status from Google, which can freeze the URL in place. If you want Google to drop the URL, let it crawl and return the right status (404/410) or a noindex.
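
For the noindex path, a sketch (Express assumed): the page stays crawlable, and a header tells Google to drop it from the index.

```typescript
// Sketch (Express assumed): allow crawling but send a noindex header.
// Google has to be able to fetch the page to see this, so don't also
// block the URL in robots.txt.
import express from "express";

const app = express();

app.get("/internal-search", (_req, res) => {
  res.setHeader("X-Robots-Tag", "noindex");
  res.send("<h1>Internal search results</h1>");
});

app.listen(3000);
```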
