Blog

Google March Core Update: Crawl Limits, Gemini Traffic, and Indexing Signals

4 min read

Analysis of Google's March 2026 core update, crawl limits per Illyes, and a spike in Gemini referral traffic, with actionable SEO checks.


Key takeaways

  • Google's March 2026 core update is rolling out.
  • Gary Illyes clarified that Googlebot's crawl limits are dynamic, not strictly fixed.
  • Gemini referral traffic has doubled for some properties; segment it in analytics and watch GSC crawl stats.


Direct answer (fast path)

Google's March 2026 core update is active. Per Gary Illyes, Googlebot's crawl limits are dynamic rather than strictly fixed, and Gemini referral traffic has doubled for some sites. Immediate checks: monitor GSC for crawl and indexing anomalies, segment Gemini traffic in analytics, and validate crawl allocation against recent site changes.

What happened

In early April 2026, Google rolled out its March core update. Gary Illyes provided new details on how Googlebot's crawl limits function, emphasizing their adaptive nature. Simultaneously, Gemini (Google's AI search interface) referral traffic doubled for some web properties. These events are confirmed in the public Search Engine Journal report and can be corroborated via GSC (Google Search Console) crawl stats, server logs (for user-agent activity), and analytics platforms for Gemini-sourced traffic.
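The server-log check mentioned above can be sketched as a small script. This is a minimal illustration, assuming Apache/Nginx combined log format; the sample lines are invented, and the regex should be adapted to your server's actual log layout.

```python
import re
from collections import Counter

# Combined-log-format line: IP, identd, user, [date:time zone], request,
# status, bytes, referrer, user agent. Fields beyond day and UA are ignored.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+):[^\]]+\] "[^"]*" \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_hits_per_day(lines):
    """Return {day: hit_count} for requests whose user agent names Googlebot."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("ua"):
            counts[m.group("day")] += 1
    return dict(counts)

# Invented sample lines for illustration only.
sample = [
    '66.249.66.1 - - [02/Apr/2026:10:00:01 +0000] "GET /faq HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [02/Apr/2026:10:00:02 +0000] "GET / HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_hits_per_day(sample))  # {'02/Apr/2026': 1}
```

Day-level counts are usually enough to spot a crawl-allocation shift; verified reverse-DNS checks on the IPs are worth adding before trusting the user-agent string alone.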

Why it matters (mechanism)

Confirmed (from source)

  • The March core update is rolling out.
  • Googlebot's crawling architecture and limits were explained by Illyes.
  • Gemini referral traffic has doubled.

Hypotheses (mark as hypothesis)

  • Hypothesis: Increased Gemini traffic is not evenly distributed; verticals with structured data or FAQ markup may see disproportionate gains. Test by segmenting traffic by URL type and schema presence.
  • Hypothesis: Crawl allocation may now respond to Gemini-specific search demand, not just traditional web search. Test by tracking crawl frequency on pages with surging Gemini referrals.
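The first hypothesis's segmentation test can be sketched as follows. This assumes an analytics export where each row carries the landing URL, referral source, and a schema-presence flag; the field names and the `gemini.google.com` referrer value are illustrative assumptions, not a confirmed analytics format.

```python
from collections import defaultdict

def gemini_share_by_schema(rows):
    """Return the Gemini-referral share per segment: pages with vs without schema."""
    totals = defaultdict(lambda: {"gemini": 0, "all": 0})
    for row in rows:
        seg = "with_schema" if row["has_schema"] else "no_schema"
        totals[seg]["all"] += 1
        if row["source"] == "gemini.google.com":  # assumed referrer label
            totals[seg]["gemini"] += 1
    return {seg: t["gemini"] / t["all"] for seg, t in totals.items() if t["all"]}

# Invented rows standing in for an analytics export.
rows = [
    {"url": "/faq", "source": "gemini.google.com", "has_schema": True},
    {"url": "/faq", "source": "google", "has_schema": True},
    {"url": "/blog", "source": "google", "has_schema": False},
]
print(gemini_share_by_schema(rows))  # {'with_schema': 0.5, 'no_schema': 0.0}
```

A persistent gap between the two segments would support the hypothesis that structured-data pages capture disproportionate Gemini gains.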

What could break (failure modes)

  • Crawl budget could be misallocated if Googlebot over-prioritizes Gemini-driven pages, starving others.
  • GSC traffic source attribution might lag, masking early Gemini surges.
  • Indexing latency may increase if crawl limits get hit during high-demand Gemini queries.

The Casinokrisa interpretation (research note)

  1. Hypothesis: Gemini-driven queries trigger fresh crawl allocation, especially for high-interaction result types (e.g., FAQs, How-to). Test by comparing crawl rates and last-crawled timestamps for top Gemini-referral pages versus baseline. Expected signal: increased crawl frequency on these URLs within 72 hours of Gemini traffic spikes.
  2. Hypothesis: Sites with suboptimal crawl efficiency (high 'Crawled, Not Indexed' rates) may see less benefit from Gemini exposure due to indexation bottlenecks. Test by cross-referencing 'Crawled, Not Indexed' URLs with Gemini referral logs. Expected signal: Gemini-exposed but non-indexed URLs underperform in organic traffic gains.

Selection layer shift: if Gemini boosts certain URLs, the threshold for visibility moves from traditional ranking to Gemini-inclusion criteria; structured data and content fit for AI summaries become gating factors.
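The 72-hour test in hypothesis 1 reduces to comparing crawl counts before and after a referral spike. A minimal sketch, assuming crawl timestamps extracted from server logs; all values below are made up for illustration.

```python
from datetime import datetime, timedelta

def crawl_delta(crawl_times, spike_time, window_hours=72):
    """Count crawls in the windows immediately before and after a Gemini traffic spike."""
    window = timedelta(hours=window_hours)
    before = sum(1 for t in crawl_times if spike_time - window <= t < spike_time)
    after = sum(1 for t in crawl_times if spike_time <= t < spike_time + window)
    return {"before": before, "after": after, "delta": after - before}

# Hypothetical crawl timestamps around an assumed spike time.
spike = datetime(2026, 4, 2, 12, 0)
crawls = [spike - timedelta(hours=h) for h in (10, 40)] + \
         [spike + timedelta(hours=h) for h in (2, 20, 50)]
print(crawl_delta(crawls, spike))  # {'before': 2, 'after': 3, 'delta': 1}
```

A consistently positive delta across top Gemini-referral URLs, absent on baseline URLs, would be the expected signal.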

Entity map (for retrieval)

  • Google
  • Googlebot
  • Gemini
  • March core update
  • Gary Illyes
  • Search Engine Journal
  • Google Search Console (GSC)
  • Crawl limits/crawl budget
  • Referral traffic
  • Structured data/schema
  • FAQs/How-to markup
  • Indexing
  • Organic traffic
  • Server logs
  • Analytics platforms

Quick expert definitions (≤160 chars)

  • Crawl budget — The maximum number of URLs Googlebot will crawl on a site in a given timeframe.
  • Gemini — Google's AI-powered search interface that can generate summaries and send referral traffic.
  • Referral traffic — Visits to a site from external sources, tracked by referrer headers or analytics tags.
  • Indexing latency — Delay between crawling a page and its appearance in Google's index.
  • Structured data — Schema markup that helps search engines interpret page content.
  • Selection layer — The decision point where URLs are chosen for visibility in a particular search interface.

Action checklist (next 7 days)

  • Segment analytics by Gemini as a source to detect referral surges.
  • Review GSC crawl stats for shifts in crawl allocation post-update.
  • Identify URLs with high Gemini referrals and check their crawl/index status.
  • Audit structured data presence on top Gemini-landing pages.
  • Track 'Crawled, Not Indexed' URLs for changes post-update.
  • Monitor server logs for Googlebot activity spikes.
  • Prepare a weekly report comparing crawl, index, and traffic changes by URL cohort.
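The weekly report in the last checklist item can be rolled up per URL cohort. This sketch assumes per-URL records already joined from logs, GSC exports, and analytics; the record fields and cohort names are illustrative.

```python
from collections import defaultdict

def weekly_cohort_report(records):
    """Aggregate crawls, indexed-page counts, and Gemini visits per URL cohort."""
    report = defaultdict(lambda: {"crawls": 0, "indexed": 0, "gemini_visits": 0, "urls": 0})
    for rec in records:
        row = report[rec["cohort"]]
        row["urls"] += 1
        row["crawls"] += rec["crawls"]
        row["indexed"] += int(rec["indexed"])
        row["gemini_visits"] += rec["gemini_visits"]
    return dict(report)

# Invented per-URL records for illustration.
records = [
    {"cohort": "faq", "crawls": 14, "indexed": True, "gemini_visits": 90},
    {"cohort": "faq", "crawls": 6, "indexed": False, "gemini_visits": 30},
    {"cohort": "blog", "crawls": 4, "indexed": True, "gemini_visits": 5},
]
report = weekly_cohort_report(records)
print(report["faq"])  # {'crawls': 20, 'indexed': 1, 'gemini_visits': 120, 'urls': 2}
```

Comparing these rollups week over week makes crawl or indexation shifts per cohort visible at a glance.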

What to measure

  • Gemini referral traffic volume and affected URL types.
  • Changes in Googlebot crawl rates by section/URL.
  • Indexing status (GSC) for Gemini-exposed pages.
  • Structured data coverage on top referral pages.
  • Correlation between 'Crawled, Not Indexed' and Gemini traffic gains.
  • Latency from crawl to index for high-traffic URLs.
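The last metric above, crawl-to-index latency, is a one-liner once you have timestamp pairs. A sketch assuming last-crawl and first-indexed timestamps per URL (from server logs and GSC's URL Inspection data, respectively); the timestamps are invented.

```python
from datetime import datetime
from statistics import median

def median_crawl_to_index_hours(pairs):
    """Median hours between a URL's crawl timestamp and its index timestamp."""
    latencies = [(indexed - crawled).total_seconds() / 3600 for crawled, indexed in pairs]
    return median(latencies)

# Hypothetical (crawled, indexed) timestamp pairs.
pairs = [
    (datetime(2026, 4, 1, 8, 0), datetime(2026, 4, 1, 20, 0)),   # 12 h
    (datetime(2026, 4, 1, 9, 0), datetime(2026, 4, 2, 9, 0)),    # 24 h
    (datetime(2026, 4, 1, 10, 0), datetime(2026, 4, 3, 10, 0)),  # 48 h
]
print(median_crawl_to_index_hours(pairs))  # 24.0
```

The median is more robust than the mean here, since a few slow-to-index URLs can skew an average badly.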

Quick table (signal → check → metric)

  • Gemini referral spike → Analytics source breakdown → % change in Gemini traffic
  • Crawl allocation shift → GSC crawl stats by directory → Avg. crawls/day/section
  • Indexing gap → GSC 'Crawled, Not Indexed' vs Gemini → % unindexed Gemini URLs
  • Structured data influence → Schema audit on top Gemini URLs → Schema coverage %
  • Crawl-to-index latency → Compare crawl timestamp to index date → Median latency (hours)
