
No More ‘num=100’: How Google’s Change Affects SERP Results and Data Extraction

Google has quietly disabled the &num=100 parameter — a small-looking change with big consequences for SERP scraping, rank tracking, and SEO reporting. This update affects how many results are returned per query and reshapes how performance data is collected. In this post, we’ll explain exactly what changed, why it matters, and how SEO professionals should adapt in a world without num=100.

What was “&num=100”? A Quick Refresher

  • Historically, adding &num=100 to a Google search URL forced Google to return up to 100 organic results on a single page instead of the default 10.
  • This capability was widely used by SEO tools, rank trackers, researchers, and agencies to collect deep SERP data in a single HTTP request. 
  • It allowed tools to scan positions 1 through 100 without paginating, reduce latency, and simplify competitor/keyword analysis workflows.

In short, &num=100 was the “bulk result” shortcut many in SEO quietly relied on.
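
For concreteness, here is a minimal URL-building sketch (no request is actually sent) showing the old bulk shortcut next to its paginated replacement. The `serp_url` helper is illustrative, not a real library function:

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def serp_url(query, num=None, start=0):
    """Build a Google SERP URL. The `num` parameter is no longer
    honored by Google, but this is how tools historically used it."""
    params = {"q": query}
    if start:
        params["start"] = start  # pagination offset (10 results per page)
    if num:
        params["num"] = num      # formerly forced up to 100 results
    return f"{BASE}?{urlencode(params)}"

# Old single-request approach (no longer honored by Google):
bulk_url = serp_url("best seo tools", num=100)

# New reality: paginate in steps of 10 (page 3 starts at offset 20).
page3_url = serp_url("best seo tools", start=20)
```

The only structural difference is the query string, which is exactly why so many tools could lean on `&num=100` so cheaply before the change.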

What Changed: Google Disables num=100

In mid-September 2025, many SEO professionals first noticed anomalies:

  • Attempts to use &num=100 frequently defaulted back to just 10 results per page.
  • Some tools returned only 2 pages (≈ 20 results) or dropped results beyond page 2.
  • Reports and experiments from SEO tool providers confirmed Google had disabled or significantly limited that parameter.
  • Google has not issued a formal, public statement explaining the change (as of writing).

Timeline Snapshot

  Date / Period       | What Happened
  ~ Sept 10, 2025     | First reports emerge of &num=100 not working fully
  Sept 12–14, 2025    | Many sites begin seeing dramatic shifts in Search Console data
  Post-mid-Sept 2025  | SEO tools publicly acknowledge the need to adapt to paginated requests

Why Google Likely Made the Change

While Google hasn’t confirmed its reasons, several theories from SEO technologists align with the observed effects:

  1. Reduce scraping and automated data collection
    Bulk scrapers used &num=100 to harvest extensive SERP data rapidly. Disabling it raises the cost and complexity of large-scale scraping.
  2. Improve data integrity & reduce bot-induced inflation
    Many impressions and average position metrics may have been inflated by bots loading deep results pages that no real user visits. Removing num=100 helps align metrics to real human behavior. 
  3. Server load, resource management, and infrastructure control
    Serving 100 results per query demands more processing and bandwidth and creates more opportunity for abuse. Limiting requests to simple pagination makes load easier to throttle, filter, and manage.
  4. Push towards official APIs and controlled access
    By making scraping harder, Google may encourage greater use of sanctioned APIs (with quotas, billing, and controlled access).
  5. Align with evolving SERP UX (infinite scroll, AI features)
    As Google transitions toward more dynamic, AI-driven results and infinite scroll paradigms, rigid 100-result pages may clash with evolving UI/UX expectations. 

Taken together, disabling &num=100 appears to be a deliberate shift toward limiting mass scraping, enhancing data fidelity, and shaping how the SEO ecosystem accesses search results.

Effects on Performance Metrics & SEO Tooling

The removal of &num=100 doesn’t necessarily mean your site lost visibility — but it does mean the way you see visibility has changed.

1. Dramatic Drop in Impressions & Keyword Counts

  • In a study covering 319 properties, 87.7% of sites lost impressions in Google Search Console shortly after the change.
  • Similarly, 77.6% of sites lost unique ranking keywords — meaning fewer recorded queries appear in reports.
  • Short- and mid-tail keywords were hit hardest; many lower-ranking queries vanished from view.

These shifts create the illusion of dropping performance — but much of the change is in what Google chooses to record now.

2. Average Position Improves (Artificially)

  • Because Google no longer counts impressions for results deep in the SERP (beyond what users actually see), average position often “improves” — the reported number moves closer to 1.
  • In fact, many SEO pros now argue that prior “impression inflation” (via bots and scraping) dragged average position downward artificially.

3. Clicks Remain Relatively Stable

  • Because the change affects how impressions are counted (not how users click), many sites see little change in click counts.
  • This supports the idea that much of the impression shift comes from invisible or bot queries.

4. Increased Cost & Complexity for SEO Tools

  • Tools that once fetched 100 results in a single call now must issue 10 separate paginated requests, increasing server load, API costs, and latency. 
  • Some tools reported missing data, gaps in ranking coverage, or errors as the change rolled out. 
  • Several platforms (e.g., Keyword Insights) acknowledged roughly 10× higher data-collection costs in adjusting to this new model.

Interpreting the Change: What Actually Happened

Many SEO professionals are rethinking past anomalies in reporting — particularly the so-called “Great Decoupling” (when impressions rose without matching clicks).

Brodie Clark argues that recent metrics distortions may stem partly from bot-generated impressions via &num=100, not purely from changes in user behavior or AI Overviews. With the removal, Google is stripping away those inflated metrics — so post-change data may better reflect real human visibility.

In other words:

  • Pre-change: Some impressions were phantom — counted because bots scraped deep results that users never saw.
  • Post-change: Only impressions that can realistically occur (on pages users might view) are recorded.
  • The result: leaner data, but arguably more trustworthy.

What SEO Teams Should Do Now

The change may seem disruptive, but it’s also an opportunity to re-focus. Here’s how to adapt intelligently:

Reset Baselines & Recalibrate Expectations

  • Treat mid-September 2025 as a turning point. Don’t compare post-change data directly with old metrics without context.
  • Expect lower impression volumes and higher average positions by default.
  • Communicate to stakeholders: “This is a reporting methodology shift, not necessarily a performance drop.”

Shift Focus to Real User Metrics

Impressions become less reliable on their own. Prioritize:

  • Clicks (Google Search Console)
  • Organic traffic & sessions (GA4 or other analytics)
  • Engagement metrics (dwell time, pages per session, bounce rate)
  • Conversions & revenue from organic sources

These carry more weight now that superficial, deep-level impressions are no longer inflating numbers.

Audit & Update Your SEO Tools

  • Check whether your rank tracker or SERP tool has adapted to avoid relying on &num=100.
  • Ask tool vendors for transparency on how they fetch deep result data now.
  • Reduce or prune tracking of extremely low-volume, low-priority keywords that were only visible due to num=100.
  • Consider integrating official APIs (e.g., Google’s Custom Search API or specialized SERP APIs) that comply with Google’s policies.
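
As a sketch of that last point, the Custom Search JSON API caps `num` at 10 and total retrievable results at 100, so covering the old num=100 depth still takes ten paginated calls. `YOUR_API_KEY` and `YOUR_CX` are placeholders you would supply from your own Google Cloud project; only the request URL is built here:

```python
from urllib.parse import urlencode

ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def custom_search_request(query, api_key, cx, page=1):
    """Build a Custom Search JSON API request URL for one page of 10.
    The API limits `num` to 10 per call and caps total retrievable
    results at 100 via the 1-based `start` offset."""
    start = (page - 1) * 10 + 1  # page 1 -> start=1, page 10 -> start=91
    params = {"key": api_key, "cx": cx, "q": query, "start": start, "num": 10}
    return f"{ENDPOINT}?{urlencode(params)}"

# Ten calls cover positions 1–100, within the API's quota and billing rules.
urls = [custom_search_request("best seo tools", "YOUR_API_KEY", "YOUR_CX", page=p)
        for p in range(1, 11)]
```

The trade-off: you pay per call and stay within quotas, but you get sanctioned, stable access instead of fragile scraping.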

Use Paginated / Incremental Scraping Strategically

  • Instead of one bulk call, paginate: fetch page 1, page 2, etc., as needed.
  • Use throttling and caching to reduce query cost.
  • Monitor only the first 20–30 results for the most meaningful impact — deep positions rarely drive traffic.
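
A minimal sketch of the strategy above — `fetch_page` is a stub standing in for whatever sanctioned SERP API or fetcher your stack actually uses, and the cache is deliberately naive:

```python
import time

RESULTS_PER_PAGE = 10
_CACHE = {}  # naive per-run cache: query -> collected results

def fetch_page(query, start):
    """Stub for a real SERP fetch (swap in your tracker's API).
    Returns fake result URLs for one page of 10."""
    return [f"https://example.com/{query}/{start + i}"
            for i in range(RESULTS_PER_PAGE)]

def collect_top_results(query, max_position=30, delay=1.0):
    """Replace one bulk num=100 call with a few paginated, throttled
    requests, stopping at the positions that actually drive traffic."""
    if query in _CACHE:
        return _CACHE[query]
    results = []
    for start in range(0, max_position, RESULTS_PER_PAGE):
        results.extend(fetch_page(query, start))
        time.sleep(delay)  # basic throttling between page requests
    _CACHE[query] = results[:max_position]
    return _CACHE[query]
```

Capping `max_position` at 20–30 keeps the request count (and your API bill) to a fifth or less of a full 100-position crawl.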

Annotate Reports & Use Clear Disclaimers

  • Add annotations in dashboards noting “num=100 parameter removed mid-Sept 2025.”
  • Avoid misleading comparisons across the change.
  • Educate clients that sudden drops in impression or keyword counts are likely reporting artifacts, not site issues.

Focus On What You Can Control

  • Optimize for top 10 / top 20 rankings (the positions users actually see)
  • Improve click-through rate (CTR) via title & meta optimizations
  • Invest in content quality, relevance, and user experience
  • Monitor SERP features (AI Overviews, featured snippets) more closely — they now may play a larger role in visibility

Example Scenario: What a Real Reporting Shift Might Look Like

Let’s say a keyword “best SEO tools” previously showed:

  • Impressions: 100,000
  • Average position: 65
  • Clicks: 2,000

After num=100 is disabled:

  • Impressions drop to 25,000
  • Average position “improves” to 15
  • Clicks stay ~2,000

This doesn’t mean you are suddenly ranking much higher — it means that now only impressions for results people could realistically see are counted. Deep-scrolling “impressions” are no longer on the books.
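
The arithmetic behind that shift is just impression-weighted averaging. The position/impression splits below are invented to reproduce the scenario’s numbers:

```python
def average_position(impressions_by_position):
    """Impression-weighted average position, as Search Console reports it."""
    total = sum(impressions_by_position.values())
    return sum(pos * imp for pos, imp in impressions_by_position.items()) / total

# Hypothetical pre-change data: bots scraping deep pages register
# 75,000 phantom impressions around position 82.
pre = {10: 10_000, 18: 15_000, 82: 75_000}   # 100,000 impressions
# Post-change: the deep impressions are simply never recorded.
post = {10: 10_000, 18: 15_000}              # 25,000 impressions

print(average_position(pre))    # 65.2
print(average_position(post))   # 14.8
```

The site never moved; dropping the phantom deep-position impressions alone pulls the average from ~65 to ~15.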

Frequently Asked Questions (FAQs)

Why did my impressions drop suddenly in Google Search Console?

Most likely because Google stopped honoring &num=100, removing deep-level, bot-driven impressions that were artificially inflating counts.

Did my rankings actually drop?

Not necessarily. Many rankings in the visible range (pages 1–2) often remain unchanged. What changed is how Google records visibility. Reports look weaker, but your real ranking presence may be intact.

How should I adjust my reporting?

Reset benchmarks after Sept 2025, annotate the change, and emphasize metrics like clicks, organic sessions, and conversion rates. Avoid interpreting impression drops as failures.

Which tools were most affected?

Rank trackers and SERP scrapers that relied on bulk &num=100 requests. Many had to change their architecture, increase cost, or temporarily lose coverage.

Can I still get 100 results programmatically?

Some SERP APIs (e.g., Google’s Fast Light API via SerpAPI) still allow programmatic access, combining multiple calls under the hood. But you can’t rely on &num=100 in a simple browser URL request anymore.

Is this change permanent?

Google has not confirmed permanence. But given how well this aligns with anti-scraping, infrastructure control, and improved data quality, most SEOs treat it as the new normal.

Final Thoughts

Google’s removal of &num=100 is more than a technical footnote. It reshapes how SEO tools fetch data, how reports appear, and how we interpret visibility. While metrics may look smaller, the upside is higher data integrity and closer alignment with real user behavior.

To thrive in this new era:

  1. Reset expectations and rebaseline around September 2025.
  2. Focus heavily on clicks, organic traffic, conversions, and real user engagement.
  3. Audit tools and adapt scraping strategies.
  4. Communicate clearly with clients and stakeholders about what’s changed.

By adapting intelligently, this is not a setback — it’s a chance to realign your SEO strategy around what truly matters: visibility to real users, not just inflated data impressions.

About the Author:
Sagar Rajput
Sagar Rajput is a seasoned Link Building Specialist with over 8 years of hands-on experience in executing scalable, white-label link acquisition strategies for SEO and digital marketing agencies. He specializes in building high-quality, niche-relevant, and authority-driven backlinks that strengthen domain authority, improve keyword rankings, and support long-term organic growth.

Throughout his career, Sagar has worked closely with agency partners across competitive industries such as SaaS, eCommerce, legal, healthcare, finance, iGaming, and local services, delivering consistent link building results under strict white-label processes. His expertise spans editorial link placements, guest posting, niche edits, digital PR links, foundational links, and competitor link gap analysis, ensuring every campaign aligns with Google’s evolving link quality guidelines.

Sagar follows a data-driven and compliance-first approach, combining manual outreach, relationship-based placements, and advanced prospect qualification to secure links that are both search-engine safe and performance-focused. He is also actively involved in AI-assisted link prospecting, anchor text optimization, and GEO-aligned link strategies, helping agencies improve visibility not just in traditional search results, but also in AI-driven discovery platforms. As a contributor at Shrushti Digital, Sagar Rajput shares practical insights on modern link building, white-label SEO execution, and sustainable off-page strategies designed to help agencies scale without compromising quality or trust.
