Google updated its documentation on how much of your web pages Googlebot will crawl. The SEO community started arguing about it. The hustlebros started preparing their “URGENT: Your website might be too big for Google” posts.
Let me save you some time.
What Google Changed
Google clarified that Googlebot will crawl the first 2MB of HTML files and the first 64MB of PDFs. There’s also a general 15MB default for Google’s other crawlers.
Google called this a documentation clarification, not a behavioral change. Whether you believe that is up to you. Either way, the practical impact is the same: this doesn’t matter for the vast majority of websites.
Why 2MB Is Massive
The 2MB limit applies to raw HTML, meaning just the markup and text. Images don't count against it, and CSS and JavaScript files are fetched separately, each with its own 2MB limit.
According to HTTPArchive’s 2025 data, the median HTML size for a web page is 22KB. At the 90th percentile, it’s 155KB. To hit Google’s 2MB limit, you’d need a page roughly 90 times larger than a typical web page.
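If you want to sanity-check that multiplier, it falls straight out of the figures above (the constants below are just the HTTPArchive numbers quoted in this post):

```python
GOOGLEBOT_LIMIT_KB = 2 * 1024   # Google's 2MB HTML crawl cap, in KB
MEDIAN_HTML_KB = 22             # HTTPArchive 2025 median HTML size
P90_HTML_KB = 155               # HTTPArchive 2025 90th-percentile size

# How many typical pages fit inside the limit
print(GOOGLEBOT_LIMIT_KB / MEDIAN_HTML_KB)  # roughly 93x the median page
print(GOOGLEBOT_LIMIT_KB / P90_HTML_KB)     # roughly 13x even a heavy page
```

Even a page at the 90th percentile would need to grow thirteenfold before the limit mattered.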
To put that in perspective, 2MB of HTML is around 2 million characters. That’s roughly the length of a 400-page novel crammed onto a single page. If your services page is longer than a romance novel, you have bigger problems than Google’s crawl limits.
Who This Actually Affects
Genuinely massive websites with dynamically generated pages that pile enormous amounts of HTML onto a single URL. Think sprawling e-commerce sites with thousands of product variations loaded onto one page, or web apps that dump entire databases into the page source.
Your business website? Your blog? Your service pages? Not even close.
How to Check If You’re Curious
Right-click any page of your website and choose “View Page Source” to see the raw HTML. For an actual byte count, open Chrome’s Developer Tools (F12), click the Network tab, reload the page, and check the size of the HTML document itself, which is usually the first entry in the list.
It’ll almost certainly be well under 100KB.
If you’re already using Screaming Frog, run a crawl and check the HTML size column in the Internal tab. You can also export to filter and sort by size if you want to find your largest pages. For most sites, even the heaviest pages won’t come close to 2MB.
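If you’d rather script the check than click through DevTools or run a crawler, a few lines of Python will do. This is a quick sketch, not a crawler: the URL in the usage line is a placeholder, and it assumes the server returns the page to a plain HTTP client.

```python
import urllib.request

# Googlebot's documented cap on raw HTML, in kilobytes.
GOOGLEBOT_HTML_LIMIT_KB = 2 * 1024

def html_size_kb(raw: bytes) -> float:
    """Size of a raw HTML payload in kilobytes."""
    return len(raw) / 1024

def check_page(url: str) -> None:
    # Fetch just the HTML document; images, CSS, and JS are fetched
    # separately by Googlebot and don't count toward this limit.
    with urllib.request.urlopen(url) as resp:
        size = html_size_kb(resp.read())
    pct = 100 * size / GOOGLEBOT_HTML_LIMIT_KB
    print(f"{url}: {size:.1f} KB of HTML ({pct:.2f}% of the 2MB limit)")
```

Call it as `check_page("https://example.com")` with your own URL. A typical page will report a fraction of a percent of the limit.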
John Mueller’s method is even simpler: search for a quote from further down a page to see if Google indexed it. If the text appears in search results, Google crawled that far. No need to measure bytes.
The Bottom Line
Every time Google updates its documentation, a wave of “experts” descends on business owners with urgent warnings about how their website is at risk. They package it into a scary-sounding audit, charge a few hundred dollars, and tell you things you didn’t need to worry about in the first place.
If someone contacts you about your HTML file size being a problem, ask them what your current HTML file size is. Not the total page size. Not the images. Just the HTML. If they can’t tell you, they’re selling fear.

