If you’ve spent any time in web development or digital marketing lately, you’ve probably noticed something: websites just feel bigger. More complex. Slower on a bad connection. That’s not your imagination—web pages have genuinely ballooned in size over the past decade, and it’s worth having an honest conversation about why that still matters, even in an era of faster devices and better broadband.
“Page weight” is everything a browser has to download to fully render a page: HTML, CSS, JavaScript, images, videos, fonts, third-party scripts, all of it. In 2015, the average mobile homepage weighed around 845 KB. Today that number sits above 2,300 KB. Nearly three times heavier, and still climbing. To put that in perspective, on a slow or metered connection that can mean several seconds of downloading, and a real bite out of a data plan, before a user sees anything useful.
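To make the idea concrete, here is a small sketch that tallies a page's weight by resource type. The byte counts are hypothetical, chosen only to resemble the breakdown of a typical modern page; real numbers come from your browser's network panel or an audit tool.

```python
# Hypothetical resource sizes (bytes) for one page load — illustration only.
resources = {
    "html": 35_000,
    "css": 90_000,
    "javascript": 750_000,
    "images": 1_100_000,
    "fonts": 180_000,
    "third_party": 200_000,
}

total_bytes = sum(resources.values())
print(f"Total page weight: {total_bytes / 1000:.0f} KB")

# Show each category's share, heaviest first.
for kind, size in sorted(resources.items(), key=lambda kv: -kv[1]):
    share = size / total_bytes * 100
    print(f"  {kind:<12} {size / 1000:>6.0f} KB ({share:.0f}%)")
```

Even with made-up numbers, the shape is familiar: images and JavaScript usually dominate, which is why they are the first places to look when trimming weight.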
Some of that growth makes sense. Modern websites do a lot more than they used to—high-res images, embedded video, live chat, analytics, smooth animations, personalization layers. Users expect a polished experience, and businesses want to deliver one. The problem isn’t ambition; it’s the assumption that adding more is always an improvement. At some point, those layers of functionality start working against the very experience they’re meant to improve.
The most obvious casualty is load time. Yes, internet speeds have improved significantly over the last decade. But not equally, and not everywhere. A developer with fiber internet testing their own site from a brand-new laptop is going to see a very different experience than someone loading that same page on a mid-range Android phone over a rural 3G connection, or on a prepaid plan where every megabyte costs real money. Designing only for the best-case scenario isn’t really designing for your users—it’s designing for yourself.
And even on fast connections, speed still matters more than most people assume. Studies consistently show users start abandoning pages after just a couple of seconds of waiting. If your site takes five seconds to show anything meaningful, a significant chunk of your audience is already gone—along with whatever action you were hoping they’d take. Faster pages don’t just feel better; they convert better, retain users longer, and reduce bounce rates in measurable ways.
There’s also the SEO angle, which often gets underplayed. Google’s crawlers don’t have unlimited time or bandwidth to spend on any given site. If your page is bloated and your most important content is buried deep, there’s a real chance it doesn’t get fully crawled or indexed. Beyond that, Google has been leaning harder into page experience as a ranking factor through its Core Web Vitals metrics: loading speed, responsiveness, and visual stability. A heavy page tends to underperform on all three, which can quietly drag down your visibility in search results even when your content itself is strong.
One thing that often gets lost in this conversation is structured data. Schema markup is genuinely useful—it helps search engines understand your content and can unlock rich results like star ratings, FAQs, and event details in search listings. But it’s still code, and code has weight. It’s worth stepping back and asking whether you’re adding markup because it actually serves your users and your search goals, or just because it showed up on a best-practices checklist. Not every schema type is relevant to every page, and piling it on without purpose contributes to bloat without meaningful return.
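As a rough illustration of the trade-off, here is a sketch that builds a minimal FAQ schema block as JSON-LD (the format typically embedded in a page via a script tag) and measures how many bytes it adds. The question and answer text are placeholders; the point is simply that every schema block has a measurable cost, so it should earn its place.

```python
import json

# Minimal FAQPage structured data — placeholder content for illustration.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is page weight?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "The total bytes a browser downloads to render a page.",
            },
        }
    ],
}

# Serialize as it would appear inside a <script type="application/ld+json"> tag.
markup = json.dumps(faq_schema, indent=2)
print(f"This markup adds {len(markup.encode('utf-8'))} bytes to the page")
```

A single block like this is trivially small; the bloat comes from stacking many schema types on every page by default, whether or not they apply.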
The answer isn’t to gut your site down to plain text and system fonts. Nobody wants that, and it wouldn’t serve your users either. The goal is intentionality—being deliberate about what you include and why. Does that hero image really need to be 4 MB, or would a well-compressed 400 KB version look nearly identical to 95% of your visitors? That third-party script from an analytics platform you signed up for two years ago and rarely check—is it still earning its place on every page load?
There are real, straightforward wins available to most sites without sacrificing quality or visual appeal. Image optimization is usually the biggest lever—modern formats like WebP, proper compression, and responsive sizing can cut page weight significantly with almost no visible difference to end users. Lazy loading ensures resources only get fetched when they’re actually needed, rather than all at once on initial load. And a periodic audit of your JavaScript, especially third-party tags and tracking scripts, tends to surface a surprising amount of dead weight that’s been quietly slowing things down without anyone noticing.
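One way to start that kind of audit is mechanically. The sketch below uses Python's standard-library HTML parser to flag `<img>` tags that lack `loading="lazy"` and will therefore be fetched on initial load; the sample HTML and paths are invented for the example. It's a quick heuristic, not a substitute for a real performance audit.

```python
from html.parser import HTMLParser

class ImgAudit(HTMLParser):
    """Collects <img> tags that are not marked loading="lazy"."""

    def __init__(self):
        super().__init__()
        self.eager = []  # images fetched on initial page load

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if attr_map.get("loading") != "lazy":
                self.eager.append(attr_map.get("src", "(no src)"))

# Hypothetical page fragment for illustration.
html = """
<img src="/hero.webp" loading="eager">
<img src="/team.jpg">
<img src="/footer-logo.png" loading="lazy">
"""

audit = ImgAudit()
audit.feed(html)
print("Images fetched on initial load:", audit.eager)
```

Note that some eager loads are deliberate and correct: a hero image above the fold should load immediately. The value of a script like this is surfacing the below-the-fold images nobody remembered to mark.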
None of this requires a full rebuild or months of engineering work. A lot of it is just paying attention—making performance part of the conversation when new features are being scoped, treating it as a design constraint rather than an afterthought bolted on after launch. Teams that build performance into their process early tend to ship faster sites consistently, rather than playing catch-up every time a page starts feeling sluggish.
It also helps to set a performance budget and actually stick to it. Decide upfront what your acceptable page weight is, what your target load time looks like on a mid-range device, and use those numbers as a guardrail when making decisions. It sounds simple, but having a concrete number to point to changes the conversation. Suddenly “do we really need this?” has a measurable answer.
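A budget like that is easy to encode as a check, for instance as a step in CI that fails the build when a category goes over. The sketch below is a minimal version; the budget limits and the measured sizes are hypothetical, and in practice the "actual" numbers would come from your build output or an audit tool.

```python
# Hypothetical per-category budget, in KB — tune these for your own site.
BUDGET_KB = {"javascript": 300, "images": 800, "css": 100, "fonts": 150}

def check_budget(actual_kb: dict) -> list:
    """Return a human-readable line for each category over its budget."""
    violations = []
    for category, limit in BUDGET_KB.items():
        used = actual_kb.get(category, 0)
        if used > limit:
            violations.append(f"{category}: {used} KB (budget {limit} KB)")
    return violations

# Hypothetical measured sizes for the current build.
over = check_budget({"javascript": 420, "images": 650, "css": 80, "fonts": 150})
for line in over:
    print("OVER BUDGET:", line)
```

In a CI setting, a non-empty result would fail the build, which is exactly the point: the budget stops being a suggestion and becomes a gate that every new script or image has to pass.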
Page size isn’t a niche concern for developers to debate in the background. It’s a user experience issue, an accessibility issue, and increasingly a competitive one. The sites that load fast, work well on modest hardware, and don’t penalize users with limited data plans are the ones that build real trust and loyalty over time. That’s worth keeping front of mind every time you’re deciding whether to add one more script, one more widget, one more layer to the stack. Performance isn’t the opposite of a great experience—it’s a core part of what makes one.
