
Web designers, keep those page sizes down. It’s for your own good.

You probably hate slow websites. So do we, and it’s pretty safe to say that it’s a universal rule.

There are a number of factors that can make a web page slow to load, both on the client side (the browser) and on the server side, but one really big factor is page size, and that’s what we’ll be talking about in this article. Hopefully you’ll pick up some useful information along the way.

In the early days, web pages used to be just a few kilobytes, or a few tens of kilobytes at the most. Since then that has grown to hundreds of kilobytes, and many of today’s web pages can weigh in at more than a megabyte. A couple of years ago we did a study that showed exactly that, and although that specific study was of blogs, we have seen that it applies to websites in general.

In some ways we can afford this increase in page size, because people’s Internet connections have gotten faster. Broadband Internet connections are becoming increasingly common.

For example, it wouldn’t be surprising to hear that you, dear reader, are sitting behind an excellent Internet connection.

Don’t assume everyone has fast Internet access

So, why should web designers (or webmasters) worry at all about page size? It’s not an issue anymore, is it?

Wrong.

Even if you exclude people on dial-up, there are still plenty of people on poor broadband connections. Mobile broadband is also on the rise, and depending on provider and coverage, data transfer rates can drop into the lower end of the spectrum. Basic 3G, for example, is only 384 kbit/s.

According to Akamai’s State of the Internet report, 28% of Internet connections in the United States are slower than 2 Mbit/s, and only 30% are 5 Mbit/s or faster. Almost 3% are slower than 256 kbit/s.

That’s in the United States, a country relatively far ahead on the Internet. The worldwide average is worse: 41% of Internet connections are slower than 2 Mbit/s, 22% are 5 Mbit/s or faster, and 4.6% are slower than 256 kbit/s.

[Chart: Bandwidth distribution]

So, you may be sitting on a super-fast Internet connection yourself, but many of your potential site visitors won’t be. If you don’t keep this in mind while creating your web pages, you may be turning them away without even knowing it.

Download speed vs. page size

So now that we’ve established that a lot of people are still using less-than-stellar Internet connections, how does page size affect download speed? We’ve created a couple of charts to illustrate this for you.

[Chart: How page size affects download speed]

As you can see, as page size approaches and passes 1 megabyte, things start to get ugly. Even at 500 kilobytes, a page is far from fast for everyone.

You should note that these are download speeds under ideal conditions, i.e. theoretical maxima for these bandwidths. In the real world, it will be a bit slower. For returning visitors, caching may help you out, but don’t count on it. Another killer is that your web page is split up into many objects (images, scripts, CSS files, etc.), each retrieved separately, which adds a lot of back-and-forth communication with the server and slows things down even more. That, incidentally, is why another important performance rule is to keep the number of objects to a minimum.
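
To make that overhead concrete, here is a rough back-of-the-envelope sketch in TypeScript. The function name, the 100 ms round-trip time, the sequential-request simplification, and the example numbers are all illustrative assumptions, not measurements or anything Pingdom published:

    // Rough model: total load time = transfer time + one round trip per object.
    // Real browsers parallelize and reuse connections, so treat this as an
    // illustration of why object count matters, not a simulation.
    function loadTimeSeconds(
      pageSizeKB: number,    // total page weight in kilobytes
      bandwidthKbps: number, // connection speed in kbit/s
      objectCount: number,   // images, scripts, CSS files, etc.
      rttMs = 100            // assumed round-trip time per request
    ): number {
      const transferSeconds = (pageSizeKB * 8) / bandwidthKbps; // KB -> kbit
      const requestOverheadSeconds = (objectCount * rttMs) / 1000;
      return transferSeconds + requestOverheadSeconds;
    }

    // A 500 KB page split into 40 objects on a 1 Mbit/s connection:
    console.log(loadTimeSeconds(500, 1000, 40).toFixed(1)); // "8.0" seconds
    // The same 500 KB as a single object:
    console.log(loadTimeSeconds(500, 1000, 1).toFixed(1));  // "4.1" seconds

Even in this crude model, half the load time comes from round trips rather than from the bytes themselves.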

So, what if you don’t just consider people on broadband, but also those on dial-up? Then download time degrades much more rapidly. The chart below is exactly the same as the previous one, but with a 56 kbit/s dial-up connection added. You can probably tell which line it is…

[Chart: How page size affects download speed, now with dial-up]

You may not want to accommodate today’s minority of dial-up users, but at least now you know how they’ll be affected.

Settling for under 5 seconds

To be frank, you don’t want a website with pages that take 10-15 seconds or more to download. It will frustrate your visitors a lot more than you want. Let’s settle on a more user-friendly download time: 5 seconds. (And even that is generous.)

If you want a web page to be downloaded in under 5 seconds, how big can it be? That is, how much wiggle room do you have before things start to get really slow?

In 5 seconds…

  • 56 kbit/s can download 34 kilobytes.
  • 500 kbit/s can download 305 kilobytes.
  • 1 Mbit/s can download 610 kilobytes.
  • 5 Mbit/s can download 2.98 megabytes.
  • 10 Mbit/s can download 5.96 megabytes.

And 5 seconds isn’t even considered a fast load time. Preferably a page should load within a couple of seconds at the most, which cuts the margins even more.
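
For the curious, the figures in the list above are just bandwidth multiplied by time, converted to binary kilobytes. A minimal sketch in TypeScript (the function name is hypothetical, and like the list it ignores protocol overhead):

    // Maximum page size, in binary kilobytes (KiB), that a connection can
    // download in the given number of seconds under ideal conditions.
    function maxPageSizeKB(bandwidthKbps: number, seconds: number): number {
      const bits = bandwidthKbps * 1000 * seconds; // total bits transferable
      return bits / 8 / 1024;                      // bits -> bytes -> KiB
    }

    // Reproduces the 5-second list above:
    for (const kbps of [56, 500, 1000, 5000, 10000]) {
      console.log(`${kbps} kbit/s: ${maxPageSizeKB(kbps, 5).toFixed(0)} KiB`);
    }
    // 56 kbit/s -> 34 KiB, 500 kbit/s -> 305 KiB, 1 Mbit/s -> 610 KiB, ...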

Don’t alienate users, think small

Unless you want to alienate a big portion of today’s Internet users, remember to keep track of how large your web pages become as you create them. If you’re unsure, you can always use tools like Firebug, YSlow, and even our own Full Page Test.

If it wasn’t already abundantly clear, page size only becomes a major problem on slower connections. For faster broadband connections, 5 Mbit/s and up, page size has become almost a non-issue. When you have access to that kind of bandwidth, factors other than page size have a bigger effect on load time, so it’s important to think about those as well. Both Google and Yahoo provide performance best practices that you can apply to your site. (Here are Google’s. Here are Yahoo’s.)

Just don’t forget about page size. It’s low-hanging fruit on the performance tree.



22 comments
Ben Slade

Sometimes, size isn't the main factor slowing down a website. The article in ACM.org's Queue magazine (http://queue.acm.org/detail.cfm?id=1466450) describes performance-degrading interactions between JavaScript, CSS, and web protocols. Apparently, many well-known websites suffer from these problems. The article lists 14 prioritized rules:

1. Make fewer HTTP requests.
2. Use a CDN (Content Delivery Network).
3. Add an Expires header.
4. Gzip components.
5. Put stylesheets at the top.
6. Put scripts at the bottom.
7. Avoid CSS expressions.
8. Make JavaScript and CSS external.
9. Reduce DNS lookups.
10. Minify JavaScript.
11. Avoid redirects.
12. Remove duplicate scripts.
13. Configure ETags (Entity Tags: page version info other than date modified).
14. Make Ajax cacheable.

The tool YSlow (http://developer.yahoo.com/yslow) can be used to analyze a website for these problems (requires Firebug).

Ben

Alex

Two notes: I don't think average connection speed is what we should be looking at. We should instead be looking at the number of slow users per country, those below 256 kbps or 512 kbps. The average means very little, since it's skewed by all those people with fast connections who would be almost just as well served for normal web browsing with a 1-3 Mbps connection or so.

Second, again I am amazed at how much JavaScript is on this page - 1 megabyte did you say? I don't see a single thing on this page that needs JavaScript offhand. I could see if you needed to add a small Analytics script, but why all the rest? One thing many people don't realise is that even if JavaScript is gzipped and then cached, the browser still needs to parse the script on each page load. I remember some older studies showing that the iPhone 3 took about as long to parse the JavaScript as it did to download it. I think many people wrongly think that JavaScript libraries like MooTools and jQuery are actually helping them. Wrong! They are just making your life harder, because it's easier to learn straight JavaScript -- and it can do everything those libraries do too -- than to use these slow-to-download-and-run tools.

Chandra

That worldwide average is highly misleading if it lumps developing nations with poor infrastructure in with the very technologically advanced nations of northern Europe: "not America" does not automatically mean "all alike"...

marrtins

100 Mbit (up/down) fiber-optic in Riga, Latvia - 15 EUR/month. I feel good, na na na na ;)

Patrick

"I think this shouldn’t matter. A website has to look more and more like a hollywood production." If a Hollywood production took 20 seconds to load up a single screen, I wouldn't watch it... My personal guideline is too keep the total size of a page under 300k, but of course it depends on your audience and your site. A tech site (like this one) can afford to go higher; a site for mobile users needs to be smaller.

Graphic design tamworth

I agree, Firebug is a great tool for checking loading speeds. I've found reducing the amount of JavaScript on a page can help a lot. Thanks for the article.

doug

The first thing I did when I finished reading this blog was open up YSlow, PageSpeed and Google Chrome's developer tools. Even if it's all gzipped on the fly, you should consider combining your JavaScript files into fewer files. Browsers only have a limited number of connections they can make to a site, and utilizing asset management will mean they can finish pulling down your assets (CSS, JS, images) faster. What are you using XFBML for on your site? It's the beast in terms of file size...

SeriousWorm

Hi, kind of ironic that this page takes around 7-8 seconds to load on my idle 4 Mbps connection, don't you think?

Eric

Also, putting a Facebook widget on the page will slow it down... oops.

Daniel

aw crap; wordpress ate my code:

    # compress the files
    AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript application/x-javascript
    # removes some bugs
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
    Header append Vary User-Agent

Daniel

Add the following lines to your .htaccess to enable gzip for your HTML, CSS, JS and XML files: === 88 =========== This can shrink your file sizes down by up to ~80%, especially if you make heavy use of CSS and text and don't use too many images.

Henrik

I wonder if there's also a significant environmental benefit to using less "heavy" websites. Or if it's negligible...

T

... and as always, US vs. World. ;) Btw, 100 Mbit ADSL in Germany: 50 euro/month - check the prices in the UK/Austria etc. ... much better than 50 Mbit cable which is only 15 Mbit. (Comcast, Cox, etc. suck huge balls - advertising X Mbit but delivering less than half)

lljd

It needs to be said that these download numbers are unrealistic. Browsers never download at full speed. They waste time waiting for DNS, connections, servers, and script execution. On broadband, >20% utilisation requires tricks.

Thomas

Keeping websites small is not only important for the user; search engines place more and more emphasis on speed as well. If Google has to wait 5 seconds before its bot can crawl your site, you might well lose out on some good rankings.

MaryZ

I think this shouldn't matter. A website has to look more and more like a Hollywood production.

Aaron Skom

I have Qwest DSL here in the United States ($60+/month) and they cap at 100 KB a second. I sometimes give up on the DSL and use my phone to browse the internet. I don't blame the web designers, I blame the home internet providers.

marrtins

Says the post that has almost 1 MB of JS, 155 KB of CSS, 200 KB of images and 50 KB of HTML. Check with YSlow. WARNING: this may hurt your feelings ;)

Joseph Scott

Getting more sites to support HTTP compression would also help with this.

Kalle Gustafsson

... and don't forget that these numbers are very optimistic. In reality, the theoretical max speed is seldom reached, and even when it is, protocols consume bandwidth too... In other words, you won't really be able to get 34 kilobytes of page data in 5 seconds on a 56k connection. But on the other hand, this fact only strengthens the argument behind this article!

Raymond Crandall

A lot of people resort to overusing div elements to make CSS easier. Without putting the time into making your CSS inherit and share shorthand statements, it all adds up.

Pingdom

@marrtins: Well aware of that. ;) Clearly something we need to look over on this blog. (Although the numbers you're looking at are a bit unrepresentative, especially for JS, since most of it is gzipped. You're looking at unpacked numbers.)