Site speed study: The fastest, slowest & is it important for SEO?

Sam Underwood
June 1, 2020
9 mins read

Although my day-to-day work is SEO, ever since Google introduced page speed as a ranking factor, I've become a bit obsessive about finding ways to squeeze every last bit of performance out of a site.

For an unrelated analysis I've been doing (that I may also release), I collated the top 2,200 sites in the UK by organic traffic across 22 different industries.

It sprang to mind that I could combine direct data from CrUX with additional insights from the PageSpeed Insights API for the origin of each site, and see how they correlate with each site's organic performance.

And if you don't know what CrUX is...

The Chrome User Experience Report (CrUX) gives various user experience metrics gathered from real-world Chrome users. The data it gives is freely available and queryable from a BigQuery database.
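As an illustration, pulling an origin's FCP histogram out of that public dataset only takes a short query. This sketch builds the SQL; the table month and origin used here are just placeholder examples, and actually running it would need the `google-cloud-bigquery` client and a GCP project:

```python
# Sketch: pulling first contentful paint histogram bins for one origin
# from the public CrUX BigQuery dataset. Table month and origin are
# illustrative placeholders, not the exact ones used for this study.
def build_crux_query(origin: str, table: str = "chrome-ux-report.all.202004") -> str:
    """Build a BigQuery SQL string for an origin's FCP histogram bins."""
    return f"""
        SELECT
          bin.start AS fcp_bin_start_ms,
          SUM(bin.density) AS density
        FROM `{table}`,
          UNNEST(first_contentful_paint.histogram.bin) AS bin
        WHERE origin = '{origin}'
        GROUP BY fcp_bin_start_ms
        ORDER BY fcp_bin_start_ms
    """

query = build_crux_query("https://www.gov.uk")

# To actually execute it:
# from google.cloud import bigquery
# rows = bigquery.Client().query(query).result()
```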

The only downside is that, at the time of writing, not all tools have been updated to use Google's new core web vitals metrics.

So it's currently missing some newer metrics such as CLS (cumulative layout shift) and LCP (largest contentful paint) that Google will soon use within their ranking algorithms.

But I have included first paint and DOMContentLoaded, as well as the user-centric performance metric FCP (first contentful paint), all found within older CrUX tables.

User-centric performance metrics?

User-centric performance metrics measure what users actually perceive when loading a site; improving them is what really matters, as doing so provides a noticeable benefit to users. Examples of these metrics are:

  • FCP (first contentful paint)
  • LCP (largest contentful paint)
  • FID (first input delay)
  • TTI (time to interactive)
  • TBT (total blocking time)
  • CLS (cumulative layout shift)

LCP, FID and CLS together form what Google calls core web vitals. Those who visit Google Search Console frequently will have seen these metrics in the core web vitals report found in the left sidebar.
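For reference, Google has published "good" and "poor" thresholds for these three metrics on web.dev. A tiny categoriser (purely illustrative, not part of the study itself) makes them concrete:

```python
# Sketch: categorising core web vitals values against Google's published
# thresholds. Below the first number is "good", above the second is "poor".
THRESHOLDS = {
    "lcp": (2.5, 4.0),    # largest contentful paint, seconds
    "fid": (100, 300),    # first input delay, milliseconds
    "cls": (0.1, 0.25),   # cumulative layout shift, unitless
}

def categorise(metric: str, value: float) -> str:
    """Return 'good', 'needs improvement' or 'poor' for one metric."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(categorise("lcp", 5.1))  # poor
```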

google search console core web vitals

I'll look into updating the data in this blog post with core web vitals once APIs have caught up with the changes.

So, for the rest of this post, I'll be running through some key insights from the data alongside some data viz.

Site speed and organic visibility do not correlate

One key finding was that there was no correlation between site speed and ranking performance.

When split down by industry, the correlation still wasn't there.
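For anyone wanting to run a similar check on their own data, the sort of test involved can be sketched with Spearman's rank correlation, which measures monotonic relationships and sits near zero when there's no relationship. The site numbers below are made up purely to show the mechanics:

```python
# Sketch: Spearman rank correlation between per-site FCP and organic
# visibility. With no tied values, the classic formula
# rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)) applies, where d is the
# difference between each pair's ranks.
def spearman(xs, ys):
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - (6 * d2) / (n * (n ** 2 - 1))

fcp_ms = [900, 1200, 1500, 2100, 3400]  # hypothetical median FCPs
visibility = [50, 80, 10, 95, 40]       # hypothetical visibility scores
print(round(spearman(fcp_ms, visibility), 2))  # -0.1, i.e. no real relationship
```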

Is the lack of correlation a surprise?

To be honest, the correlation data above isn't something you or I should be surprised about. It was exactly the outcome I expected.

Google has mentioned on multiple occasions that it is a small ranking factor, and I wouldn't focus on site speed for the potential ranking benefit anyway.

The fact is, highly trafficked domains didn't get popular by focusing on site speed. Although providing a great experience for users surely helped.

Still, I'm sure every hosting company out there will be professing the massive benefit their hosting has on organic performance.

Now, onto more interesting data.

What are the average speed metrics by industry?

In this data, you can see ecommerce sites actually have the lowest first paint, FCP and DOMContentLoaded timings of all sites.

I imagine this is because ecommerce sites focus more on site speed given they know it has a direct impact on their conversion rate.

As a well done to the top performers in ecommerce, here are the fastest, most highly trafficked ecommerce sites:

Another interesting insight from the above is that the average DOMContentLoaded timing tends to be much slower on news and media sites, which isn't surprising given they're riddled with ads that impact site speed.

metro ads
Ads on the Metro when you first visit the site...

Which industry has the slowest site speeds?

Rather than looking at averages to decide the slowest, I decided to use the PageSpeed Insights API, which gives a useful overall origin score categorising sites as either slow, average or fast.

This was to prevent a handful of really slow sites from skewing the overall average for each industry.
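For those curious where that score lives, the origin-level category sits in the `originLoadingExperience` field of a PageSpeed Insights v5 response. Here's a minimal sketch of extracting it; the endpoint is the real one, but the sample response is a trimmed, made-up example of the shape:

```python
# Sketch: pulling the origin-level speed category out of a PageSpeed
# Insights v5 API response. A real call would look something like:
# requests.get(PSI_ENDPOINT, params={"url": origin, "strategy": "mobile"})
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def origin_category(psi_response: dict) -> str:
    """Return 'FAST', 'AVERAGE' or 'SLOW' for the whole origin."""
    origin_exp = psi_response.get("originLoadingExperience", {})
    return origin_exp.get("overall_category", "UNKNOWN")

# Trimmed, made-up example of the response shape
sample = {"originLoadingExperience": {"overall_category": "SLOW"}}
print(origin_category(sample))  # SLOW
```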

I've used this data to categorically say that the industry you're most likely to have a slow experience with is news and media, closely followed by gambling:

What is quite interesting here is just how few mobile sites are considered to be fast.

Of the sites that are fast on mobile, ecommerce is again leading the way.

Whilst PageSpeed Insights sets the bar quite high for a site to be rated 'fast' on mobile devices, I'd have hoped for more.

As you'd expect, on desktop we start to see fewer slow sites show up and more sites with an average speed.

Just as we praise the fast sites, we must also highlight the slow ones. Here are the 10 slowest sites in news and media, weighted by how much organic traffic they receive.

The Daily Mail, one of the most highly trafficked news sites, tops the list with an impressive average FCP of 5.1 seconds and a DOMContentLoaded time of almost 12 seconds.

daily mail ads
Which again, shouldn't be a surprise to anyone who visits the site...

What I do quite like about this data is that Google, who spend a lot of time promoting the benefits of site speed, have their very own news domain on the list...

And the fastest sites?

You can also see law & government sites are the fastest sites around, followed by ecommerce, which we previously saw performed well based upon average speed metrics.

Here are the load timings of the fastest sites for law & government.

It looks like gov.uk sites must have a great tech team in place to ensure a great experience for users!

How are they getting such good speeds?

From taking a quick look, there are a few things that are causing gov.uk sites to be fast:

  • They're mostly text-based with minimal imagery
image 29
Content type analysis from Pingdom
  • All assets are cached on an edge server via the Fastly CDN. You can see in their HTTP headers that even HTML isn't being served from an origin server and all sits on Fastly. This means the site is essentially being served statically, with no database calls that could slow down the initial load.
image 31
Caching HTTP headers on HTML from Fastly
  • Because everything is edge-cached, users across the globe get a fast experience.
image 32
Interactive timings in different countries via Fast Or Slow
  • The DOM size is really small, with just 378 elements on the home page.
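If you want to run a similar check against your own site, looking for Fastly's `X-Served-By` and `X-Cache` response headers is a quick tell. This rough sketch uses header names Fastly commonly emits, with made-up sample values; a real check would fetch them with something like `requests.head("https://www.gov.uk").headers`:

```python
# Sketch: spotting signs of Fastly edge caching in a set of HTTP
# response headers. Header names are those Fastly commonly emits.
def looks_edge_cached(headers: dict) -> bool:
    """Heuristic: True if the response appears to come from a Fastly cache."""
    h = {k.lower(): v for k, v in headers.items()}
    served_by_cache = "cache" in h.get("x-served-by", "").lower()
    cache_hit = h.get("x-cache", "").upper().startswith("HIT")
    return served_by_cache or cache_hit

# Made-up example of headers a Fastly-fronted site might return
sample = {"X-Served-By": "cache-lhr7363-LHR", "X-Cache": "HIT", "Age": "51"}
print(looks_edge_cached(sample))  # True
```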

To improve timings further, gov.uk could start deferring CSS and JS that aren't required for the initial load, alongside inlining any critical CSS.

But they're already doing pretty great.

If you're looking for tips on improving the speed on your own site, make sure to check my filterable list of different ways to improve speed here.

The top 10 slowest sites for each industry

Next up I delved into which sites in each industry are the main culprits for a slower site speed.

I wanted to pick out some of the larger sites within each industry, so I took the top 100 sites by organic traffic and then limited that data to the 10 slowest sites (by 90th-percentile FCP) in each industry.
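For anyone reproducing this, the selection step itself is simple. A minimal sketch, assuming rows of (site, industry, 90th-percentile FCP) with made-up numbers, groups the sites and keeps the slowest per industry:

```python
# Sketch: selecting the N slowest sites per industry by 90th-percentile
# FCP. Row shape is assumed: (site, industry, p90_fcp_ms) tuples.
from collections import defaultdict

def slowest_per_industry(rows, n=10):
    """Group rows by industry, then keep the n highest-FCP sites of each."""
    by_industry = defaultdict(list)
    for site, industry, p90_fcp_ms in rows:
        by_industry[industry].append((site, p90_fcp_ms))
    return {
        industry: sorted(sites, key=lambda s: s[1], reverse=True)[:n]
        for industry, sites in by_industry.items()
    }

# Made-up example rows
rows = [
    ("siteA", "news", 9200), ("siteB", "news", 4100),
    ("siteC", "ecommerce", 2600), ("siteD", "ecommerce", 3900),
]
print(slowest_per_industry(rows, n=1))
```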

Below are the results:

The top 10 fastest sites for each industry

And as the reverse of the above, here are the top 10 fastest sites for each industry, again weighted by how much traffic they receive.

Newer sites tend to be faster

I utilised the Sistrix API, which has a handy call that returns the age of a domain, and found a bit of a trend: newer sites tend to perform better.

Given this dataset is based upon sites that perform well organically, it shows that newer sites that perform well organically also tend to have slightly better site speed.

Which makes it seem like people are listening to Google on the importance of speed!

One other reason I can think of as to why this may be is that older domains are more likely to have significant tech debt whilst they're stuck on older platforms.

Another is that site speed is getting easier to optimise thanks to things like plugins and CDNs that take the heavy lifting off your origin server. Newer domains that are smaller, more agile and free of tech debt can take advantage of these more easily if they plan to from the get-go.

Whilst interesting, to say concretely that newer domains tend to have better site speed, we'd certainly need a larger dataset, as this one is weighted more heavily towards older domains than fresh ones.

Coming soon, a CrUX dashboard

I'm planning to create a dashboard that utilises core web vitals to monitor different industries and trends on a monthly basis using the dataset I've run through in this blog post.

image 33
Very early screenshot of the speed monitoring dashboard I'm working on

I think given Google will soon be using core web vitals as a ranking factor, it'll be interesting to see if sites start to improve speed because of it.

Alongside that, it will be interesting to see whether we spot an increased correlation between speed and organic traffic after the update has rolled out.

I'm currently custom coding it, but I may end up just using Data Studio to speed up the process.

Final words

Hopefully you've found this post interesting; it was a fun bit of analysis that I'd definitely like to refresh in the future.

In building this dataset, I've had a few more ideas for future posts, so make sure to follow me on Twitter if you'd like to see more.
