How Site Speed Affects SEO

An article by Darrell Wilkins on 23 Mar 2021

Summary: Site speed plays a critical role in SEO. Relevance remains the top ranking factor, but how your site performs also affects where you appear in search engine results.

The relationship between website performance and SEO is straightforward: the better your site performs, the better for SEO. To understand why, you need to know how search engines work and what we mean by performance.

What is web performance?

Web performance is about making websites fast.

Objective speed is how long things take. Perceived speed is how long users perceive things as taking. Both influence SEO performance.

Objective speed

There are all sorts of measurements we can take, including:

  • How fast your server responds.
  • The time it takes to download all the assets.
  • How long it takes your device to paint all those assets on the screen.
  • JavaScript execution time.

As we'll see, all of these have an impact on SEO performance, and milliseconds matter. The critical thing to remember is that the longer things take, the more resources they consume.
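To make that concrete, here is a quick sketch with made-up timings. The numbers below are purely illustrative, not measurements from any real site:

```python
# Hypothetical phase timings for a single page load, in milliseconds.
# These figures are illustrative only, not real measurements.
phases = {
    "server_response": 180,   # time until the server starts replying
    "asset_download": 420,    # CSS, JavaScript, images
    "render": 250,            # painting those assets on the screen
    "js_execution": 300,      # running scripts
}

total_ms = sum(phases.values())
print(f"Total load time: {total_ms} ms")

# Every phase a user waits through also costs the server and the
# browser resources, so shaving milliseconds off any phase helps.
slowest = max(phases, key=phases.get)
print(f"Slowest phase: {slowest}")
```

Breaking the total down like this is useful because each phase usually has a different fix: server response is a hosting and caching problem, downloads are an asset-size problem, and execution time is a JavaScript problem.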

Perceived speed

There has been a lot of research into the psychology of human perception. We can use this insight to improve the experience your users have on your website.

A classic example is the progress bar. Imagine you are uploading a document. If you show a progress bar while the upload happens, users will perceive it as taking less time than if you left the screen blank.

Perceived speed impacts conversions, user experience and brand perception. Indirectly, it affects SEO performance too, and that is down to how Google measures it.

Core Web Vitals

A big part of Google's measurement of page experience is called Core Web Vitals. It comprises three components:

Loading performance

Google measures loading performance by the time taken to reach the Largest Contentful Paint (LCP).

Largest Contentful Paint marks the point where your page's primary content has loaded. The faster this happens, the better, as it helps reassure users the page is useful.


Interactivity

First Input Delay (FID) is a measure of how responsive your site is when users try to interact with it. It's the delay between a user clicking a link, moving a slider or swiping an image, and the browser starting to respond. If that delay is longer than 100ms, it should be improved.
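Google publishes thresholds for FID: up to 100ms counts as good, and anything over 300ms is poor. As a small sketch (the thresholds are Google's published values; the function name is ours):

```python
def rate_fid(delay_ms: float) -> str:
    """Classify a First Input Delay value against Google's
    published thresholds: good <= 100 ms, poor > 300 ms."""
    if delay_ms <= 100:
        return "good"
    if delay_ms <= 300:
        return "needs improvement"
    return "poor"

print(rate_fid(80))   # a snappy page
print(rate_fid(350))  # users will notice the lag
```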

Visual Stability

One of the most frustrating things that can happen on a website is when elements move about unexpectedly. It is especially annoying as the page loads. Google measures this with what it calls Cumulative Layout Shift (CLS). The more stable a page is, the lower the CLS score, and lower is better.
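Under the hood, Google scores each individual shift as its impact fraction (how much of the viewport moved) multiplied by its distance fraction (how far it moved), and CLS accumulates those scores. A simplified sketch that skips Google's session-window grouping:

```python
def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """Score for a single layout shift, per Google's definition:
    impact fraction * distance fraction."""
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts) -> float:
    """Simplified CLS: the sum of individual shift scores.
    (Real CLS groups shifts into session windows and reports the
    largest window; we skip that detail here.)"""
    return sum(layout_shift_score(i, d) for i, d in shifts)

# Two small shifts as the page loads: (impact, distance) fractions.
shifts = [(0.5, 0.14), (0.2, 0.05)]
print(round(cumulative_layout_shift(shifts), 3))
```

The example page above would score roughly 0.08, which sits just inside Google's published "good" threshold of 0.1.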

It's about the experience.

You can measure your Core Web Vitals with Google's PageSpeed Insights tool. If your overall score is less than 80, then you have some work to do.
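PageSpeed Insights is also available as an API (v5 at the time of writing), which is handy for checking pages in bulk. A sketch of building a request URL for the mobile strategy; nothing is fetched here, and production use may require an API key:

```python
from urllib.parse import urlencode

# The public PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights API request URL.
    strategy is 'mobile' or 'desktop'; mobile is the sensible
    default, since that is how Google looks at your site."""
    return f"{PSI_ENDPOINT}?{urlencode({'url': page_url, 'strategy': strategy})}"

print(psi_request_url("https://example.com"))
```

Fetching that URL returns a JSON report containing the lab measurements and, where Google has enough traffic data, real-user Core Web Vitals.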

Core Web Vitals measure user experience, or rather how users perceive your site's performance. Remember that over 50% of internet traffic is on mobile devices; that is how Google is looking at your site and how you should measure your Core Web Vitals.

The good news is that if you nail your performance measurements on mobile, you won't have an issue with desktop machines.

How search engines work

There are lots of search engines. Google, Bing, Baidu, Yandex and DuckDuckGo are the best known, depending on where you are in the world. Google has around a 90% market share, so we'll focus on it, but broadly they all work the same way.

When you search, Google returns the pages it believes best match your query's intent. The most relevant articles appear at the top of the page, and there is a lot of competition for those top slots.


Crawling

There isn't a central repository of all web pages, so Google needs to find them. For this, it uses Googlebot, sophisticated software that reaches out across the internet and tries to make sense of it. Sometimes called a web crawler or spider, Googlebot's job is to follow links and visit webpages. It reads each page, checks to see if there is new content, and adds that content to the index.

There are around 4 billion webpages on the internet, with more created every day. Even Google is constrained by resources; it can't visit all those pages to see what has changed.

The effort Google expends visiting, reading and understanding your site is called a crawl budget. Two factors determine the crawl budget: the Crawl Capacity Limit and Crawl Demand.

Crawl Demand is determined by how popular your site is and how often it's updated. These factors are not directly performance-related; the Crawl Capacity Limit is.

How speed affects crawling

Googlebot visits your website in a similar way to your users with a browser. It connects to the server, asks for the main HTML page, downloads all the resources (CSS, JavaScript, images) and then renders the page.

The longer a page takes to load and render, the more resources Google has to spend on it. Pages that load faster use fewer resources, and the fewer resources used for each page load, the higher your crawl capacity limit.

A higher crawl capacity limit means that, all else being equal, Googlebot will crawl more pages on your site.
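The arithmetic behind this is simple. With a fixed crawl budget, halving the time spent per page roughly doubles the number of pages crawled. The numbers below are purely illustrative; Google doesn't publish budgets in seconds:

```python
def pages_crawled(budget_seconds: float, seconds_per_page: float) -> int:
    """How many whole pages fit in a fixed crawl budget."""
    return int(budget_seconds // seconds_per_page)

# Ten minutes of Googlebot's time, purely illustrative.
budget = 600
print(pages_crawled(budget, 4.0))  # slow pages
print(pages_crawled(budget, 1.0))  # fast pages
```

With the same budget, the fast site gets four times as many pages crawled as the slow one.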


Indexing

If crawling is the process of finding and reading pages, indexing is the process of understanding them.

Googlebot processes each page to try to understand the content. It looks at the page's textual content, photos, videos and, among other things, various markup elements such as <title> tags and 'alt' attributes.

Once it understands the content, Google stores the information in its index, a massive database on hundreds of thousands of servers.

As Googlebot has already downloaded your pages, page speed doesn't affect the indexing process. The way you construct your pages does. Well-structured content in simple semantic HTML is far easier for Googlebot to understand.

Here's the thing: well-constructed web pages are usually fast-loading pages. By building them properly, you improve site speed and make it easier for Googlebot to index them.


Ranking

Ranking is the process of matching your search query with pages in the index. Google's algorithm first goes to work on your query to try to understand your intent.

For a typical query, there may be thousands or even millions of pages with relevant information. Google analyses hundreds of different factors to surface the best results. The most important ones include:

  • Content quality. Well-researched and thought-through content will outperform spammy, keyword-laden fluff.
  • Content freshness. When was it created or last updated?
  • Mobile friendliness. Will mobile users have a good experience on your site?
  • Your website's authority. Your site is more trustworthy and authoritative if many others link to you, especially if those sites are also trusted.
  • User experience. Google measures the experience that users have on websites. The better the experience, the higher you rank.

Google does not use site speed directly as a ranking factor. After all, a blank page would be super fast but not that useful. Instead, it uses Core Web Vitals, its measure of page experience of which performance is a big part.

It is rare for Google to talk about changes to its ranking algorithm ahead of time. Its announcement that, from May 2021, Core Web Vitals will be a ranking signal is therefore significant. If Google is taking web performance that seriously, so should you.

If you want to improve your rankings

Google can only rank pages that it has indexed, and it can only index pages that it has crawled. You can think of it as a funnel. The more pages Googlebot crawls, the more are available for indexing. The more it indexes, the more pages can rank. The more pages that can rank, the higher the probability of matching one of them against a user's query.

Improving your website performance means more of your site goes through that funnel.

If you want to rank higher, build a high-performance website.