Why fixing Core Web Vitals is important
Thoughts from Google's Martin Splitt and Searchmetrics study on Core Web Vitals use for businesses
Intelligent design is the future
May is already over and Sapiens is the only book I have read this year. Setting my laziness aside, I get anxious when something good is coming to an end, be it a nice movie or a book like Sapiens. I wondered how Yuval Noah Harari would end Sapiens, a book focused on the history of humankind.
In Chapter 20, “The End of Homo Sapiens”, Harari writes: “Homo Sapiens is transcending limits. It is now breaking laws of natural selection, replacing them with the laws of intelligent design.”
“For billions of years, intelligent design was not even an option because there was no intelligence which could design things. The first crack appeared about 10,000 years ago, during the Agricultural Revolution.
Sapiens discovered that if they mated the fattest hen with the slowest cock, some of the offspring would be both fat and slow. It was a race of chickens unknown to nature, produced by the intelligent design not of a god but of a human.”
Today the challenges are different. The Human Brain Project, founded in 2005, hopes to recreate a complete human brain inside a computer, with electronic circuits emulating the neural networks of the brain.
Intelligent design will help the animal overcome the god.
Now let’s come back to 2021. Good design makes the life of a consumer easy. To a marketer, good design means a good consumer experience. Yet we are the same bunch of humans who jump to follow trends simply because everyone else in our group is doing the same.
A few years ago almost all websites started adding chatbots because some idiots thought that the consumer is dying to chat with a bot. Leave the operations part aside and see it from the lens of experience. I have opened a website on my smartphone and, before I can browse the products or content, a chatbot pops up on the screen asking questions. The struggle is to locate the close button.
Once a website has all these fancy plugins or scripts running on its pages, the marketing team never bothers to check what the data says and whether any of it is useful. The attitude of “since we have paid for it, let it stay” is the problem.
Now, with Google’s forthcoming Core Web Vitals update, it becomes a real problem.
Google’s Martin Splitt Q&A about Core Web Vitals
Google’s Martin Splitt says that the objective of the upcoming update is to make the internet faster and better.
Martin recently joined Search Engine Journal Founder Loren Baker in this live Q&A about Google Core Web Vitals, the delay to June for the Page Experience Update, and other overall performance and speed needs for websites to better compete within Google and convert users.
Watch the conversation: Loren takes questions from users about Google’s Core Web Vitals and occasionally chats with Martin about diving.
According to Martin, the rollout will not happen till June 2021 and will then continue in batches over the next few months. However, there has been some buzz that certain publishers are already seeing an increase in traffic.
Martin says it is neither the update nor a coincidence. Page speed has been a ranking factor before, and it has nothing to do with Page Experience in this case.
“That makes a lot of sense actually. So kudos for getting your page sped up before the Page Experience update goes out.
You may be seeing an improvement in ranking because of those changes that you’ve done but not necessarily because of the Page Experience update.
Okay, that makes perfect sense!”
Searchmetrics Core Web Vitals study - April 2021
Meanwhile, Searchmetrics did a study crawling over 2 million URLs against the Core Web Vitals and further metrics to understand:
How websites are performing.
Whether Google’s benchmarks are realistic and useful for businesses.
In my earlier post, I shared my thoughts about a similar study done by Backlinko. “53.77% of sites had a good Largest Contentful Paint (LCP) score. 46.23% of sites had “poor” or “needs improvement” LCP ratings,” says the Backlinko study.
To do so, the Searchmetrics team analysed 208,085 web pages and established benchmarks for Cumulative Layout Shift, First Input Delay, and Largest Contentful Paint.
If you need to understand the basics of Google’s Core Web Vitals, how it works and how to improve the issues then please download this comprehensive guide by Search Engine Journal.
The Searchmetrics study finds that code bloat is slowing websites down: “On average, sites can save almost one second by removing unused JavaScript.” It also finds that optimizing images improves performance.
Understanding the Core Web Vitals ranking analysis
Largest Contentful Paint (LCP)
The Largest Contentful Paint (LCP) metric reports the render time of the largest image or text block visible within the viewport. To provide a good user experience, websites should strive to have Largest Contentful Paint occur within the first 2.5 seconds of the page starting to load.
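Google’s LCP thresholds are easy to express in code. Here is a minimal sketch; `rateLcp` is a hypothetical helper of my own, not part of any Google API:

```javascript
// Hypothetical helper: classify an LCP time (in seconds) against Google's
// published thresholds: good up to 2.5 s, needs improvement up to 4 s, else poor.
function rateLcp(seconds) {
  if (seconds <= 2.5) return "good";
  if (seconds <= 4.0) return "needs improvement";
  return "poor";
}

// In a browser you would feed this from a PerformanceObserver watching
// "largest-contentful-paint" entries; here it is just the classification logic.
```

So a page rendering its largest element at 0.5 seconds rates “good”, while one taking 5 seconds rates “poor”.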
The study says: “Looking at the top 20 ranking websites across our dataset, only the first 3 positions are below this threshold. One explanation for the peak from positions 4-9 is that some asset-heavy websites with relatively poor LCP performance are still ranking well.”
The study cites Wikipedia as a good example, as it favors a lightweight approach to web design, using mainly text and optimized images. As a result, the online encyclopedia records low LCP times across many of its pages, in the region of 0.4-0.6 seconds for a typical entry.
Total Blocking Time (TBT)
Effectively, TBT measures whether anything on the page prevents the user from interacting with it. Tasks include retrieving stylesheets, scripts, images, and videos. Typically, the larger the asset, the slower the response and the longer the task takes to complete.
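As a rough sketch of the arithmetic (my own illustration, not the official definition): TBT adds up, for each long main-thread task, the portion of its duration beyond the 50 ms threshold.

```javascript
// Illustrative sketch: sum the blocking portion (duration beyond 50 ms)
// of each main-thread task. Durations are in milliseconds.
function totalBlockingTime(taskDurations) {
  return taskDurations
    .filter((d) => d > 50)                  // only long tasks count as blocking
    .reduce((sum, d) => sum + (d - 50), 0); // the first 50 ms is "free"
}
```

For tasks of 30, 70 and 250 ms, only the last two count, contributing 20 ms and 200 ms of blocking time respectively.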
The study finds that many asset-heavy websites have long-running tasks that delay user interactivity.
Cumulative Layout Shift (CLS)
CLS measures the “sum total of all individual layout shift scores for every unexpected layout shift that occurs during the entire lifespan of the page.” In simple terms, it’s a measure of how much the elements on your page jump about or shift. This shift creates a negative user experience.
Google offers two benchmarks: < 0.1 = good; < 0.25 needs improvement – everything else is poor.
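Putting the definition and the benchmarks together, here is a minimal sketch; the field names such as `impactFraction` are my own shorthand for the two fractions Google multiplies per shift:

```javascript
// Illustrative sketch: each unexpected shift scores impact fraction times
// distance fraction; CLS is the sum. Shifts right after user input are excluded.
function cumulativeLayoutShift(shifts) {
  return shifts
    .filter((s) => !s.hadRecentInput)
    .reduce((sum, s) => sum + s.impactFraction * s.distanceFraction, 0);
}

// Classify against the benchmarks quoted above.
function rateCls(score) {
  if (score < 0.1) return "good";
  if (score < 0.25) return "needs improvement";
  return "poor";
}
```

A single ad insertion that affects half the viewport (impact fraction 0.5) and moves content by a tenth of it (distance fraction 0.1) would score 0.05, still within the “good” band.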
The study found that CLS had a positive correlation (0.2) with higher rankings.
The main causes of layout shifts include media pop-outs such as “subscribe now” boxes, or ads. Websites whose layouts do not reserve space for this dynamic content perform poorly in terms of CLS.
Understanding more Web Vitals
Time to Interactive (TTI)
TTI measures how long it takes a page to become fully interactive. Effectively, it measures the time from when the first element starts loading (FCP) to the time when there are no long tasks (50ms or longer) preventing the user from interacting with the page.
“To provide a good user experience, sites should strive to have a Time to Interactive of less than 5 seconds when tested on average mobile hardware.” – Google
A good TTI score implies quick loading times which creates a better user experience, so it is worth understanding how to improve it.
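The “no long tasks” idea can be sketched as a simplified search for a quiet window. This is my own approximation; the real TTI definition also takes network activity into account:

```javascript
// Simplified sketch: starting from FCP, advance past every long task until a
// 5-second gap with no long tasks appears; that point approximates TTI.
// fcp and the task {start, end} timestamps are in milliseconds; tasks are
// sorted by start time.
function timeToInteractive(fcp, longTasks) {
  let candidate = fcp;
  for (const task of longTasks) {
    if (task.start - candidate >= 5000) break; // quiet window found
    candidate = Math.max(candidate, task.end); // push TTI past this task
  }
  return candidate;
}
```

With no long tasks at all, TTI collapses to FCP itself, which is why lightweight pages feel interactive almost immediately.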
First Contentful Paint (FCP)
FCP measures how long it takes the browser to render the first piece of content after a user navigates to your page.
“Your FCP score is a comparison of your page’s FCP time and FCP times for real websites, based on data from the HTTP Archive. For example, sites performing in the ninety-ninth percentile render FCP in about 1.5 seconds. If your website’s FCP is 1.5 seconds, your FCP score is 99.” - Google
Note that the exact benchmarks for receiving a “good” score can change over time. Currently, to receive a “good” score from Google, websites would need to achieve an FCP of less than 2 seconds.
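The percentile idea in Google’s quote can be illustrated with a toy sketch. The reference list here is invented sample data, and Lighthouse actually fits a log-normal curve to HTTP Archive data rather than counting a list:

```javascript
// Toy sketch: score your FCP as the percentage of reference sites that are
// as slow or slower than yours. All times are in seconds.
function fcpScore(yourFcp, referenceFcps) {
  const asSlowOrSlower = referenceFcps.filter((t) => t >= yourFcp).length;
  return Math.round((asSlowOrSlower / referenceFcps.length) * 100);
}
```

An FCP of 1.5 seconds measured against a sample of [1.0, 2.0, 3.0, 4.0] beats or matches three of the four reference sites, scoring 75.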
Speed Index
Speed Index measures how quickly content is visually displayed during page load. Lighthouse first captures a video of the page loading in the browser and computes the visual progression between frames. Lighthouse then uses the Speedline Node.js module to generate the Speed Index score.
Note that the benchmarks are based on HTTP Archive data and can change over time.
The study says that a speed index of around 2 seconds would be a great target to aim for.
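The visual-progression idea behind Speed Index can be sketched as a weighted sum, a simplification of what Speedline computes from the captured video frames:

```javascript
// Simplified sketch: integrate visual incompleteness over time. frames is a
// sorted array of { time, completeness } samples (time in ms, completeness 0-1),
// starting at load begin and ending when the page is visually complete.
function speedIndex(frames) {
  let index = 0;
  for (let i = 1; i < frames.length; i++) {
    const interval = frames[i].time - frames[i - 1].time;
    index += interval * (1 - frames[i - 1].completeness); // unpainted share
  }
  return index; // lower is better
}
```

A page that is 80% painted after 1 second and complete after 2 accumulates 1000 ms of fully blank time plus 200 ms of partial incompleteness, for a Speed Index of roughly 1200, well under the 2-second target the study suggests.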
The study also looks at metrics that do not directly contribute to the overall score but do contribute indirectly. For example, unused JS and CSS usually mean larger file sizes and slower load times. Google lists these indirect factors as Opportunities in the report.
Will Google penalise those who deliver a bad experience?
Right now we don’t have the answer. But one thing is for sure: “A good user experience sends good user signals to Google which does boost rankings.”
However, I appreciate how Martin stresses at the beginning of the Q&A that quality content cannot be sacrificed for speed, just like a good marketing strategy cannot sell a shitty product.
And for those who are still wondering what to do with the Core Web Vitals update, Martin has some advice:
“First things first, don’t panic. Don’t completely freak out, because as I said it’s a tiebreaker. For some it will be quite substantial, for some it will not be very substantial, so you don’t know which bucket you’ll be in because that depends a lot on context and industry and niche. So I wouldn’t worry too much about it.
I think generally making your website faster for users should be an important goal, and it should not just be like completely ignored. Which is the situation in many companies today that they’re just like “yeah, whatever.””
And trust Google’s PageSpeed Insights, rather than Lighthouse, for accurate data about Core Web Vitals.