On-Demand Webinar: What are Google Web Vitals, how will they affect e-commerce stores in the future and how do I optimize my store for them?

SEO Page Performance - Insights and Best Practices

Here's what you'll learn in the webinar:

  • What are Web Vitals?

  • How can I measure them?

  • How to improve page speed effectively?

And if you don't know what TTFB, FMP, FCP, LCP, FID and CLS mean and what they have to do with your website, you shouldn't miss this webinar!

Watch online now (German only):

The speakers

Matthias Hotz is Head Of Technical SEO at diva-e Advertising and was previously team leader and developer at 1&1 Internet AG, responsible for DSL, mobile and hosting stores, among other things. He was a speaker at SEO conferences such as SMX and SEO Day. At diva-e Advertising, he develops SEO tools and supports customers in the analysis and optimization of technical SEO measures. In his spare time, he runs the free SEO tool seorch.de.
Matthias Hotz
Head Of Technical SEO, diva-e Advertising
Christian Paavo Spieker is an entrepreneur and SEO evangelist. In 1995, he founded Concept Advertising in Munich, which he developed into one of the leading web design agencies. One Advertising AG followed in 2006 and has been part of the diva-e family since 2017. Since then, Christian has been CEO of diva-e Advertising.
Christian Paavo Spieker
Founder & CEO, diva-e Advertising

Download the presentation (German only):

Transcript of the webinar: How to optimize my e-commerce store according to Google Web Vitals

Welcome & Introduction

Angela Meyer: Welcome to today's webinar. Today, our SEO expert Matts gives tips on how you can optimize your e-commerce store. And if Matts lets him get a word in, Paavo may have something to add as well. That's your cue. I look forward to our SEO experts, Matts and Paavo. I will turn the floor over to you and hope the viewers enjoy listening. Your turn.

Christian Paavo Spieker: Thank you, Angela. Sensational intro as always. Matts, next slide, please.

Matthias Hotz: I do. Like this.

Christian Paavo Spieker: Yes, briefly about myself. For those who don't know me: Paavo. I'm of Finnish descent, hence this beautiful first name. I'm the founder and CEO of diva-e Advertising. I have been doing SEO since 1997, so I'm quite an old fart. Forty-nine years old, two children: Annalena and Timo. I'm on the BVDW expert advisory board. And I have the hobbies a man tends to have, there are always quite a few, which is why we've listed them here: basically anything with wheels that goes downhill. And, of course, SEO has been my immense passion for a long, long time, ever since I discovered it back in the day. I don't know if it was '97, that would be early, but it must have been around '98. It was definitely before Google. And yeah, if it ranks, it's SEO. If it doesn't, it's kind of dumb. That's the fantastic thing about SEO, which I enjoy every day: the results are always visible or assessable relatively quickly. I've now moved from Facebook to LinkedIn, because the Russian marriage proposals have become a bit too much for me lately.

So, if anyone wants to follow me SEO-wise, please do so on LinkedIn. That's where you can find me. And now we have to step on the gas a little, because I was told that at 15:30, I think, NASDAQ trading opens, with stocks, and that's when it gets going and GoToMeeting always slows down a bit. And since the Google share is somehow over 1,500 US dollars now-. My God, I can still remember, I think I bought it for 85 dollars back in 2004. Ah, older men and shares. I now hand over to my esteemed colleague and probably the best technical SEO in the country: Matts, over to Karlsruhe.

Matthias Hotz: Hello, whether I'm the best or not remains to be seen. I'm Matthias, Matts. I also answer to "Hey you!" I'm actually a front-end developer, so I can do HTML and CSS, but I can also build backends with PHP, Node.js, Python, etc. I like to develop SEO tools, I like to program, and I know technical SEO pretty well. I was at Pentair before. If you know it from the stock market: it's a Fortune 500 company, one of the giant US corporations, but more of a B2B business. Then I was at 1&1 for a few years. I built the webshops there, I was a team leader, and the stores where you ordered DSL, mobile or hosting contracts more or less went over my desk. Hobbies: yes, I like building software, but I also like to get out on my bike, preferably every day. And I've been with diva-e Advertising for six years now. Then Paavo can tell you something else.

Christian Paavo Spieker: Can I also tell you something? Yes, typically you're not allowed to advertise in a podcast, that always comes off pretty badly. But I'll do it now. It's not a podcast, it's a webinar, but I'm going to say it anyway. We started with SEO in '99. In 2006, we founded One Advertising, which many of you know. In 2015 and 2017, we won the SEMY award for Best SEO Agency. Then we realized that we need a lot of technology in our agency, and so, without further ado, we joined forces with diva-e and became diva-e Advertising. We now have many resources in the Adobe, Bloomreach and SAP areas. Spryker is, of course, one of our babies. And we are BVDW certified. As for our customers: we are pretty proud that twenty per cent of the German DAX companies are among them. So, for the good mental arithmeticians: that's four to five, depending on how you count. Current status, said with a slight smile. And Matts, the next slide, please.

Matthias Hotz: Wirecard is no longer a customer? (laughs) That was a stupid joke. No, Wirecard was never a customer.

The history of Google algorithm updates

Christian Paavo Spieker: Oh, that's part of it, there's always a bit of attrition. Yes, Google updates. Sure, for the SEOs among you who have been around a bit longer: the first big update that I remember was in 2003, the Florida update. That affected all of us SEOs back then; we were all in the arbitrage business. A Google update was always something terrible, because somehow there was always a new algo update or something. But Florida in 2003 was hard. Some sites had 60, 70, 80,000 users a day, and after that they somehow had ten.

Then there was the Austin update shortly after that, which finished the job and caught the pages Florida had missed. So, the SEOs have always struggled a bit with these updates, or against these updates, always chasing them. Many also say that the updates are only there to keep the SEOs busy. That changed a bit after Panda and Penguin. Then there was the story with the links in 2009, when Google, quite cleverly, swept all the link schemes off the table by virtually banning bought links and handing out penalties.

But the first, let's say, user update, after Caffeine and Hummingbird and everything like that, was of course the Mobile-Friendly update. I think it was announced in February: "If you don't have a mobile site, then it's going to be bad for your Google rankings." Google had started pitching to us agencies back in 2013: mobile, mobile, mobile. At that time, 96 per cent of our revenue was on desktop and mobile was considerably less, because the conversion simply didn't happen there. And Google thought: "Now we're going to make a completely different move. We are announcing that a Mobile-Friendly update is coming on 21.4.2015, and if you're not mobile and responsive, then bam." Then everyone came running to us and the phones were ringing hot. We did the updates, and of course nothing much happened, but at least all pages were mobile to some extent. There were then several rollouts of this.

There was the Mobile Speed update in 2018, and you could already see that everything was going in the user's direction. And we also want our users to be happy; all SEOs should be guided by that, because conceptually they've always done quite well with it. Google has now, after the BERT update and everything else lately, announced the Page Experience update for 2021. Not with a hard date this time, but again with an unambiguous statement: "If the experience doesn't fit, if the UX doesn't fit, if the user has a bad experience on your website, your store or wherever, then it's going to be very negative." We've known that for a while. All of our tests show it for the pages and the stores, and that's why we're talking about stores today, because that's where success can be measured best. Because if sales drop off or the conversion rate drops off, you notice it immediately, and it has a significant impact. And user experience, time on site, back-to-SERP rate: there are various mechanisms through which Google can measure that. Matts will get to that in a moment. It has a significant impact.

Of course, we have noticed this for a long time, and that's why we are pretty happy that Google is now taking such a step again, and that the Page Experience update in 2021 will make a real bang once again. The first step is the Web Vitals. No, more precisely, the first step is the Core Web Vitals. And Matts has prepared lots and lots of great slides on this. So Matts, let's go. Let's get vital.

Introduction to Google Web Vitals

Matthias Hotz: Then let's get vital. So, the Web Vitals were announced this year. I think most SEOs didn't even notice it that much; it was just another entry on the Google blogs, et cetera. If you read a little bit more into it, you could be a little bit, yes, alarmed. What is it about? Well, Web Vitals is just the headline over the whole thing. The idea behind it is more or less that you can make the user experience measurable. Google wants to be able to evaluate the user experience on a website. For years, I've been saying that Google is moving more and more towards understanding how a website works for the user. Are the buttons nice and big? Are they easy to click? Are the graphics pleasant to look at? All that kind of stuff. And that's what's happening now.

So, Web Vitals. Many will think that what I'm going to tell you has to do with page speed or performance. But not necessarily. Page speed and performance can be very complicated; they can have many dependencies, there are umpteen metrics and umpteen levers that I can adjust. What I'm about to tell you is only a few metrics for now, but Google's focus is clear as day: in the first step, things that lead to a real improvement in user experience. What's still to come, we'll see. According to Google, it's announced for 2021. That is, it will most likely enter the Google algorithm as a ranking factor; how strong a factor remains to be seen. And to start, Google has published the Core Web Vitals. Under the URL you see there, there is an entry where all of this is described.

Google Core Web Vitals

And I would like to introduce these Core Web Vitals to you in more detail. User experience has been a ranking factor several times in the past, or rather, it is a ranking factor. When Google noticed that over 50 per cent of their users use smartphones to navigate Google, they introduced mobile-first. Security on the web: after many incidents, et cetera, it was flagged if a website does not support encryption. The interstitials, which eventually came in droves, actually just annoyed the user, so a penalty went into the algo: if intrusive interstitials, as Google called them, appeared, you got a slap on the wrist. If malware or phishing is detected on the website, the domain flies out of the index, and I'm notified in Google Search Console that I have to fix it.

And so, as Paavo has already said, various user experience measures were aimed at assessing not just the content of a website, or the backlinks it has, or whatever, but the user experience itself. The Web Vitals now extend that. They are, like the factors above, measurable user experience. And with that, let's get started.

These are the first step. Web Vitals is the umbrella term; Google then said, "We're going to start with the Core Web Vitals," and that's a subset. So, I'm assuming that the Web Vitals will be expanded in the future. Those are three metrics. They're called LCP, FID and CLS. I'll explain everything in detail later, but to summarize: LCP means Largest Contentful Paint. That is, roughly speaking, the loading speed. FID is the First Input Delay, and that describes the interactivity of a website. CLS is the Cumulative Layout Shift; that's the visual stability of a website. All of this is already available in Search Console, and you can already optimize for it, or at least monitor it to see if you have a problem there. That is the grace period you have before it becomes a ranking factor next year.

The metrics of the Web Vitals in detail

Largest Contentful Paint (LCP)

Now I'll explain in detail what these metrics are all about. So, the LCP, the Largest Contentful Paint, is the largest element on the HTML page within the initial viewport. The initial viewport is what you see first when you enter a website. The page may be very long, you can scroll forever, but that's not important here; it's about the largest element you see first. You can also see it in the graphic below: I've highlighted FCP and LCP here. FCP is the First Contentful Paint, and LCP, maybe some of you have heard of it from page performance topics, is the Largest Contentful Paint. In this case, it's the graphic that you're seeing. But it can be several things; the Largest Contentful Paint doesn't have to be an image. For example, if the page has a video, a stage or a slider, then it's that element.

Elements below the initial viewport don't count. There's a catch, because the initial viewport can differ depending on the system: on a monster 30-inch monitor it's bigger than on a smartphone or a 14-inch screen. But first, it's important to understand what it is. Google also gives very precise values for what is good, medium or bad. Good is under 2.5 seconds, medium is 2.5 to 4 seconds, and bad is over four seconds. So it can be assumed that if this becomes a ranking factor, all those whose Largest Contentful Paint is slower than four seconds will somehow be disadvantaged.
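For illustration, here is a minimal sketch, not from the webinar slides, of how you can watch LCP candidates yourself in the browser console using the standard PerformanceObserver API. The last entry reported before the first user interaction is the final LCP, and the element property tells you which DOM node it was:

    // Observe LCP candidates; the last one before user input is the final LCP.
    const lcpObserver = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // startTime is the render time in milliseconds since navigation start;
        // element is the DOM node that was painted (image, video, text block, ...).
        console.log('LCP candidate:', Math.round(entry.startTime), 'ms', entry.element);
      }
    });
    lcpObserver.observe({ type: 'largest-contentful-paint', buffered: true });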

First Input Delay (FID)

The next point is the FID, the First Input Delay. This is the time between a click and the website's response. It doesn't necessarily have to be a click; it's all about interactivity. So, for example, when I start scrolling, there are no delays. When I move the mouse over anything, the hover effect is immediate. When I click somewhere or use a text field, there is no delay. You may have experienced it on websites that when you click there's a small lag, and that feels very weird, very uncomfortable, very unnatural.

When a page has visually finished loading, it can still be slow to respond to user input if a large chunk of JavaScript is still being processed. The browser first takes the HTML, then hopefully the CSS, and then it can already render something; maybe it already has one or two images. But JavaScript-. Well, I sometimes see pages in the wild that transfer four to five megabytes of JavaScript, and that has to be downloaded, parsed and interpreted by the browser. Depending on what's in there, that can block the main thread for quite a while. Good is under 100 milliseconds, bad is over 300 milliseconds.
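As a small sketch (again not from the slides), the FID can also be picked up in the browser itself: the Event Timing API reports the first interaction, and the delay is the gap between when the user acted and when the main thread could start handling it:

    // Minimal FID measurement: delay between the first interaction and the
    // moment the browser could begin processing its event handlers.
    new PerformanceObserver((list) => {
      const first = list.getEntries()[0];
      const fid = first.processingStart - first.startTime; // delay in milliseconds
      console.log('FID:', Math.round(fid), 'ms (good: under 100 ms)');
    }).observe({ type: 'first-input', buffered: true });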

Cumulative Layout Shift (CLS)

Then we have the Cumulative Layout Shift. This is about the stability of a website while it is loading. Before I explain everything, I made a video for this. I'm going to play it multiple times so you can see it. So, the video is playing now. You can see the Google logo. And then I reload the website, that's what I'm doing now, and you'll see that the logo isn't there yet and there's a link down there. And then the logo loads. I'll run it again. So, the Google logo. And while the Google logo is loading, all the other content moves down. So my mouse is on this Vitals link, I'm already about to click, and then the whole layout jumps down a bit.

Anyone who uses Twitter, for example, has seen this very often on Twitter lately: I'm reading a tweet, then a video loads above it, and the tweet is gone; it bounced out of my viewport. That is roughly what the Cumulative Layout Shift measures, and unfortunately it happens on many websites. It can be different elements as well: it doesn't have to be an image, it can be a video element, but it can also be an advertisement or something. This cannot be expressed in milliseconds, but Google has defined a value. Good is below 0.1, and bad is above 0.25.
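For completeness, a simplified sketch of how the CLS score comes about (my illustration, not from the slides): the browser emits a layout-shift entry for every unexpected shift, and shifts caused by the user's own input are excluded from the sum:

    // Simplified CLS measurement: sum the layout-shift scores that were not
    // triggered by recent user input (clicks, key presses, etc.).
    let cls = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (!entry.hadRecentInput) cls += entry.value;
      }
      console.log('CLS so far:', cls.toFixed(3), '(good: 0.1 or less)');
    }).observe({ type: 'layout-shift', buffered: true });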

What else matters besides the Core Web Vitals

These are the three Core Web Vitals that have now been released initially. How do they differ from page speed? Because some of this could be covered by it. The Core Web Vitals complement page speed and vice versa, so you still have to take care that you have a fast website. But page speed can be manipulated. Paavo once wanted that: a few years ago, he decided that the advertising.de site of the time had to have a Page Speed score of 100. That wasn't really possible with the content, but if you knew the parameters, you could hammer out a score of 100 even if the site was not that fast. So Page Speed can be manipulated to some degree if you know where to turn. LCP, FID and CLS, as you just saw, are almost impossible to manipulate. There, I really have to do something to improve them.

Nevertheless, you should of course not rely purely on these Core Web Vitals. If the time-to-first-byte or the start render of my page is very slow, the Largest Contentful Paint can't be very fast either; it will be just as slow. The distinction is more or less that the Web Vitals are explicitly concerned with user experience, whereas improving page speed doesn't necessarily lead to a better user experience. Because if the Largest Contentful Paint at the top is reloaded asynchronously or via lazy loading, it's just still not there, even though the page loads very quickly from a purely technical point of view and maybe even gets a very high Page Speed score.

Page Speed and Web Vitals

First of all: page speed and Web Vitals. You can see that Google is really stepping on the gas here. The Web Vitals are now built into Chrome Lighthouse. They have been built into WebPageTest. They're showing up in Search Console. PageSpeed Insights reports them. There is a Chrome extension for Web Vitals developed by Google. And custom JavaScript was published on web.dev, with example code from Google, so that you can integrate the measurement directly on your own URLs. But I'll show all of that in a moment. And they hammered all of that out in three months. They built it into Search Console in three months, into all the page speed tools under their wing, and into Google Chrome. So it must be fairly important, was my thought.

Impact of Core Web Vitals on E-commerce

So, now we're going to get to the impact on e-commerce first, before I explain at the end how you can influence the Web Vitals. Google is changing its algorithms continuously now. In 2020 alone, there were five confirmed core updates, probably more. SEOs have trackers that show when many rankings shift or change, and from that you assume that Google is rolling out an update. That happens every week by now, so I mostly ignore it. Because it's not a "we update once or twice a year" like it was five years ago; it's rolling, they're doing what feels like daily updates.

The next point: Google rarely communicates which parameters were changed in the rating. They did communicate it back then with Mobilegeddon, as it was called, and they announced mobile-first; those were huge updates. And they've communicated it now with the Web Vitals as well, and they're even giving a timeline. When Google makes updates, they don't do it because they want to annoy SEOs or website owners; they do it to improve the user experience. User experience counts for a lot there. Google simply wants to deliver the best result for a search query. And now, of course, you can ask, "What is the best result?" Sure, the best result primarily answers the user's problem, and it does so satisfactorily. Everyone knows that. Content, tra-la-la, and so on.

If you read the Quality Rater Guidelines, other facets are expertise, relevance, a fast mobile website, whether it leads to interaction, maybe social spread, et cetera. These are all things that play a role. In the future, the best result will also be the one that delivers the better user experience. That means: even if I run a store and have done all the known SEO homework, my technical foundation is proper, my content is right, my internal linking is spiffy, and I have some great backlinks too, it is entirely possible that I lose rankings if the Core Web Vitals indicate a poor user experience.

And I have to make it clear: I can't just say, "My home page has to be just right." It's really about every URL. Accordingly, here are a few examples that I guarantee you all already know. This time, it's not the infamous Amazon study from 2002, but figures from Adobe and Medium. You know them, and we keep repeating them; every e-commerce guy out there keeps repeating them. That wasn't meant to be mean.

So, 36 per cent of users abandon a visit to a website if it is unattractive or too complex. Forty per cent switch to a competitor if the site is too slow. If the images don't load, 39 per cent leave the site. Far too much content: 38 per cent leave. Poor user interface: 35 per cent leave the site. There's an extensive Adobe study, a big PDF, that I worked through for this. We know all that; it's nothing new. Why do we know it? Because we're like that ourselves.

Christian Paavo Spieker: Matts, what was the Amazon study back then? I think it was 100 milliseconds, or 0.1 seconds, equals one per cent less conversion, or sales, in the online store (Matthias Hotz: Exactly.) or something like that. That was the thing that made everyone sit up and take notice, where everyone thought, "Oh God, Amazon! What bullshit." But we have also seen that in our own facts and figures more often. We have many e-commerce customers, and it's really like that: if we ever had an error on the store or something just slowed down, it cost massive sales. That's why it has to be taken very seriously, of course.

Matthias Hotz: Yes, many people know that, but then many ignore it somehow, especially when they relaunch or redesign. Exactly, but still. So I just took a look at, well, more or less the first row of German e-commerce, with the Chrome plug-in from Google. Cyberport, for example: they fail completely there. OTTO fails at least on the Largest Contentful Paint. The Müller website: terrible, ten seconds until the Largest Contentful Paint, and that was a category page. Notebooksbilliger, amazingly enough, totally hits the mark; they've got it down pat. Real: the Largest Contentful Paint on the home page was 11.6 seconds. Zalando also got it right, so it fits there; that was a category page. As I said, I only looked at individual pages, but that is still meaningful. With Obi, it gets tight. And at Lidl, there's also a fail. So the first row of e-commerce, at least as far as German providers are concerned, still has much room for improvement. And accordingly, I would now like to come to the topic of how to optimize for this.

How to optimize my e-commerce store according to Google Web Vitals

Analysis of the status quo

First, I have to take a look at the status quo. As I said before, it's not enough to look at two or three URLs. It's also not enough to look at every page type once, because it can also depend on the images and texts: I don't use the same pictures in every category, and they can have different dimensions, different sizes. Accordingly, you have to create comprehensive test coverage, and I can only do that if I include as many URLs as possible in the monitoring.

The best way is RUM, or Real User Measurement; I'll explain that in a moment. The second-best way is an API connection to PageSpeed Insights, to Lighthouse or to WebPageTest, for example. You can also take the Chrome user experience data, which is called CrUX in the following. Chrome transmits metrics to Google if I agree to it during installation, and that includes the user experience. What is not so good is periodic testing via web tools.

Because that is just a snapshot; the user experience can change daily. RUM, Real User Measurement, is the coolest thing, because I measure against my real users. There's JavaScript for that which I can integrate into the website, and then those metrics are recorded. So I can pick up the Largest Contentful Paint, the Cumulative Layout Shift and the First Input Delay with JavaScript and then process them: I can report them to Analytics, I can build my own reporting for them, I can simply write them into a database, etc. The interesting thing is that it's measured with real users. And because I can access further data, such as the user agent or the network, I also know things like: "I have no problem at all on desktop, but on mobile I have to do something." The next catch: these metrics can, of course, differ depending on the device. A mid-range smartphone doesn't have as strong a processor as the desktop PC sitting at home, and accordingly it takes longer to parse the JavaScript. Just keep that in mind.

The important thing is that every page visited is then tracked via JavaScript and reported, and I can then read out where I have problematic pages or complex layouts. These are not laboratory values. The sample code is available on Google's pages; you integrate it more or less like a tracking pixel. And here's the thing: Google scores based on real user measurement, because they have the Chrome data. Google knows afterwards whether 80 or 90 per cent of the users have a good user experience or not. They don't just come by with the bot, which sits in a fat data centre with enormous bandwidth, and check it; they use the Chrome data, and accordingly they know very well how the website performs. You can check it out at crux.run. Someone built a front-end there where you can access the user experience data. This Chrome user experience data is openly available in a Google database on the web; if you are not a database god, it is difficult to write a query, but via crux.run you can just play around and see if your site is included.
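As a rough sketch of what such a RUM integration can look like: Google's open-source web-vitals package wraps the browser APIs shown above. The example below assumes a version with the getCLS/getFID/getLCP API (newer releases use onCLS, onINP and onLCP instead) and a hypothetical /vitals collection endpoint on your own server:

    import { getCLS, getFID, getLCP } from 'web-vitals';

    // Send each metric to your own collection endpoint (hypothetical: /vitals).
    function report(metric) {
      const body = JSON.stringify({
        name: metric.name,        // 'LCP', 'FID' or 'CLS'
        value: metric.value,      // ms for LCP/FID, unitless score for CLS
        page: location.pathname,
        ua: navigator.userAgent,  // lets you split desktop vs. mobile later
      });
      // sendBeacon survives page unloads better than a normal request.
      if (!(navigator.sendBeacon && navigator.sendBeacon('/vitals', body))) {
        fetch('/vitals', { method: 'POST', body, keepalive: true });
      }
    }

    getCLS(report);
    getFID(report);
    getLCP(report);

From the collected rows you can then build exactly the kind of per-URL, per-device reporting described above, whether that lands in Analytics, a database or your own dashboard.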

Christian Paavo Spieker: Matts, how does that compare with the Analytics 360 or Google Analytics figures? There you also have a real user measurement in it. Is it then-? Is that the same thing?

Matthias Hotz: No, it's not the same. The Google Analytics data is determined via the Google pixel, i.e. via the Analytics pixel, so it doesn't have to match the CrUX data, because the Analytics pixel may be loaded later in the source code, or not be loaded at all. Chrome, on the other hand, is responsible for rendering the website from front to back and accordingly has the most accurate data. That's why these numbers are very valuable, and I also assume that's why Google provides them for free. The second link goes to web.dev; there you can see the JavaScript that you can put in there. Of course, this is complex: you are putting JavaScript into an existing website, setting up your own database, checking it, and so on. But of course, there are APIs for that too. The disadvantage of an API is that it's a synthetic measurement that doesn't always quite correspond to real-world conditions; I get laboratory values. But it is straightforward to do.

The PageSpeed API from Google is free of charge; you just have to make sure that it is PageSpeed v5. The WebPageTest API is free to a certain extent. And there you can just get the data: a JSON comes back, and then I do something with it. You can also use the CrUX data; it can also be retrieved via API, and you can enrich your data with it, so you have a comprehensive spectrum. And above all, you can also test pages that are not called up so frequently and are not reported in Search Console because they may not get any clicks or impressions. We have recently tapped into the entire Lighthouse API for this purpose: we have set up our own Chrome instances where Lighthouse is instantiated, and we then pick up the data.
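A minimal sketch of such an API call against PageSpeed Insights v5 (run in a browser console or Node 18+, where fetch is available; the URL and the optional API key environment variable are placeholders of my own choosing, not diva-e's setup):

    // Query the PageSpeed Insights v5 API for one URL; an API key is only needed
    // for higher request volumes (assumption: key supplied via an env variable).
    const key = typeof process !== 'undefined' ? process.env.PSI_API_KEY : '';
    const target = 'https://www.example-shop.de/category/notebooks';
    const api = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed'
      + '?url=' + encodeURIComponent(target)
      + '&strategy=mobile'
      + (key ? '&key=' + key : '');

    fetch(api)
      .then((res) => res.json())
      .then((data) => {
        // Lab values from the embedded Lighthouse run ...
        const lab = data.lighthouseResult.audits;
        console.log('LCP (lab):', lab['largest-contentful-paint'].displayValue);
        console.log('CLS (lab):', lab['cumulative-layout-shift'].displayValue);
        // ... and field values from the Chrome UX Report, if the URL has enough traffic.
        const field = data.loadingExperience && data.loadingExperience.metrics;
        if (field && field.LARGEST_CONTENTFUL_PAINT_MS) {
          console.log('LCP (field):', field.LARGEST_CONTENTFUL_PAINT_MS.percentile, 'ms');
        }
      });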

If you're a developer, there's a lot you can do in Chrome. I can throttle the network, I can simulate that I'm using an iPhone, and all that kind of stuff. Then I'm getting lab values, but I can effectively get a lot of data out quickly and simply run tests as well. This data is usually sufficient to start optimizing. So, how do I optimize now?
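Sketched out, the "own Chrome instances with Lighthouse" approach mentioned above can look roughly like this in Node (the shop URL is a placeholder; depending on your Lighthouse version you may need ESM imports instead of require):

    // Run Lighthouse headlessly against one URL and pull out the lab metrics.
    // Requires: npm install lighthouse chrome-launcher
    const lighthouse = require('lighthouse');
    const chromeLauncher = require('chrome-launcher');

    (async () => {
      const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
      const result = await lighthouse('https://www.example-shop.de/', {
        port: chrome.port,
        onlyCategories: ['performance'], // mobile emulation and throttling are the defaults
      });
      const audits = result.lhr.audits;
      console.log('LCP:', audits['largest-contentful-paint'].displayValue);
      console.log('CLS:', audits['cumulative-layout-shift'].displayValue);
      console.log('TBT (the lab proxy for FID):', audits['total-blocking-time'].displayValue);
      await chrome.kill();
    })();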

Make problems clear

So, once I've collected the data, I need to understand where I'm having problems. The page performance basics, as I mentioned above, have to fit, for example the time-to-first-byte. If a website has 280 requests and transfers five megabytes of data, I don't need to start with the Web Vitals, because then the user experience isn't that great anyway. The important thing is to focus on the initial viewport. I've marked it a bit here; the graphic is a bit small, but I hope you can see it. The initial viewport is at the top, where this stage with the mountain biker is supposed to be, and at the bottom everything is still being reloaded. So nothing is finished at the bottom, while the initial viewport at the top is already finished. That's good: it's just a split second, and the user can already find his way around at the top while I slowly start reloading at the bottom. That's why in the future you have to approach page performance optimization and Web Vitals or usability optimization a bit differently than you might be used to. And above all, you have to look at the three Web Vitals - Largest Contentful Paint, Cumulative Layout Shift and First Input Delay - individually; they can't be optimized together.

Optimize Largest Contentful Paint

Let's start with the Largest Contentful Paint. As I said earlier, these are mainly block elements. In HTML, there are two types of elements. Inline elements, such as a span or an a href, flow: I can put three span elements or three a elements in a row and they stay on one line. With an HTML block element, like a div or a p, a new line is created. That's what this is about. It can be background images, images or videos; anything that takes up the most space in the initial viewport. As I said, it doesn't have to be an image or a video, it can also be text.

Then figure out which element that is. Of course, this can vary from page to page; it can also be the logo, or the video that comes in there. And then find out what slows down the loading of that element. I've made a waterfall diagram here on the right. Up here, I hope you can see it: I'm loading a page, then five fonts are loaded first, then about 200,000 CSS files, and then JavaScript on top of that. The question now is: how much of that do I need for the initial viewport? Probably only a fraction. The font is nice, but it can load later, because I might fall back on an installed font. I see a polyfill, tracking, all these CSS and JavaScript files. I see a video CSS up there, a colorbox CSS. None of that is needed for this initial viewport. Accordingly, from a usability point of view and a Web Vitals point of view, I just have to ask myself, "What is that doing there?"

What's important up there is, of course, the HTML file, but that is loaded first anyway. Then maybe the logo of the store, the stage, the image, the slider, whatever. That has to be loaded first, because that's what makes the Largest Contentful Paint fast afterwards. Third-party stuff, scripts, et cetera, come at the very end. I wonder why people think it's a good idea to load their Analytics pixel or their Google Tag Manager in third or fourth position. Because whether I lose a tracking impression or the user has a bad experience: my decision there would be pretty straightforward.

If you figure that out, or if you suddenly see 25 CSS files here that have nothing to do with the page at all, then you have to change the loading order. This is something straightforward: you go into the HTML and say, I will load this CSS first, then this JavaScript, and I'll load the rest at a different time, for example at the end. I also stop including stuff from CDNs. What we often see, for instance, is that Google Fonts is used. That's an extra request, and it slows the whole thing down: I have to fetch the fonts, and then the font has to switch. These are all things I don't need there. And basically, all you have to do is change the loading order. I don't have to make my page that much leaner, I don't need to slim down or anything; I just need to load the things that are up there first.
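Put into markup, a reordered document head could look roughly like this. Every file name here is a hypothetical placeholder; the point is only the order: critical CSS and the LCP element first, everything else deferred, tracking last:

    <head>
      <!-- 1. Critical CSS for the initial viewport: inlined or loaded first -->
      <style>/* header, logo and stage/slider styles only */</style>

      <!-- 2. Tell the browser early about the likely LCP element (hero image) -->
      <link rel="preload" as="image" href="/img/stage-mountainbiker.jpg">

      <!-- 3. Self-hosted font instead of an extra CDN request, preloaded but non-blocking -->
      <link rel="preload" as="font" type="font/woff2" href="/fonts/brand.woff2" crossorigin>

      <!-- 4. CSS not needed above the fold: loaded without blocking rendering -->
      <link rel="stylesheet" href="/css/colorbox.css" media="print" onload="this.media='all'">

      <!-- 5. Application JavaScript: deferred, executed after parsing -->
      <script src="/js/app.bundle.js" defer></script>

      <!-- 6. Tracking and tag management at the very end, also deferred -->
      <script src="/js/tracking.js" defer></script>
    </head>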

Optimization of the First Input Delay

With the First Input Delay, it is a bit different. JavaScript is always responsible for it. If there is no JavaScript on the page, there will be no First Input Delay, or only very rarely, while the page is still loading through. And once it's there, it's there and I can use it. But if a lot of JavaScript is loading, it can block user input. Now the question is: how much JavaScript is a lot? One megabyte of JavaScript is 500 book pages if I print it out. Nobody needs that much JavaScript, unless I'm building a rich application like Facebook, where a lot of stuff needs to happen. But yes, developers like to throw a lot of JavaScript at web pages to solve trivial problems that you could code yourself. Accordingly, you first have to see how much JavaScript is loaded on the website. Just as a ballpark figure: under one megabyte, and preferably under 200 kilobytes compressed, would be good. Alternatively, if I can't do without JavaScript, I have to ask myself: do I need JavaScript in the initial viewport at the top? If not, then I load it at the very end, after the images, the CSS, the graphics, after everything has loaded through. Then I can load the JavaScript.

Should I need JavaScript up there, because maybe there's a slider or something, then I can extract just that small part. You can program such a slider with five to ten kilobytes, and I can load that up top; five to ten kilobytes doesn't take the browser very long. But if it has to interpret 500 book pages every time, time goes by. Then we also often see that JavaScript that is not needed on the page at all is loaded: JavaScript that is responsible for the checkout, or for a Google Maps integration, on pages where there is no map, simply because it's much easier for the developer to load everything all the time. Then it's always available, you don't have to worry about it, and if I want to use Maps there, it's there. But it leads to an increased payload and slower web pages, and that is not always kept in mind.

Ideally, as I have already said, the JavaScript should be divided into a maximum of two to three files. In the past, people used to say the fewer JavaScript files the better, as a page speed consideration, and that's not wrong. But if I urgently need JavaScript for that initial viewport up there, I have to split it up: I take the little snippet for the very top and load that early, and the rest I load at the end. If that's difficult, I can also work with deferring. You can now give the script tag instructions as attributes; there's one, for example, called defer. Say the JavaScript sits very high up in the source, but it has defer on it. The browser sees the script tag and the JavaScript, but it also sees: defer. And defer more or less says, "Hmm, it's not that important. Load this when you have time, interpret it when the thread is free, when the CPU is less loaded." And by doing that, I'm also preventing a bad First Input Delay.
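In markup, the split described above could look like this (file names are placeholders; defer means the script is downloaded in parallel but only executed after the document has been parsed, async means it runs as soon as it arrives):

    <!-- The tiny slider script needed for the initial viewport: loaded normally, kept small -->
    <script src="/js/slider.min.js"></script>

    <!-- The big application bundle: deferred, so it does not block parsing or user input -->
    <script src="/js/app.bundle.js" defer></script>

    <!-- Fully independent scripts such as tracking can also be loaded async -->
    <script src="/js/tracking.js" async></script>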

Cumulative Layout Shift Optimization

Then we come to the last one, the Cumulative Layout Shift. I demonstrated it to you in the video: this is the twitching of the website. Anything can be responsible for that: images, iframes, embedded YouTube videos, ads, but also a font swap. You have to find out what is responsible for the twitching and then tackle it.

Why does the twitching occur? That might be interesting, too. Well, the browser is presented with an HTML page just like you read a book page. You start reading at the top of the first line. And then you read to the bottom. And then you come to a place where an image is referenced, in this case. In the book, there would be a picture there now. But the browser only now sees: "Hm, here is a picture. I have to load that." I can initially give this image a dimension. Via CSS, for example, I can say: "This image, which is loaded there, has 400 x 600 pixels". Then the browser reserves the space for it and loads the image there.

However, if I don't give it an initial dimension, which is a mistake that's easy to make, then the browser doesn't know how big it is and doesn't reserve the space. When the image is loaded, it needs space, takes it, and then things start to twitch. One approach to fix that is to give images, iframes and all embedded elements a size specification from the start. You know how big the YouTube video is, you know how big the AdSense banner is, and you can tell the element that encloses it, via CSS or a style attribute: width equals 768 pixels, height equals 90 pixels. And that solves the problem.
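In markup, reserving that space looks like this (dimensions and the video ID are placeholders for illustration):

    <!-- width/height let the browser compute the aspect ratio and keep the layout
         stable before the image has actually loaded -->
    <img src="/img/product.jpg" width="400" height="600" alt="Product photo">

    <!-- Same idea for ad slots: give the enclosing element a fixed size up front -->
    <div style="width: 768px; height: 90px;">
      <!-- the ad is injected in here later and no longer pushes content around -->
    </div>

    <!-- Embeds get explicit dimensions too -->
    <iframe src="https://www.youtube.com/embed/VIDEO_ID" width="560" height="315" loading="lazy"></iframe>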

Unless it's about fonts. With fonts, unfortunately, we often see that-. Of course, you want to use your custom font, but then it is loaded five times: once in very, very bold, then in italic, then in regular and then in a light variant. And unfortunately, such fonts are not small. Accordingly, you first have to check how many fonts you have, how big the font files are, whether they are adequately compressed, etc. And if you can't get around your five fonts, you can make the font switch optional. That means: if someone has a fast connection and the fonts load quickly, they see the fonts immediately. Someone else falls back to the system font that you have defined in the CSS and would then see the custom font from the next page on. But at least I don't have this Cumulative Layout Shift; I have a stable user experience.
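In CSS, that optional font switch can be expressed like this (font name, file path and fallback stack are placeholders):

    /* Load the brand font, but don't let it shift the layout: font-display: optional
       shows the fallback if the font isn't ready in time and simply uses the web
       font on the next page view, when it is already in the cache. */
    @font-face {
      font-family: "BrandFont";
      src: url("/fonts/brandfont.woff2") format("woff2");
      font-display: optional; /* or "swap" if you accept a visible font switch */
    }

    body {
      /* system font stack as the fallback defined in the CSS */
      font-family: "BrandFont", -apple-system, "Segoe UI", Roboto, Arial, sans-serif;
    }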

So, I think that's it. Ha! Yay! Feel free to ask questions, feel free to discuss, et cetera. That was very compact now. Important: keep an eye on the topic, and watch how Google plays this next year. Follow it on the SEO blogs, because it may have a pretty significant impact, and this initiative will probably continue to expand. And take a look at your domain in Search Console. So, that's it.

Survey: Use of Google Web Vitals among participants

Angela Meyer: Thank you very much, Matts and Paavo, for your insights. Now we're going to start the Q&A session. Feel free to ask your questions via the GoToWebinar control panel now; we still have time. Right. On the subject of Google Web Vitals, I'll also start a short survey to see whether the attendees are already optimizing their e-commerce store for them. I'm starting the survey here. I'll give you a few seconds, and then you'll also have time to write down your questions. So, our participants-.

Christian Paavo Spieker: Then I'm curious to see what comes out.

Angela Meyer: Yes, well, we already have-. About 70 per cent of the participants are dealing with Google Web Vitals. But also around 20 per cent are not focusing on it yet.

Christian Paavo Spieker: You also have to say at this point: Google, of course, says that it will somehow flow into the algo in 2021 or whenever. We don't know to what extent it is already in there today; Google doesn't say that, of course. But we do think that page speed and everything that plays a role there, now the Core Web Vitals, already has some relevance today. That's a logical conclusion if the user experience is good. Matts has already said that Google can measure all the data directly via the Chrome browser and various extensions and add-ons. And that's why it's very, very important. Yes, well, we always call it Need for Speed. There is no such thing as too fast, as we all know.

Q&A

Angela Meyer: Thank you very much to the participants for your answers. Now the first questions came in.

Which Chrome plug-in did you use when you just looked at the pages of Cyberport, Notebooks, et cetera?

Matthias Hotz: Well, I don't know the name offhand, but there was only one; it's linked on web.dev. And if you look in the Chrome Web Store and search for Web Vitals, it should be the first hit. It's a very straightforward plug-in: it only shows the three values, and it has a red, green or yellow indicator at the top.

Chrome seems to offer an Experience feature in the DevTools performance report since version 84, which can analyze layout shift issues. Do you have any experience with this?

Matthias Hotz: Yes, that comes up more often. Well, I don't use the Lighthouse report that much within Chrome; I use the Coverage report sometimes. But that's precisely the data that probably goes into CrUX afterwards, because why would Chrome collect the data twice, et cetera. So, I think these Chrome Developer Tools, and these tools that say something about the health status of my website, are excellent, and I recommend using them. But always with a little bit of scepticism.

There's one misconception I always have to clear up: if you're sitting in a company and you have a gigabit line or a 100 Mbit line, that's not realistic, because most people don't have that line at home. They might have 16 Mbit, or maybe 20 or 30 Mbit, and if I'm on the road with LTE or 3G, I don't have that line either. That's why I always recommend either throttling in Chrome, which you can set, for example, when you select another device, or running the test via WebPageTest from somewhere other than your own Chrome on the powerful computer in the company.

Angela Meyer: Okay. Then I would like to ask you about the Adobe study. You had mentioned that 36 per cent of users abandon a visit to a website, et cetera.

Where can I find the Adobe study? Do you have a link?

Matthias Hotz: I would have to look it up again. It was still on the Adobe page, but otherwise, I would search for an Adobe user experience study. I think it was from 2015. Well, it was one of the largest, although it's been a few years now, it was representative. And above all, it was conducted by a company that knows a bit about this area.

Christian Paavo Spieker: Everything is always going in the direction of speed. And of course, we have done many relaunches on store systems. Now you also have to say that the Web Vitals, or simply the loading sequences of the JavaScript files, are of course often very bad with standard systems such as AEM, i.e. Adobe Experience Manager, or Hybris, or Magento in particular.

Out of the box, they dump a whole landfill of files at the front, and that's not so easy to influence. But we often noticed that when we gained speed, it had a significant effect on sales or the conversion rate. That has to be said quite clearly, even if we always thought in the agency: the store is fast, the measured values say so too. That's precisely the crux of the matter in the end: if you measure on a user basis, from the user's side, for many users, especially in the mobile area, it is relatively slow. And that costs conversion. That's why all the studies always say the same thing.

Yes, I always say: if it's slow, it's stupid. It costs sales. And as I said, especially with mobile: if you look at the mobile figures. I said earlier that our desktop share in e-commerce used to be 96 per cent; today, of course, it's the other way around, mobile is always higher than desktop. In other words, speed plays a very, very big role. There is simply no such thing as too fast. And whether that's Cloudflare with a CDN or whatever you do there, if you're a bit more international, or image optimization: of course, that can help immensely.

Matthias Hotz: Yes, but Cloudflare can't help if your CLS or FID doesn't work; it's no use there, and Google has noticed that. That's right, Cloudflare is a solid system and has its raison d'être, but it doesn't mean that my user experience will be better, because other factors come into play. And that-.

Christian Paavo Spieker: It's not getting any worse.

Matthias Hotz: No, but it's still-. That's what makes these Web Vitals interesting: this time, it's not about raw speed, but really about the perceived speed.

Angela Meyer: You are welcome to ask further questions now, or also afterwards to our experts. Matts, next slide, please. Nothing else has come in just now, so I will draw your attention to our other webinars, which we host every week. Next week it's all about SEA, specifically the topic: feed structure as the basis for creative excellence and user-oriented ad placements in SEA, with our SEA experts Christian Rainer and Dominik Langner from Paavo's team. We're looking forward to that, too. Register now; you can find all the information in the diva-e newsroom. And one more click, please. So, thank you both for your time and your insights, and thank you to the viewers who listened to us today. I hope you have a great day. Until next time.

Christian Paavo Spieker: Thank you.

Matthias Hotz: Ciao.