Done well, search engine optimization (SEO) requires ongoing care and feeding – not just keywords, but attention to the technical SEO trends on the horizon. Here are three trends on our radar that we think will dominate in 2021. It’s not too soon to be thinking about next year: as you probably know, the groundwork for a successful SEO strategy must be laid early.
Page Experience – Core Web Vitals
At the end of May, Google announced a new page experience update that will go live in 2021. Google is constantly making algorithm updates, but with this much lead time, it’s safe to assume this one will be impactful.
User experience is already part of Google’s algorithm through factors such as site security (HTTPS), page speed, intrusive interstitials, mobile-friendliness, and safe browsing. The new update takes ranking factors that were already part of PageSpeed Insights, refines them, and weights them separately with more importance.
These new ranking factors are referred to as Core Web Vitals and can be measured in a new report in Google Search Console. They include:
– First Input Delay (FID) – FID measures interactivity: the delay between a user’s first interaction with the page (a tap, click, or key press) and the moment the browser can begin handling it. To ensure a good user experience, pages should have an FID of less than 100 ms.
– Largest Contentful Paint (LCP) – Not to be confused with First Contentful Paint, LCP measures loading performance: the render time of the largest content element visible in the viewport. This should occur within 2.5 seconds of the page starting to load to provide a good user experience.
– Cumulative Layout Shift (CLS) – This is a new factor added to page experience, and it measures the visual stability of elements on screen. CLS is a unitless score, not a time; sites should strive to keep pages below a CLS of 0.1.
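If you want to see these metrics on your own pages, the browser’s built-in PerformanceObserver API exposes the underlying signals (Google’s web-vitals library wraps the same data). Here is a minimal sketch for illustration – the logging is ours, not part of any Google tooling:

```js
// Sketch: log Core Web Vitals signals via the native PerformanceObserver API.

// Largest Contentful Paint – the last entry reported is the current LCP candidate.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  console.log('LCP (ms):', entries[entries.length - 1].startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });

// First Input Delay – time from the first interaction to when its handler can run.
new PerformanceObserver((list) => {
  const first = list.getEntries()[0];
  console.log('FID (ms):', first.processingStart - first.startTime);
}).observe({ type: 'first-input', buffered: true });

// Cumulative Layout Shift – sum the scores of shifts not caused by user input.
let clsScore = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (!entry.hadRecentInput) clsScore += entry.value;
  }
  console.log('CLS so far:', clsScore);
}).observe({ type: 'layout-shift', buffered: true });
```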
JavaScript SEO
As of 2020, most eCommerce websites use JavaScript, and this trend is expected to gain even more traction. JavaScript is the linchpin of dynamic, interactive website content. Think of the product recommendations conveniently displayed at the bottom of a page, or product listing pages that lazy load items as the user scrolls.
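As a concrete example, a listing page might lazy load more products with the browser’s IntersectionObserver API. This is only a sketch – the #load-more-sentinel element and the loadMoreProducts() helper are hypothetical names:

```js
// Sketch: fetch the next batch of products when a sentinel element scrolls into view.
// '#load-more-sentinel' and loadMoreProducts() are hypothetical names.
const sentinel = document.querySelector('#load-more-sentinel');

const observer = new IntersectionObserver((entries) => {
  if (entries[0].isIntersecting) {
    loadMoreProducts(); // fetches and appends the next page of items via JavaScript
  }
});

observer.observe(sentinel);
```

Content appended this way exists only after scripts run – which is exactly what makes it fragile for search engines.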
Now, imagine if Google were not able to properly index that content. It would have a significant impact on how those pages rank, wouldn’t it? Yes – it absolutely would.
Check if your JavaScript can be rendered and indexed
The first step to ensuring your JavaScript can be properly crawled and indexed is to check how Google renders the page. You can test this with the live URL test in Google Search Console’s URL Inspection tool, or with the Rich Results Test, which conveniently lets you see rendered page code for both Googlebot Smartphone and Googlebot Desktop. If the rendered page looks nothing like the live page on your site, it’s likely your JavaScript resources are not loading properly. You can also check the “More info” tab to see which resources failed to load and review the rendered code.
Optimizing JavaScript Resources
If you determine your JavaScript resources are not being rendered and indexed properly, the next step is to identify the best solution for your website.
Client-side rendering is one of the most common JavaScript pitfalls. It can be fixed by switching to server-side rendering or dynamic rendering, but there may be an easier fix to deploy before taking on such a large project.
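For context, dynamic rendering means detecting crawlers and serving them a prerendered snapshot while regular users get the client-rendered app. A hypothetical sketch using Express – the prerender service URL is invented for illustration:

```js
// Sketch of dynamic rendering: crawlers get a prerendered snapshot,
// regular users get the normal client-rendered app.
// 'render.example.com' is a hypothetical prerender service, not a real URL.
const express = require('express');
const fetch = require('node-fetch'); // node-fetch v2; Node 18+ has fetch built in
const app = express();

const BOT_UA = /googlebot|bingbot|baiduspider|twitterbot|facebookexternalhit/i;

app.use(async (req, res, next) => {
  const ua = req.get('user-agent') || '';
  if (!BOT_UA.test(ua)) return next(); // regular users fall through to the SPA
  try {
    // Ask the prerender service for a fully rendered snapshot of this URL.
    const snapshot = await fetch(`https://render.example.com${req.originalUrl}`);
    res.status(snapshot.status).send(await snapshot.text());
  } catch (err) {
    next(err);
  }
});

app.use(express.static('dist')); // the client-side rendered app
app.listen(3000);
```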
You should first try to diagnose which code is causing the rendering issues by looking at which sections of your page do not load properly. Once you’ve done that, check the following:
- Avoid errors in your JavaScript – A script error can stop JavaScript from loading any content on your site for crawlers and users alike, so it should be obvious when this happens.
- Do not block JS in robots.txt – While this is a very outdated practice, some CMSs still ship with it as a default. JavaScript should never be blocked and should always be accessible to search engine crawlers (see the sketch after this list).
- Avoid # in URLs – Googlebot ignores everything after a #, so URLs that rely on fragments will not be read properly and could prevent Google from reaching the content.
- Make sure hidden content can be read – If you use JavaScript to display hidden content, Google may not be able to crawl it unless it has a specific URL associated with it.
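On the robots.txt point above, the fix is usually a one-line change. A hypothetical example of the bad default and the correction (paths are invented for illustration):

```
# Hypothetical robots.txt. An outdated CMS default like this hides scripts:
#
#   User-agent: *
#   Disallow: /assets/
#
# If you must disallow a folder, carve out the script and style files:
User-agent: *
Disallow: /assets/
Allow: /assets/*.js$
Allow: /assets/*.css$
```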
For further explanation on potential solutions, check out Google’s guide on JavaScript Best Practices.
Social Sites do not process JavaScript
Social media sites such as Twitter and Facebook do not process JavaScript. Therefore, Open Graph (OG) tags and Twitter Cards must be implemented in the raw HTML your server returns. Failure to do this will prevent posts from showing up in the recommended format.
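For example, tags like these need to be present in the initial HTML response, not injected later by JavaScript (the URLs and values below are invented for illustration):

```html
<!-- Sketch: Open Graph and Twitter Card tags in the server-rendered HTML. -->
<head>
  <meta property="og:title" content="Blue Widget | Example Store">
  <meta property="og:type" content="product">
  <meta property="og:url" content="https://www.example.com/blue-widget">
  <meta property="og:image" content="https://www.example.com/img/blue-widget.jpg">
  <meta property="og:description" content="A sample product description.">
  <meta name="twitter:card" content="summary_large_image">
  <meta name="twitter:title" content="Blue Widget | Example Store">
  <meta name="twitter:image" content="https://www.example.com/img/blue-widget.jpg">
</head>
```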
Schema – Structured Markup
Schema is not a new concept, but it is constantly evolving. Schema markup gives search engines explicit context about your page content so they can easily understand what a page is about. Once added to a webpage, it can also make your pages eligible for rich snippets in search – essentially enhanced listings that are more attractive to users.
Rich snippets can include elements such as pricing information, review ratings, product availability, images, and videos that give detailed information about the content of your page.
This can be especially important for eCommerce sites that sell products similar to their competitors’. The first few listings on a search results page obviously get the most traffic; if title tags and descriptions are essentially the same, schema markup can help set you apart as the more attractive choice.
What types of schema should I put on my eCommerce website?
Our general philosophy is to add schema markup to anything possible. Giving search engines context around your site content can only make it easier for them to understand and rank your content.
With that said, these are some of the most common types of schema we add to our eCommerce sites (a sample Product markup follows the list):
- Organization
- Website
- Product
- Review
- Article
- Blog Posting
- FAQ
- Video
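As an illustration, here is a minimal, hypothetical Product markup in JSON-LD – all values are invented, and a real implementation would be generated from your product data:

```html
<!-- Sketch: Product schema in JSON-LD, placed in the page's HTML. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "image": "https://www.example.com/img/blue-widget.jpg",
  "description": "A sample product used to illustrate Product schema.",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "213"
  },
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```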
It’s also important to check the validity of the schema after it’s implemented. Warnings aren’t great, but they typically will not prevent search engines from validating the markup. Errors, on the other hand, can keep the markup from being read at all, so it will not be validated.
There are many technical elements to pay attention to in the evolving world of SEO – we feel these three will be the most important for a well-optimized site in 2021.