Stay up to date with the latest best practices in SEO with key takeaways from Google Search Central’s SEO Office Hours sessions.
We know it’s hard to keep up with the many ever-changing facets of search engine optimization. That’s why Deepcrawl’s team regularly attends these sessions and shares our notes, both in our recap blog series and in our full SEO Office Hours Library, where you can find tips and video clips for just about any technical SEO topic you can think of.
Are web components and JavaScript-only content bad for SEO? (Testing is key!)
One user asked whether web components are bad from an SEO perspective. Most web components are implemented with JavaScript frameworks, and Google can process most forms of JavaScript. John also mentions later in the video that sites that aren’t usable with JavaScript switched off typically aren’t a problem for Googlebot (as long as the relevant links and content are also available within the source code). However, John would always recommend testing a sample of pages with the URL Inspection tool before assuming your chosen JavaScript frameworks are supported.
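As a purely illustrative sketch of what that looks like in practice (the element and content below are hypothetical, not from the session), the safest pattern is to keep the links and copy you care about in the initial HTML and let the web component add behaviour on top:

```html
<!-- The critical text and link exist in the server-rendered HTML,
     so they are crawlable even before any JavaScript runs -->
<product-card>
  <h2>Acme Trail Shoe</h2>
  <p>Lightweight trail runner with a recycled upper.</p>
  <a href="/products/acme-trail-shoe">View product</a>
</product-card>

<script>
  // The component only enhances the existing content (here, a quick-view
  // toggle); it does not generate the content Google needs to index.
  class ProductCard extends HTMLElement {
    connectedCallback() {
      const button = document.createElement('button');
      button.textContent = 'Quick view';
      button.addEventListener('click', () => this.classList.toggle('expanded'));
      this.appendChild(button);
    }
  }
  customElements.define('product-card', ProductCard);
</script>
```

Either way, spot-checking a handful of templates in the URL Inspection tool and reviewing the rendered HTML is the quickest way to confirm Google sees what you expect.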
Get more key Office Hours takeaways on JavaScript rendering issues.
Spammy backlinks to 404 pages are ignored by Google by default
When asked how to deal with thousands of spammy backlinks, John was keen to reassure users that low-quality sites linking to 404 pages won’t impact your site negatively. Because the target is a 404, Google essentially treats the link as pointing to nothing and ignores it (for the same reason, it’s worth reviewing links from valuable sources that point to 404 pages and redirecting them so their value isn’t lost). If it’s just a handful of sites sending spammy backlinks to non-404 pages, the recommendation is to submit a domain-level disavow.
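For reference, a disavow file is a plain-text list uploaded through Search Console’s disavow links tool, with one entry per line; a domain-level entry covers every page on that site. The domains below are purely illustrative:

```text
# Example disavow file (domains shown here are placeholders)
# A domain-level entry disavows every link from that site
domain:spammy-links.example
domain:cheap-backlinks.example

# Individual URLs can also be listed if only specific pages are a problem
http://another-spam-site.example/links/page-123.html
```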
Learn more about how to disavow spammy backlinks.
You can ask Google not to translate your pages with a “notranslate” meta tag
Google’s translation feature aims to make your content accessible to a larger group of users, but there may be scenarios where you’d prefer pages not to be translated. It’s possible to prevent titles and snippets from being translated in search engine results pages with the “notranslate” meta tag. This signals to Google that a translation for that page is not required and will also stop users from being shown the ‘Translate this page’ prompt when they open a URL.
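In practice this is a single meta tag in the page’s head; the tag name and class below follow Google’s documented usage, while the surrounding page is just an illustration:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <!-- Asks Google not to offer a translation of this page in search results -->
    <meta name="google" content="notranslate">
    <title>Page title that should stay in its original language</title>
  </head>
  <body>
    <!-- The notranslate class can be used instead if only part of a page
         (e.g. a brand name or slogan) should be left untranslated -->
    <p class="notranslate">Brand slogan that must not be translated.</p>
  </body>
</html>
```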
Learn more about website internationalization and SEO.
Mixed-language pages can be confusing to Google
In general, Google uses the primary content of a page to determine which language it’s targeting. However, it’s also recommended to make title tags and headings match the page’s primary language. Having different elements of one page in different languages makes it harder for Google to work out how the page should appear in the index.
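As a quick illustration (the page below is hypothetical), keeping the lang attribute, title, headings, and body copy in the same language gives Google one consistent signal:

```html
<!-- All of the main on-page signals agree: the page targets French -->
<html lang="fr">
  <head>
    <title>Chaussures de randonnée légères</title>
  </head>
  <body>
    <h1>Chaussures de randonnée légères</h1>
    <p>Notre gamme de chaussures conçues pour les sentiers escarpés.</p>
  </body>
</html>
```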
Learn more about multiple site languages and SEO.
Legal and age-verification interstitials can affect crawling
Interstitials (for example, those that require users to verify their age before browsing) can have a negative impact on crawling and indexing if not implemented properly. Googlebot doesn’t click buttons or fill in forms, so any interstitial that requires those actions before the page is loaded can prevent Googlebot from reaching the content itself. The recommended approach is to use JavaScript/CSS to display the interstitial on top of content that has already loaded. Because the content is still loaded and users can access it once they’ve dealt with the interstitial, this would not be seen as cloaking.
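A minimal sketch of that pattern (the markup, IDs, and wording are ours, not from the session): the article is present in the initial HTML, and the age gate is simply an overlay layered on top, so Googlebot can still see the underlying page.

```html
<body>
  <!-- The real content loads normally and is present in the HTML -->
  <main id="content">
    <h1>Tasting notes: the 2021 vintage</h1>
    <p>Full article content here…</p>
  </main>

  <!-- The age gate is an overlay on top of the loaded content,
       not a wall the page is hidden behind -->
  <div id="age-gate" style="position: fixed; inset: 0; background: rgba(0, 0, 0, 0.85); color: #fff;">
    <p>Are you over 18?</p>
    <button id="confirm-age">Yes, continue</button>
  </div>

  <script>
    document.getElementById('confirm-age').addEventListener('click', () => {
      document.getElementById('age-gate').remove();
    });
  </script>
</body>
```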
Learn more about interstitial pop-ups and SEO.
Technical issues don’t tend to trigger core update drops
One user was concerned about running into new technical issues just before a Google core algorithm update was launched. If you see a negative impact after a core update, it’s usually based on data collected over a longer period, not just a reflection of technical issues present at the moment the update rolls out. It’s also worth remembering that technical issues generally don’t fall into the same category as the quality issues core updates focus on. Running into new technical issues just as a core update is released doesn’t mean your site will definitely be negatively impacted.
Read more SEO Office Hours takeaways on Google algorithm updates.
CWV fluctuations are common, even if you’ve not made changes to the page
One user noted seeing a lot of fluctuation in their Core Web Vitals (CWV) metrics, despite not making any changes to the pages themselves. Core Web Vitals metrics depend heavily on field data, which is collected from real users accessing your site. Users browsing from different locations and with different device/connection types can cause these fluctuations, as each experience is unique.
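If you want to see how that field data splits by device type, the Chrome UX Report (CrUX) API, which draws on the same real-user dataset, can be queried per form factor. A rough sketch, assuming a valid API key and using an example origin (both are placeholders):

```js
// Compare p75 Largest Contentful Paint for phone vs. desktop users
// via the CrUX API (the API key and origin below are placeholders).
const API_KEY = 'YOUR_API_KEY';
const endpoint = `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`;

async function p75Lcp(formFactor) {
  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      origin: 'https://www.example.com',
      formFactor,
      metrics: ['largest_contentful_paint'],
    }),
  });
  const data = await response.json();
  // p75 LCP is reported in milliseconds
  return data.record.metrics.largest_contentful_paint.percentiles.p75;
}

(async () => {
  console.log('Phone p75 LCP:', await p75Lcp('PHONE'));
  console.log('Desktop p75 LCP:', await p75Lcp('DESKTOP'));
})();
```

Comparing the two responses over time can help show whether a swing in your CWV report reflects a change to the page or simply a shift in who is visiting and how.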
Learn more in this Ultimate Guide to Core Web Vitals.