JavaScript Rendering & SEO
Search engines treat JavaScript content on a website differently from typical HTML content and render it separately. As the use of JavaScript on the web increases thanks to the features it makes possible, it is important to understand how search engines view this content and how to optimize for it. Check out our “Hamburger Analogy”, an introduction to client-side vs. server-side JavaScript rendering.
Our SEO Office Hours notes and video clips compiled here cover JavaScript-related SEO best practices from Google, along with notes on the latest advancements to their rendering engine.
Ensure All Key Content Is Available if You Are Streaming Content
If a site is streaming content to a page progressively, John recommends ensuring that all key content is available immediately, because of the way Googlebot renders content. Any additional content that is useful for users but not critical for indexing can then be streamed progressively.
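One way to apply this advice is to inline all key content in the initial HTML payload and only stream the non-critical extras afterwards. The sketch below illustrates the idea; `Section` and `buildInitialHtml` are hypothetical names, not part of any framework.

```typescript
// Sketch: build the initial HTML payload so that all key content is present
// immediately, and only non-critical extras are streamed in later.
// `Section` and `buildInitialHtml` are hypothetical names for illustration.
interface Section {
  html: string;
  critical: boolean; // must be indexable, so it belongs in the first flush
}

function buildInitialHtml(title: string, sections: Section[]): string {
  // Key content goes straight into the initial document...
  const criticalHtml = sections
    .filter((s) => s.critical)
    .map((s) => s.html)
    .join("\n");
  // ...while the rest is left as a placeholder to be streamed progressively.
  const deferredCount = sections.filter((s) => !s.critical).length;
  const placeholder = `<div id="deferred" data-sections="${deferredCount}"></div>`;
  return `<!doctype html><html><head><title>${title}</title></head>` +
    `<body>${criticalHtml}${placeholder}</body></html>`;
}
```

The point of the split is that the indexable content never depends on the streaming mechanism; only user-convenience content does.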
Googlebot No Longer Needs to Convert Hashbang URLs into Escaped Fragments
Googlebot no longer converts hashbang URLs into escaped fragments; it is able to render and index them directly rather than using the pre-rendered version specified with the escaped fragment. Even so, John recommends moving to something that is URL-based rather than hash-based.
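Moving from hashbang routes to URL-based routes usually means mapping each `#!` fragment onto an equivalent path. A minimal sketch of that mapping, assuming a hypothetical `hashbangToPath` helper:

```typescript
// Sketch: map a legacy hashbang URL onto an equivalent path-based URL,
// e.g. https://example.com/#!/products/42 -> https://example.com/products/42.
// `hashbangToPath` is a hypothetical helper for illustration.
function hashbangToPath(url: string): string {
  const marker = "#!";
  const idx = url.indexOf(marker);
  if (idx === -1) return url; // already URL-based, nothing to do
  const base = url.slice(0, idx).replace(/\/$/, ""); // drop trailing slash
  const fragmentPath = url.slice(idx + marker.length);
  return base + (fragmentPath.startsWith("/") ? fragmentPath : "/" + fragmentPath);
}
```

On the client, such clean routes would typically be driven by the History API (`history.pushState`), with the server configured to serve content at the new paths and redirect the old hashbang URLs.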
Rendering JavaScript Content Will Still Take Longer than HTML Content
Despite the new update to Googlebot, JavaScript still needs to be rendered in a second wave of indexing. When Google indexes JavaScript content, rendering will still take a little more time, but this is typically no longer than a couple of days, depending on the site.
Be Cautious Implementing Infinite Scroll on Publishing Sites
John recommends being cautious when implementing infinite scroll on publishing sites, because loading multiple individual pieces of content into the same HTML can make it difficult for Google to recognize them as separate pieces of content.
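A common way to keep infinite scroll crawlable is to back each chunk of the feed with its own paginated URL, so every piece of content remains reachable at a distinct address. A sketch, assuming hypothetical `paginatedUrl` and `nextChunkUrl` helpers:

```typescript
// Sketch: give each chunk of an infinite-scroll feed its own crawlable,
// paginated URL instead of piling every article into one HTML document.
// `paginatedUrl` and `nextChunkUrl` are hypothetical helpers for illustration.
function paginatedUrl(baseUrl: string, page: number): string {
  if (page < 1) throw new Error("pages are 1-indexed");
  // Page 1 keeps the canonical base URL; later chunks get ?page=N.
  return page === 1 ? baseUrl : `${baseUrl}?page=${page}`;
}

// As the user scrolls, the client would fetch the next chunk from its
// paginated URL and (optionally) update the address bar to match, so Google
// can also reach each chunk directly through plain links.
function nextChunkUrl(baseUrl: string, currentPage: number): string {
  return paginatedUrl(baseUrl, currentPage + 1);
}
```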
Viewport is Expanded During Rendering & Likely to Trigger Infinite Scroll Once or Twice
When a page is rendered, Google expands the viewport and then contracts it to try to fit the primary content on the page. Expanding the viewport could trigger one or two infinite scrolls on pages with this functionality, but it would not cause Googlebot to keep triggering the scroll and crawl pages infinitely.
URL Inspection Tool Could Change to Include Desktop Rendered Screenshot
John cannot confirm that the URL Inspection tool will be changing to include a screenshot of the desktop rendered version of pages. However, John admitted that this is something that is missing and that he will put it on the Search Console team’s radar.
Google Processing JavaScript on Server-side Rendered Pages Suggests Implementation Error
For server-side rendered pages, Google shouldn't need to process any JavaScript, as this should already have been done by the server. If Google is processing JavaScript on these pages, check your implementation and add caching to resolve the issue.
Use Caching for Pre-rendered Content to Prevent Google Decreasing Crawl Frequency
If there is a 5-10 second delay for content to be pre-rendered between Google requesting and receiving a page’s content, this can cause Google to reduce its crawl frequency of the site. John recommends implementing caching to reduce response times for Googlebot.
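The caching John describes can be as simple as storing pre-rendered HTML keyed by URL with a time-to-live, so that only the first request pays the pre-rendering cost. A minimal in-memory sketch; `PrerenderCache` is a hypothetical name, and the clock is injected only to keep the example testable:

```typescript
// Sketch: cache pre-rendered HTML so Googlebot gets a fast response instead
// of waiting several seconds for on-demand pre-rendering on every request.
// `PrerenderCache` is a hypothetical name for illustration.
interface CacheEntry {
  html: string;
  expiresAt: number; // epoch milliseconds
}

class PrerenderCache {
  private entries = new Map<string, CacheEntry>();
  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  // Return cached HTML if still fresh, otherwise render once and store it.
  get(url: string, render: (url: string) => string): string {
    const entry = this.entries.get(url);
    if (entry && entry.expiresAt > this.now()) {
      return entry.html; // fast path: no pre-rendering delay
    }
    const html = render(url); // slow path: pre-render, then reuse until expiry
    this.entries.set(url, { html, expiresAt: this.now() + this.ttlMs });
    return html;
  }
}
```

In production this layer would usually live in a shared store or CDN rather than in process memory, but the shape of the logic is the same.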
URL Inspection Tool Shows Google’s Real-time Rendering Process
If the ‘View Crawled Page’ feature in the URL Inspection Tool is only showing the static HTML content of a page, this suggests that Google hasn’t rendered the page yet and it is still awaiting the second wave of indexing for rendered content.
Reduce the Number of JavaScript Files if Googlebot Is Executing JavaScript on Server-side Rendered Sites
If Googlebot is crawling a lot of JavaScript on a server-side rendered site, it is worth checking whether the JavaScript is referenced on the page and whether Googlebot is executing it. John recommends using fewer JavaScript files, caching, and rendering the page completely so there are no references to unnecessary JavaScript.
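A quick first check is to list the external script references left in the rendered HTML, since each one is a file Googlebot may still fetch and execute. A sketch, assuming a hypothetical `listScriptSources` helper (a real audit would use a proper HTML parser rather than a regex):

```typescript
// Sketch: scan a server-side rendered page for external script references,
// as a quick way to spot JavaScript files Googlebot would still fetch.
// `listScriptSources` is a hypothetical helper for illustration.
function listScriptSources(html: string): string[] {
  const sources: string[] = [];
  // Match <script ... src="..."> tags; inline scripts have no src and are skipped.
  const re = /<script\b[^>]*\bsrc=["']([^"']+)["'][^>]*>/gi;
  let match: RegExpExecArray | null;
  while ((match = re.exec(html)) !== null) {
    sources.push(match[1]);
  }
  return sources;
}
```

If the list is long, bundling files together and removing references that the fully rendered page no longer needs follows the advice above.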