JavaScript Rendering & SEO
Search engines treat JavaScript content on a website differently from typical HTML content and render it separately. As the use of JavaScript on the web increases due to the number of features it makes possible, it is important to understand how search engines view this content and how to optimize for it. Check out our “Hamburger Analogy,” an introduction to client-side vs. server-side JavaScript rendering.
Our SEO Office Hours notes and video clips compiled here cover JavaScript-related SEO best practices from Google, along with notes on the latest advancements to their rendering engine.
Block Ads From Being Crawled to Avoid Ranking For Unintended Queries
Ads which are inline with the main text of a page can be picked up by Google as part of the content of that page. This could cause the page to rank for queries related to the text in the ad. John recommends preventing the ads from passing PageRank and, where the ads are loaded with JavaScript, blocking that JavaScript via robots.txt.
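As a minimal sketch of this approach, the snippet below assumes the ads are loaded by a script in a hypothetical `/ads/` directory; disallowing that path in robots.txt prevents Googlebot from fetching the ad script, so the ad text never becomes part of the rendered page content:

```text
# robots.txt — hypothetical paths for illustration
User-agent: Googlebot
Disallow: /ads/          # ad-serving scripts live here (assumption)
Allow: /
```

Note that robots.txt blocks crawling of the script itself, not indexing of the page; the page still renders normally, just without the ad content.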
There Will Continue to be a Delay Between Indexing & Rendering Due to Resource Issues
John explained that for the foreseeable future there will continue to be a delay between the initial indexing of the HTML and rendering, because rendering JavaScript requires resources and this can’t happen immediately with the current system.
Avoid Using Google Tag Manager to Implement Critical Tags Like Noindex
John suggests that search engines other than Google may struggle to process GTM tags. Also, because GTM tags are applied via JavaScript, they will only take effect when the page is rendered, which can be days or weeks after the initial HTML is indexed.
Sites Using a Lot of Flash or JavaScript Probably Won’t be Moved to Mobile-first Indexing
If a site has a lot of JavaScript or Flash content this could cause Google’s systems to decide that a site isn’t ready to be moved over to mobile-first indexing yet.
News Sites Should Avoid Content That Requires JavaScript to Load
If content isn’t present in the HTML when Google first indexes the page and instead requires JavaScript to load, it may not appear in the index until rendering takes place, which can be weeks later. This has a big impact on news sites, which rely on fresh content being indexed quickly.
Having Too Many Pages That Render Slowly Will Impact Google
If a site has millions of pages that take at least a few minutes each to render, then this will significantly impact Google’s ability to render and index the content on these pages.
Dynamic Rendering Can be Used to Show Googlebot Fully Rendered Pages
You can use dynamic rendering to serve Googlebot with pages that are already fully rendered, meaning there won’t be a gap between the initial indexing and rendering.
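The core of dynamic rendering is routing by user agent: known crawlers get pre-rendered HTML while regular visitors get the client-side app. The sketch below is a minimal, hypothetical illustration of that routing decision — the bot list and the return values stand in for a real pre-rendering setup (e.g. a headless-browser service) and are not a complete implementation:

```javascript
// Hypothetical dynamic rendering sketch: decide per request whether
// to serve pre-rendered HTML (for crawlers) or the JS app (for users).
const BOT_PATTERNS = [/Googlebot/i, /Bingbot/i]; // assumed bot list

function isKnownBot(userAgent) {
  // Match the User-Agent header against the known crawler patterns.
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ''));
}

function chooseResponse(userAgent) {
  // Crawlers receive fully rendered HTML, so there is no gap between
  // indexing and rendering; humans receive the client-side app shell.
  return isKnownBot(userAgent) ? 'prerendered-html' : 'client-side-app';
}
```

In practice this check would sit in server middleware, with the pre-rendered HTML produced by a headless browser or a pre-rendering service.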
Lazy-loading Images Can be Implemented Using Noscript Tags and Structured Data
For lazy-loaded images, it is important that Google can find the image source on the page. This can be provided using a noscript tag or structured data, so that even if Google doesn’t see the images when it renders the page, it knows they are associated with the page.
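A common pattern for the noscript approach looks like the fragment below; the `data-src` attribute and image path are hypothetical placeholders for whatever your lazy-loading script actually uses:

```html
<!-- Lazy-loaded image: a script swaps data-src into src on scroll (assumed setup) -->
<img data-src="/images/example-photo.jpg" alt="Example photo" class="lazyload">

<!-- noscript fallback so the image source is always present in the HTML -->
<noscript>
  <img src="/images/example-photo.jpg" alt="Example photo">
</noscript>
```

The noscript copy gives Google a plain `src` to discover even if the lazy-loading JavaScript never runs during rendering.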
JavaScript Injected Tags Should Not be Duplicated in Static HTML
Using JavaScript to modify canonical or robots meta tags can change the signal Google sees when it processes the rendered version of the page. If tags are injected using JavaScript, John recommends omitting them from the static HTML so that the signal provided to Google is unambiguous.
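As a hypothetical sketch of what such injection looks like, the function below adds a canonical link element at runtime; `doc` is passed in rather than using the global `document` purely so the sketch is self-contained. Per the advice above, a page using this should not also ship a (possibly conflicting) canonical tag in its static HTML:

```javascript
// Hypothetical client-side canonical injection. If this runs, the
// canonical only exists in the rendered DOM, not in the static HTML.
function injectCanonical(doc, href) {
  const link = doc.createElement('link');
  link.rel = 'canonical';
  link.href = href;
  doc.head.appendChild(link);
}
```

The risk the advice addresses: if the static HTML carried one canonical and this script set another, Google could pick up different signals pre- and post-rendering.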
Google Doesn’t Support ES6 as Chrome 41 is Used For Rendering
With modern JavaScript frameworks, it is important to remember that Google renders pages using Chrome 41, which only supports ES5. This means that Google may not be able to execute code written in ES6 unless it is transpiled to ES5.
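To illustrate the difference, the function below is a made-up example written in ES5 syntax (a `function` expression and string concatenation) that an ES5-only engine like Chrome 41 can run; the ES6 version in the comment uses an arrow function and a template literal, which Chrome 41 cannot parse:

```javascript
// ES6 version (would fail to parse in Chrome 41's ES5 engine):
//   const buildTitles = (pages) => pages.map((p) => `${p.title} | Example Site`);

// ES5 equivalent that Chrome 41 can execute:
function buildTitles(pages) {
  return pages.map(function (p) {
    return p.title + ' | Example Site';
  });
}
```

Transpilers such as Babel automate this ES6-to-ES5 conversion as a build step.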