JavaScript Rendering & SEO
Search engines treat JavaScript content on a website differently from typical HTML content and render it separately. As the use of JavaScript on the web increases, thanks to the number of features it makes possible, it is important to understand how search engines view this content and how to optimize for it. Check out our “Hamburger Analogy” for an introduction to client-side vs. server-side JavaScript rendering.
Our SEO Office Hours notes and video clips compiled here cover JavaScript-related SEO best practices from Google, along with notes on the latest advancements to its rendering engine.
Injecting Markup Using JavaScript Isn’t Recommended
Avoid injecting schema markup into a page using JavaScript, as this makes it more difficult to debug and test that the schema is working correctly. Include the markup in the page’s source code where possible.
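As a rough illustration, structured data can sit directly in the HTML source as a JSON-LD block rather than being added by a script after load; the organization details below are purely illustrative:

```html
<!-- JSON-LD included in the page's source HTML rather than injected by JavaScript -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/"
}
</script>
```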
Malware Can be Served via Your Site if You Serve External Content via JavaScript
If you serve third-party content via JavaScript, be aware that if the websites hosting that content are hacked or change their JS code to serve malware, Google will flag your site as serving malware.
Combine Separate CSS, JS & Tracking URLs to Increase Googlebot Requests to Your Server
To improve site speed and allow Googlebot to send requests to your server more frequently, reduce the number of external URLs that need to be loaded. For example, combine your CSS files into a single URL, or into as few URLs as possible.
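As a minimal sketch of this idea, a build step can concatenate individual stylesheets into one file so each page references a single URL; the directory and output paths below are hypothetical:

```js
// Concatenate individual CSS files into one bundle so pages request a single stylesheet URL.
const fs = require('fs');
const path = require('path');

const cssDir = './styles';             // hypothetical source directory
const outFile = './public/bundle.css'; // single combined stylesheet served to the page

const combined = fs.readdirSync(cssDir)
  .filter((file) => file.endsWith('.css'))
  .map((file) => fs.readFileSync(path.join(cssDir, file), 'utf8'))
  .join('\n');

fs.writeFileSync(outFile, combined);
```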
Fetch & Render Tool in GSC Doesn’t Reflect Real Rendering
Getting ‘temporarily unreachable’ messages in the Fetch & Render tool doesn’t reflect how Google is rendering content for its index. Google’s rendering service has a longer cutoff time and uses caching.
Scroll Events Shouldn’t be Used in Isolation to Execute Lazy-loading
Scroll events aren’t always the best trigger for lazy-loading: they are expensive, desktop users may resize their window to reveal more content without triggering a scroll event, and Google doesn’t scroll. Use the Intersection Observer API instead, and test that lazy-loading is working with Fetch & Render.
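A minimal sketch of Intersection Observer-based lazy-loading, assuming images are marked up with a data-src placeholder attribute (the selector and attribute name are illustrative):

```js
// Lazy-load images when they enter the viewport, without relying on scroll events.
const images = document.querySelectorAll('img[data-src]');

const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap the placeholder for the real image URL
      obs.unobserve(img);        // stop watching once the image has loaded
    }
  });
});

images.forEach((img) => observer.observe(img));
```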
Google Can Process JavaScript Redirects as Long as it Can Crawl Them
JavaScript redirects don’t usually cause any problems for Google as long as it can crawl them, and they are treated as regular redirects. Make sure these redirects aren’t disallowed from crawling, however, as Google won’t be able to process them.
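As a simple illustration, a JavaScript redirect might look like the following; the target URL is illustrative:

```js
// A JavaScript redirect that Google can treat like a regular redirect,
// provided the page and the script performing it are crawlable.
window.location.replace('https://www.example.com/new-page/');
```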
Client-side Rendering Doesn’t Work for Facebook & Twitter’s Crawlers
Be mindful that other crawlers, specifically the ones used by Facebook and Twitter, don’t support client-side rendering. Any Open Graph Tags or Twitter Cards implemented with JavaScript need to be server-side rendered or dynamically rendered.
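A minimal sketch of serving Open Graph and Twitter Card tags from the server, assuming an Express app; the route, page data, and URLs are illustrative assumptions:

```js
const express = require('express');
const app = express();

// Render social meta tags into the HTML response itself, so crawlers that don't
// execute JavaScript can still read them.
app.get('/article/:slug', (req, res) => {
  const page = { title: 'Example Article', image: 'https://www.example.com/image.jpg' }; // illustrative
  res.send(`<!doctype html>
<html>
  <head>
    <meta property="og:title" content="${page.title}">
    <meta property="og:image" content="${page.image}">
    <meta name="twitter:card" content="summary_large_image">
  </head>
  <body>...</body>
</html>`);
});

app.listen(3000);
```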
Large Sites with Frequently Changing Content Should Use Dynamic Rendering Rather Than Client-side
Large websites and websites with fast-changing content are recommended to implement dynamic rendering rather than client-side rendering. Client-side rendering can cause a delay in indexing and can also cause UX issues, especially on mobile.
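A minimal sketch of dynamic rendering, assuming an Express app; the bot pattern, prerender helper, and routes are illustrative assumptions rather than a complete implementation:

```js
const express = require('express');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|facebookexternalhit|twitterbot/i;

// Hypothetical helper: in practice this might call a headless browser or a prerender service.
async function getPrerenderedHtml(url) {
  return `<!doctype html><html><body>Pre-rendered content for ${url}</body></html>`;
}

app.get('*', async (req, res) => {
  if (BOT_PATTERN.test(req.headers['user-agent'] || '')) {
    res.send(await getPrerenderedHtml(req.originalUrl)); // crawlers get static HTML
  } else {
    res.sendFile('index.html', { root: 'public' });      // users get the client-side app shell
  }
});

app.listen(3000);
```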
Google Can Discover URLs for Crawling if They Are Included in Full as Links within JavaScript
JavaScript links aren’t the same as HTML links, but if you include a full URL within a JavaScript link then Google will try to follow it.
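As a rough illustration, a URL that appears in full as a string within JavaScript can be discovered, whereas one assembled from fragments at runtime is much harder for Google to pick up; the URLs and element ID below are illustrative:

```js
// Discoverable: the complete URL is present as a single string in the script.
document.getElementById('next-page').addEventListener('click', () => {
  window.location.href = 'https://www.example.com/category/page-2/';
});

// Harder to discover: the URL only exists once the pieces are joined at runtime.
const base = 'https://www.example.com/category/';
const nextPage = base + 'page-' + 3 + '/';
```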
Google Won’t Fetch Third-party Scripts that Aren’t Deemed to be Useful
Google is getting better at recognising third-party scripts that aren’t useful for it to fetch and render, so it will avoid fetching those where it can.