
JavaScript Rendering & SEO

Search engines treat JavaScript content on a website differently from typical HTML content and render it separately. As the use of JavaScript on the web increases, thanks to the number of features it makes possible, it is important to understand how search engines view this content and to optimize for it. Check out our “Hamburger Analogy,” an introduction to client-side vs. server-side JavaScript rendering.

Our SEO Office Hours notes and video clips compiled here cover JavaScript-related SEO best practices from Google, along with notes on the latest advancements to their rendering engine.

What is the difference between JavaScript and HTTP redirects?

John explained that, in general, Google strongly prefers server-side redirects (301 or 302 redirects, for example) to JavaScript redirects. 

If you use JavaScript to generate the redirect, Google first has to render the JavaScript to see what it does, then spot the redirect and follow it. If you can’t do a server-side redirect, you can still use JavaScript; it just takes longer for Google to process. A meta-refresh redirect is another option, but again, it takes longer because Google has to figure it out.
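
To illustrate the difference, here’s a minimal sketch in Node.js with hypothetical URLs: the server-side 301 is visible in the very first HTTP response, while the JavaScript and meta-refresh options only surface after Google processes the page.

```javascript
// Server-side option (preferred): the 301 is visible in the first HTTP
// response, so Google can follow it without rendering anything.
// Node.js sketch with hypothetical paths.
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/old-page') {
    res.writeHead(301, { Location: 'https://example.com/new-page' });
    return res.end();
  }
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end('<p>Hello</p>');
}).listen(8080);

// JavaScript fallback: Google only discovers this after rendering the page,
// so the redirect takes longer to process.
// window.location.replace('https://example.com/new-page');

// Meta-refresh alternative (also slower for Google to work out):
// <meta http-equiv="refresh" content="0; url=https://example.com/new-page">
```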

22 Jun 2022

APIs & Crawl Budget: Don’t block API requests if they load important content

An attendee asked whether a website should disallow subdomains that are sending API requests, as they seemed to be taking up a lot of crawl budget. They also asked how API endpoints are discovered or used by Google.

John first clarified that API endpoints are normally used by JavaScript on a website. When Google renders the page, it will try to load the content served by the API and use it for rendering the page. Depending on your API and JavaScript setup, it might be hard for Google to cache the API results, which means Google may crawl a lot of the API requests to get a rendered version of your page for indexing.

You could help avoid crawl budget issues here by making sure the API results are cached well and that the API URLs don’t contain timestamps. If you don’t care about the content being returned to Google, you could block the API subdomains from being crawled, but you should test this first to make sure it doesn’t stop critical content from being rendered.

John suggested making a test page that doesn’t call the API, or that uses a broken URL for it, and seeing how the page renders in the browser (and for Google).
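
As a rough sketch of the caching side of this advice (the /api/products endpoint and values are made up), serving API results from stable URLs with an explicit cache lifetime makes it easier for Google to reuse responses during rendering:

```javascript
// Node.js sketch (hypothetical endpoint): serve API results from a stable URL
// (no timestamps or cache-busting parameters) with an explicit cache lifetime,
// so rendering doesn't require re-crawling the API for every page.
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/api/products') {
    res.writeHead(200, {
      'Content-Type': 'application/json',
      'Cache-Control': 'public, max-age=3600', // cacheable for an hour
    });
    return res.end(JSON.stringify({ products: [] }));
  }
  res.writeHead(404);
  res.end();
}).listen(8080);
```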

22 Jun 2022

Are web components and JavaScript-only content bad for SEO? (Testing is key!)

One user asked whether web components are bad from an SEO perspective. Most web components are implemented in JavaScript frameworks, and Google can process most forms of JavaScript. John also mentioned later in the video that sites that aren’t user-friendly with JavaScript switched off typically aren’t a problem for Googlebot (as long as the relevant links and content are also available within the source code). However, John would always recommend testing a sample of pages using the URL Inspection tool before assuming your chosen JavaScript frameworks are supported.
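
One way to honor the “available within the source code” caveat is progressive enhancement; the sketch below uses made-up element and class names and assumes the critical link already exists in the static HTML:

```javascript
// Sketch: a custom element that progressively enhances its light-DOM content.
// The link below is present in the static HTML, so crawlers can follow it
// even if this script never runs.
class ProductCard extends HTMLElement {
  connectedCallback() {
    const link = this.querySelector('a');
    if (link) {
      link.classList.add('product-card__link--enhanced'); // hypothetical styling hook
    }
  }
}
customElements.define('product-card', ProductCard);

// Usage in the page source (the href stays crawlable):
// <product-card><a href="/product/123">Product 123</a></product-card>
```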

13 May 2022

Do Not Rely on Third-Party Cookies to Render Content

Because Chrome is going to block third-party cookies, and Google uses Chrome to render pages, any content that depends on third-party cookies to render won’t be seen by Google.
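
As an illustrative sketch (the URL and fallback helper are made up, not from the source), a page that fetches personalized content using a third-party cookie should always fall back to a cookieless default render:

```javascript
// Sketch: this cross-site request relies on a third-party cookie via
// credentials: 'include'. When third-party cookies are blocked (as in
// Google's rendering), the request arrives without the cookie, so the page
// must still render meaningful default content.
fetch('https://personalization.example.com/content', { credentials: 'include' })
  .then((res) => (res.ok ? res.text() : Promise.reject(res.status)))
  .then((html) => {
    document.querySelector('#main').innerHTML = html;
  })
  .catch(() => {
    renderDefaultContent(); // hypothetical fallback that needs no cookies
  });
```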

17 Mar 2020

Onclick “Load More” JavaScript Links Are Not Triggered During Rendering

Google doesn’t trigger “load more” JavaScript links during rendering, but it does use frame expansion, rendering pages with an extremely long viewport, to see whether the page loads more content into that viewport automatically.
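
A common way to keep such content reachable, sketched here with made-up markup and URLs, is to give the “load more” control a real href that Google can follow, while intercepting clicks for users:

```javascript
// Sketch: assumes markup like <a id="load-more" href="/category?page=2">Load more</a>
// and an endpoint that returns an HTML fragment. Googlebot won't click, but it
// can follow the href; users get in-place loading instead.
const loadMore = document.querySelector('#load-more');

loadMore.addEventListener('click', async (event) => {
  event.preventDefault(); // keep users on the current page
  const response = await fetch(loadMore.href);
  const fragment = await response.text();
  document.querySelector('#results').insertAdjacentHTML('beforeend', fragment); // hypothetical container
  loadMore.href = '/category?page=3'; // would normally come from the response
});
```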

21 Feb 2020

Different Rendering Processes are Used When Rendering a Page For Indexing & for Users

Googlebot doesn’t take the rendered DOM snapshot used for indexing at one specific time. This is mainly down to the way Google renders pages: different processes are used when rendering for indexing than when users access a page. As a result, elements on a site can be processed differently, and rendering a page for indexing purposes may take longer.

7 Feb 2020

JavaScript Redirects Take Slightly Longer For Google to Process Than 301 Redirects

JavaScript redirects take longer than 301 redirects for Google to understand, as the JavaScript needs to be processed first.

22 Jan 2020

Avoid Providing Google with Conflicting Canonical Tags When Working on JavaScript Sites

If you have a JavaScript site, John recommends making sure that the static HTML page you deliver doesn’t have a canonical tag on it; instead, use JavaScript to add one, so that you avoid providing Google with conflicting information. Google is able to pick up the canonical after rendering the page and then process and use it.
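
A minimal sketch of that setup (the URL is illustrative): the served HTML carries no canonical tag, and this script injects one, so the pre-render and post-render versions never disagree.

```javascript
// Sketch: inject the canonical tag client-side so the static HTML and the
// rendered page don't carry conflicting canonicals. URL is illustrative.
const canonical = document.createElement('link');
canonical.rel = 'canonical';
canonical.href = 'https://example.com/current-page';
document.head.appendChild(canonical);
```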

10 Jan 2020

Use Chrome DevTools and Google Testing Tools to Review a Page’s Shadow DOM

There are two ways to inspect a page’s shadow DOM in order to compare it to what Googlebot sees. The easiest is with Chrome DevTools: within the inspector you will see #shadow-root, which you can expand to display what the shadow DOM contains. You can also use any of Google’s testing tools and review the rendered DOM, which should contain what was originally in the shadow DOM.
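
For reference, here’s a minimal sketch of an element that creates a shadow root (the element name is made up); DevTools shows its subtree under the expandable #shadow-root node, and the rendered DOM in the testing tools should include its contents:

```javascript
// Sketch: a custom element whose content lives in an open shadow root.
// DevTools displays this subtree under "#shadow-root" in the Elements panel.
class GreetingBox extends HTMLElement {
  constructor() {
    super();
    const shadow = this.attachShadow({ mode: 'open' });
    shadow.innerHTML = '<p>Hello from inside the shadow DOM</p>';
  }
}
customElements.define('greeting-box', GreetingBox);
```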

10 Dec 2019

When Changing Frameworks on a Site, Test Incrementally to Reduce SEO Impact

When moving a site from HTML to a JavaScript framework, John recommends setting up test pages and using Google’s testing tools to ensure that everything on those pages is indexable. Once you have tested these elements, John suggests taking certain high-traffic pages on your site, converting them to the new framework, and reviewing the effect of changing them. It’s best to do this over a period of around a month so there is time for fluctuations to settle down.

29 Nov 2019
