
Webinar Recap: Making JavaScript Work for Search with Martin Splitt


For the latest edition of our DeepCrawl webinar series, we invited Martin Splitt, Developer Advocate on Google’s Webmaster Trends Analyst team, to join Ashley Berman Hale, answer audience questions, and explain the finer details of JavaScript and its impact on search engines.

It was really special to be able to have Martin join us and share his JavaScript experience and advice first-hand with our audience, and we’ve covered all of his key points in this recap post so no one will have to miss out.

You can watch the full recording of the webinar here:

We’d like to say a massive thank you to Martin for taking the time to join us and answer our audience’s questions. We’d also like to thank Ashley Berman Hale for doing a fantastic job of hosting the webinar, everyone who submitted questions for Martin, as well as everyone who tuned in live. We hope you gained some really valuable insights into JavaScript rendering that you can apply to your own websites!


 

Has there been any progress on updating Google’s WRS to run alongside the Chrome release schedule?

In Martin’s words, “we’re getting there.” Keep an eye out for upcoming Google announcements around this by checking the upcoming Google I/O and Chrome Dev Summit talks, as well as the Google Webmasters Twitter account. Martin wants this to happen as much as SEOs do, and he will get the news out about this as soon as he can. Progress is looking good so far though.


 

Can you rank server-side, client-side, hybrid and dynamic rendering from best to worst?

  1. Hybrid rendering – This method has a healthy mix of speed, performance and user experience. You get content into the first wave of indexing with server-side rendering, then JavaScript runs on top of this client-side to enhance the user experience. Hybrid rendering can be hard to implement on existing projects, so to help developers implement it, point them to Next.js if they’re using React, Angular Universal for Angular, and Nuxt.js for Vue (see the sketch after this list).
  2. Server-side rendering – Content is available straight away for search engine crawlers. However, it can be a complicated process to decide when server-side rendering should be used. For example, should it only be used when a user clicks on something? You may need to incorporate caching to help with the implementation.
  3. Dynamic rendering – This method is purely a workaround for search engine crawlers. The search engine’s request will be sent to a dynamic renderer which will send it back the rendered content. However, the user will still have to render content client-side so you don’t get any of the user experience benefits with dynamic rendering like you do with hybrid rendering.
  4. Client-side rendering.

Bear in mind that all of these rendering methods require code changes.
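
To make the hybrid option above more concrete, here is a minimal sketch of what hybrid (sometimes called universal or isomorphic) rendering looks like in Next.js, the framework Martin points React developers towards. The page, route, and API URL are illustrative assumptions, not something covered in the webinar.

```jsx
// pages/products/[id].js — a hypothetical Next.js page.
// The server renders the full HTML for each product URL, so crawlers see the
// content in the first wave of indexing; React then hydrates in the browser
// and client-side JavaScript enhances the experience from there.
export async function getServerSideProps({ params }) {
  // Illustrative API call; swap in your real data source.
  const res = await fetch(`https://api.example.com/products/${params.id}`);
  const product = await res.json();
  return { props: { product } };
}

export default function ProductPage({ product }) {
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```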


 

What’s the basic tool set needed to cover the fundamentals?

The Mobile-friendly Test is a good tool for checking JavaScript issues and errors early in the development process. SEOs should be involved in the development process from the beginning so they can run these tests and help spot issues early.

Google Search Console, Chrome DevTools and Lighthouse are also good tools to run tests with.
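
As a rough illustration of folding these checks into the development process, here is a minimal sketch that runs Lighthouse from Node against a placeholder URL and logs the performance and SEO scores. The URL is an assumption, and the CommonJS style assumes an older Lighthouse release (newer versions are ESM-only).

```js
// Minimal Lighthouse run from Node (requires the lighthouse and
// chrome-launcher packages). The URL is a placeholder.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const { lhr } = await lighthouse('https://example.com/', {
    port: chrome.port,
    onlyCategories: ['performance', 'seo'],
  });

  console.log('Performance:', lhr.categories.performance.score * 100);
  console.log('SEO:', lhr.categories.seo.score * 100);

  await chrome.kill();
})();
```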


 

What’s the best or safest JavaScript framework for getting content indexed?

React and Angular have the highest developer adoption and satisfaction rates, and Vue is an up-and-coming framework that’s growing in popularity. Martin has analysed all three of these frameworks and they all work: none of them are inaccessible to search engine crawlers, and none of them should be hurting your SEO efforts.

A note on Angular: If Angular sites aren’t showing up in search or the Mobile-friendly Test, this could be because Angular has a particular polyfill that’s disabled by default which needs to be enabled. Talk to your developer about doing this.

A note on React: React doesn’t need any additional polyfills to be enabled, but you need to use a library such as React Helmet to serve page titles and meta descriptions.
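
As a quick illustration of the React note above, here is a minimal sketch using React Helmet to set a title and meta description per page. The component and its props are hypothetical.

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Hypothetical product page: Helmet injects the title and meta description
// into the document head so they end up in the rendered HTML.
function ProductPage({ product }) {
  return (
    <>
      <Helmet>
        <title>{`${product.name} | Example Store`}</title>
        <meta name="description" content={product.summary} />
      </Helmet>
      <h1>{product.name}</h1>
    </>
  );
}

export default ProductPage;
```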

A note on Vue: Vue Router defaults to hash-based URLs (e.g. example.com/#/about), but switching to clean URLs is a simple one-line code change. Also use the vue-meta library to serve page titles and meta descriptions. There are upcoming videos planned for Martin’s JavaScript SEO video series which will tackle each of these frameworks in more detail, so keep an eye out for those.
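
And a similar minimal sketch for the Vue notes above, assuming Vue 2 with Vue Router and vue-meta. The route, titles and descriptions are placeholders.

```js
// main.js (sketch) — Vue 2 with Vue Router and vue-meta.
import Vue from 'vue';
import VueRouter from 'vue-router';
import VueMeta from 'vue-meta';

Vue.use(VueRouter);
Vue.use(VueMeta);

const router = new VueRouter({
  // The one-line change mentioned above: clean URLs instead of /#/ routes.
  mode: 'history',
  routes: [
    {
      path: '/about',
      component: {
        // vue-meta reads metaInfo to set the title and meta description.
        metaInfo: {
          title: 'About us | Example Store',
          meta: [{ name: 'description', content: 'A short description of the page.' }],
        },
        render: (h) => h('h1', 'About us'),
      },
    },
  ],
});

new Vue({ router }).$mount('#app');
```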


 

Is it recommended to build smaller sites with HTML & PHP rather than JavaScript?

This decision should be based on what you’re trying to do with the website, not the size of the site. The outcome of this choice won’t matter to Google, so it depends on your developers.

JavaScript is only necessary for highly dynamic, interactive sites where content automatically updates (e.g. price changes) and can be changed based on things like user interaction. If you’re building highly dynamic elements that are changing constantly then JavaScript web apps make a lot of sense here because they can handle very complicated business logic.

A static site that changes its content infrequently and only changes its content through manual publishing doesn’t need JavaScript. You can use static site generators for these kinds of websites.


 

How long does it take for extremely dynamic sites to be rendered by Google?

This depends on how Google rates the quality of your website and what its crawl budget is like. Problems mainly arise if you have more dynamically-changing pages than Google’s overall crawl budget allocation for your site can keep up with.

It’s difficult to put a hard date on rendering because this process relies on crawling first, and existing crawling queues still apply. If you increase Google’s ability to crawl your website more quickly, this will naturally decrease the time it takes for rendering.


 

What are polyfills? If you’re using JavaScript, do you need to worry about them?

A polyfill implements a feature that a browser doesn’t support natively, making that feature available to your code as if it were built in.

Polyfills have their own issues and caveats to be aware of. For example, the Shadow DOM Polyfill had a large file size of around 500kb, so you’ll need to test things like this to make sure they’re not contributing too much to page weight.

Polyfills provide a good experience for both users and search engines though, so overall they’re good to use.
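
A common way to keep polyfill weight under control is to load a polyfill only when the feature is actually missing. A minimal sketch, with the polyfill path as a placeholder:

```js
// Feature-detect before loading a polyfill so modern browsers don't pay
// the download cost. The script path is a placeholder for your own bundle.
if (!('fetch' in window)) {
  const script = document.createElement('script');
  script.src = '/polyfills/fetch-polyfill.js';
  document.head.appendChild(script);
}
```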


 

If Google fails to render a page, is there a risk of it seeing duplicate content?

It would be unusual for this to happen with a Single-Page Application (SPA). If for some reason the JavaScript fails and no HTML content is generated, Google will see a pointless, empty page and ignore it rather than treat it as a duplicate.


 

How can you make sure analytics can track JavaScript?

Each JavaScript framework has a router which decides what content to run and show based on the URL it sees. Find the central point of integration in your router where a route changes from A to B and the content is swapped, then edit that piece of code to fire a page view event to Google Analytics. Google Analytics has documentation on this and on tracking SPAs in general.
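
As an illustration, here is a minimal sketch of the kind of router hook Martin is describing, using Vue Router and gtag.js (assuming the gtag snippet is already loaded on the page). The property ID is a placeholder.

```js
// Fire a pageview on every client-side route change.
// 'UA-XXXXXXX-X' is a placeholder gtag.js property ID.
router.afterEach((to) => {
  gtag('config', 'UA-XXXXXXX-X', { page_path: to.fullPath });
});
```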

To prevent this from happening with server-side rendering, only render the content that’s needed on a URL basis.


 

What are the main items that should be server-side rendered?

The main content, structured data, page titles, meta descriptions, canonical tags, hreflang and date annotations. Structured data can be difficult to server-side render, but try to get it into the static content.
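
As a rough sketch of what that list can look like in a React app using React Helmet (the URLs, slugs and values are placeholders, and the component is hypothetical):

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Hypothetical article head: the items from Martin's list, emitted so they
// can be server-side rendered into the static HTML.
function ArticleHead({ article }) {
  const structuredData = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: article.title,
    datePublished: article.publishedAt,
  };

  return (
    <Helmet>
      <title>{article.title}</title>
      <meta name="description" content={article.summary} />
      <link rel="canonical" href={`https://example.com/articles/${article.slug}`} />
      <link rel="alternate" hrefLang="de" href={`https://example.com/de/articles/${article.slug}`} />
      <script type="application/ld+json">{JSON.stringify(structuredData)}</script>
    </Helmet>
  );
}

export default ArticleHead;
```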


 

What is your advice on generating and pre-rendering highly personalised content?

The first question is, will Google see this content? No, because Google is stateless and doesn’t store cookies. Also, highly personalised content isn’t very useful to display in the SERPs.

With dynamic rendering it can take up to 12 seconds to generate content, and Google uses page load performance as a ranking factor, so this could be an issue. Consider pre-rendering just the crucial pages to reduce rendering time and costs.


 

What are your thoughts on handling 404s generated by JavaScript?

404s can happen when the JavaScript within a web app hijacks the navigation through its push states or service worker caching. This means that the pages you’re navigating to appear to be working for you, but if anyone else tries to access those links, including Google, they will see an error. This is an issue that often causes SEOs to mistrust JavaScript and assume that JavaScript-powered websites can’t be indexed. Check your server configuration to resolve this.

Another thing to bear in mind is that some cloud hosting providers don’t support SPAs and will always serve a 404 error. However, they do allow custom 404 pages which can have content added to them. If you add content to these pages, they will work for users but they won’t show up in search because the server still responds with a 404 error. Make sure your pages have a 200 status code.
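
As a minimal sketch of the server-side fix using Express (the routes and file paths are assumptions): URLs that map to real app routes get the SPA shell with a 200, and everything else gets a genuine 404.

```js
// server.js (sketch) — serve an SPA with correct status codes.
const path = require('path');
const express = require('express');

const app = express();
const APP_ROUTES = ['/', '/products', '/about']; // placeholder list of real routes
const shell = path.join(__dirname, 'dist', 'index.html');

app.use(express.static(path.join(__dirname, 'dist')));

app.get('*', (req, res) => {
  if (APP_ROUTES.includes(req.path)) {
    res.status(200).sendFile(shell); // real pages respond with 200
  } else {
    res.status(404).sendFile(shell); // unknown URLs return a genuine 404
  }
});

app.listen(3000);
```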


 

How can you automate testing to see if JavaScript-powered content is visible to Googlebot?

There is nothing on offer for this right now. You could use the Google Search Console API to see how many pages are being indexed, and which ones are being skipped or have errors.

Google is looking into providing more support on this. However, Google wants to avoid opening up rendering as an API to give these insights, because this would eat into the resources Google needs for its own rendering. Once Google’s WRS catches up with Chrome, it will become easier to use tools like Puppeteer for this kind of analysis.
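
Until then, a rough do-it-yourself check with Puppeteer might look like the sketch below. The URL and the expected text are placeholders, and headless Chrome is only an approximation of Google’s rendering, not a guarantee of what Googlebot sees.

```js
// Render a page headlessly and check that key content appears in the DOM.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com/some-page', { waitUntil: 'networkidle0' });

  const html = await page.content();
  console.log(html.includes('Expected product name') ? 'content rendered' : 'content missing');

  await browser.close();
})();
```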


 

A lot of sites have scripts that, when rendered, insert iframes into the head. Does this cause specific issues?

HTML and browsers are designed in a way that allows you to make mistakes. An iframe can’t belong in the head because it is a visual element and belongs in the body. So, rather than panicking and breaking the entire page, the browser will see an iframe in the head and assume the body has started.

This means that some browsers will completely ignore titles or other important tags if they are below an iframe in the head. Make sure hreflang is included in the head and isn’t being mistakenly read as being in the body.

Be careful with third-party scripts and where they’re adding tags, and remember: iframes never belong in the head.
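
A quick way to spot this is to ask the browser where the tags actually ended up after parsing. A hedged console sketch:

```js
// Tags meant for the <head> can end up in the <body> if the parser hit an
// iframe and closed the head early. Run this in the browser console (or
// against your server-rendered / dynamically rendered HTML).
console.log('iframes injected into <head>:', document.head.querySelectorAll('iframe').length);
console.log('hreflang links still in <head>:', document.head.querySelectorAll('link[hreflang]').length);
console.log('hreflang links pushed into <body>:', document.body.querySelectorAll('link[hreflang]').length);
```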


 

What do you think is the potential cause for the divide between SEOs and developers?

Martin has seen firsthand that both sides have been burned by bad actors on the other side.

Also, developers can be resistant to changing the way they work because they need to be able to work as efficiently and productively as possible, so they usually stick with the tools they’re familiar with rather than learning new tools or different ways of working. This is a real problem in the developer community because efficiency isn’t the most important thing; user experience is. Developers should take the time to learn what makes a good web experience for the user, rather than sticking with what they know.

However, there is a real interest in the developer community around technical SEO and discoverability. No developer that cares about their work wants to work on a website that no one will see.

In the past, bad SEOs have spread fear and uncertainty around search issues by making claims without facts. Make sure you back up your requests with proof to avoid developers investing in projects that have no real benefit for performance, and could even backfire. Martin is trying to bridge this gap by providing more documentation on what actually works for Google so the good actors in both communities have the facts to be able to work together towards the right goals.


 

Sign up for our next webinar on dashboarding for enterprise SEO with Nick Wilsdon


If you’ve been enjoying our monthly webinars, then we’ve got many more to come. The next one will be with Nick Wilsdon who will be explaining how you can create and use effective dashboarding to win in enterprise SEO.
