If you weren’t able to join us for the webinar you can find the recording and key takeaways in our recap post here.
Does the 5 second rule still matter for Googlebot?
Although there is no exact timeout for Googlebot, many experiments (including ours at Elephate) show that Google GENERALLY doesn’t wait more than 5 seconds for a script. I recommend making your scripts as fast as possible.
John Mueller agrees: “5 seconds is a good thing to aim for, I suspect many sites will find it challenging to get there though”
Technical aspects aside, I don’t think this is a discussion worth focusing on. If your website has a script that takes 5 seconds to load, you are not going to rank well anyway.
WordPress themes have a lot of client-side JS, how should you deal with this?
I would approach it the same way as other JS-based websites.
Make sure that Google can see the menu items/content hidden under the tabs. Can Google see the entire content copy or just an excerpt?
Also, if you keep enabling more and more plugins, the process of JS execution may take a lot of time, which will have a negative effect on both Google and users. Apart from JS rendering, it’s generally good practice to use only necessary plugins, in order to keep WordPress lean and mitigate the risk of hacking.
What are some good future steps for an SEO in terms of JS?
A good next step is learning to foresee potential obstacles like timeouts or content that requires user interaction, as I’ll talk about later.
Would you want to update your JS crawl experiments for Chrome 66?
Google has made claims in the past about being able to render JS quite well. So hamburger icons and “read more” buttons shouldn’t be a problem, right?
The first rule you should follow is to avoid event-based scenarios. If content or links are injected into the DOM only after a button is clicked, Google will not pick them up.
“If you have an online store and content hidden under a “show more” button is not apparent in the DOM before clicking, it will not be picked up by Google. Important note: It also refers to menu links.”
So, right-click -> Inspect (or CTRL + Shift + I) and see if the menu items appear there. If not, you can be almost sure Google will not pick up your hamburger menu.
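To sketch that check programmatically, here is a minimal Node.js example that compares the links present in the raw HTML against the links present in the rendered DOM. The HTML snippets and the regex-based link extraction are purely illustrative; a real audit would fetch both versions of the page and use a proper HTML parser.

```javascript
// Minimal sketch: compare links visible in raw HTML vs. the rendered DOM.
// A quick regex is enough for a rough check; a real audit would use a parser.
function extractLinks(html) {
  const links = [];
  const re = /<a\s[^>]*href=["']([^"']+)["']/gi;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

// Raw HTML as fetched (what Google sees before any click happens):
const rawHtml = '<nav></nav>';
// DOM after a click injected the menu (what only users see):
const renderedDom =
  '<nav><a href="/shoes">Shoes</a><a href="/bags">Bags</a></nav>';

// Links present after rendering but absent from the raw HTML are at risk:
const missing = extractLinks(renderedDom).filter(
  (href) => !extractLinks(rawHtml).includes(href)
);
// "missing" now lists links Google is likely to miss: ['/shoes', '/bags']
```

If `missing` is non-empty for your menu or "show more" content, that content is a candidate for being invisible to Google.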
How does the second phase of rendering affect the implementation of structured data using Google Tag Manager?
Chances are that Google will not immediately pick up your dynamically injected markup; this is related to two-phased indexing.
In your case, Google will be able to see your structured data when rendering resources become available.
Diagnosing structured data injected by Google Tag Manager is difficult since Google’s tools are not yet ready for auditing it. The Google Structured Data Testing Tool struggles with JS, whereas the Rich Results Testing Tool can deal with it but only supports a few schema types.
You should ask your developers!
If a site uses 5 unique templates using JS, is it necessary to check issues concerning crawling and rendering for the entire site? Can one get a fairly good sense of issues by sample crawling and rendering of these five templates?
Generally, you should be fine with crawling a sample of URLs. Just make sure that the sample is big enough to reflect the whole website.
If your website is relatively small, e.g. 50 pages, then take your time and analyze the whole website. If there are millions of URLs, sampling is fine. For each unique template, pick X URLs and analyze them.
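As a rough sketch of that sampling step, the function below picks up to X URLs per template from a crawl list. The template-detection rule used here (the first path segment) is an assumption for illustration; adapt it to however your site’s URLs actually map to templates.

```javascript
// Minimal sketch: pick up to `perTemplate` sample URLs per template.
// Assumption: the first path segment identifies the template (e.g. /product/...).
function sampleByTemplate(urls, perTemplate) {
  const buckets = {};
  for (const url of urls) {
    const template = new URL(url).pathname.split('/')[1] || 'home';
    buckets[template] = buckets[template] || [];
    if (buckets[template].length < perTemplate) {
      buckets[template].push(url);
    }
  }
  return buckets;
}

const sample = sampleByTemplate(
  [
    'https://example.com/product/red-shoes',
    'https://example.com/product/blue-bag',
    'https://example.com/product/green-hat',
    'https://example.com/category/shoes',
  ],
  2
);
// sample.product → the first 2 product URLs; sample.category → 1 category URL
```

You would then run each sampled URL through your rendering checks instead of crawling every page.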
If one uses React + Webpack to create an isomorphic JS web application to serve pages to crawlers and end-users, what type of issues does one need to worry about in this type of setup where all pages are generated by the server?
Isomorphic JS is the recommended solution, but still, there are a lot of things you should pay attention to:
- Check if Google can see your pagination, menu links, content hidden under tabs or “tap to read more” buttons.
- Make sure the whole DOM is not recreated once a page is fetched (it may happen if the re-hydration doesn’t work properly).
- Perform a mini SEO audit of your website. There may be many non-JS-based SEO issues (that happens with every website).
- Go through your server logs!
How would JS crawling/rendering affect organic performance for sites with short-lived content?
What would you recommend for pre-rendering? Pre-render.io?
It is hard for me to give you a good recommendation here. I think Prerender.io is an OK choice. If you use it, make sure you specify Chrome Headless as a rendering platform (it’s the most recent solution, and is considered much better than PhantomJS).
Pre-rendering is a complex topic as every website is different. Your key focus should be having somebody who can make sure that the implementation of the solution you choose is smooth and efficient.
How difficult is it for a developer to rebuild a client-side JS-rendered page to render only server-side? Is it as hard as rebuilding from scratch?
You mean isomorphic JS? Yes, unfortunately it’s difficult for developers to implement it. Pre-rendering is considered way less complicated from the developer’s point of view.
If you are delivering pre-rendered content to bots, will that still be subject to the two wave rendering process?
If you pre-render the content for Googlebot, it will receive a plain HTML website. So Google can discover your content and links in the first rendering wave.
Do you think the Rich Results and Mobile Friendly testing tools are actually executing JS? I’ve seen they don’t know the same elements as Screaming Frog’s store HTML & JS-rendered HTML
Yes, these tools do execute JS. You can learn more by looking at the table below:
The differences you encountered may be caused by the fact that Screaming Frog uses a newer version of Chrome (6x) for rendering, while these tools use an older one: Chrome 41. Also, Screaming Frog’s timeouts may differ from those of the Rich Results Test/Mobile-Friendly Test.
I think it should work. However, if you implement such a solution, make sure the pages are INDEXABLE by default. Then, if applicable, Angular can change the meta robots directives.
Test it using the Mobile-friendly testing tool to make sure Google can deal with it. After implementing such a solution, ensure there are no soft 404 pages indexed in Google.
P.S. Make sure you use the ng-if attribute and not ng-show. If you use ng-show, Google will get conflicting signals.
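To illustrate the difference, here is a hypothetical AngularJS template fragment (the `shouldNoindex` variable is made up for this example). `ng-show` only toggles CSS visibility, so the element stays in the DOM either way and Google can see conflicting directives; `ng-if` removes the element from the DOM entirely when the condition is false.

```html
<!-- ng-show only hides the tag with CSS; the noindex directive
     remains in the DOM even when shouldNoindex is false -->
<meta name="robots" content="noindex" ng-show="shouldNoindex">

<!-- ng-if removes the tag from the DOM when shouldNoindex is false,
     so Google only ever sees it when it actually applies -->
<meta name="robots" content="noindex" ng-if="shouldNoindex">
```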
Are there any major benefits of having Isomorphic JS over Pre-rendering?
Yes, of course!
Although Isomorphic JS is more difficult to implement, it has two major advantages:
- It is the solution that satisfies both users and search engines: users get the website faster, which is very important, especially on mobile devices, and search engines get a website that is understandable to them.
- It is easier to maintain as you only need to deal with one version of your website. If you were using pre-rendering, you would have to deal with two versions of your website.
If you’re interested in learning more on this topic, you can read my article on the subject.
How do you deal with #! in URLs? Is pre-rendering still effective with the way Google will be crawling with the new AJAX scheme?
Let me start with the second question. Yes, pre-rendering is still an effective technique, as long as you use something called “dynamic rendering”.
John Mueller spoke about it during the Google I/O conference.
Regarding your first question:
Google is getting rid of the old Ajax crawling scheme (hashbangs in URLs, escaped fragments). So if you want Google to get a pre-rendered version of your website you have to make sure you are using dynamic rendering, as explained above.
In your opinion, can an ecommerce site built on a JS framework be competitive in this market? What would you recommend?
A lot of ecommerce websites use JS frameworks. By using techniques like pre-rendering or isomorphic JS, they can still be competitive in the market.
However, if you ask about the client-side rendered JS, there are not many examples of ecommerce websites being successful in Google.
A notable example is Wayfair.com. They perform relatively well in search, but if you look deeper, you will easily notice they usually don’t rank for keywords related to particular products. The explanation is simple: sometimes Google can’t get the product listings because it encounters timeouts. If it can’t get the product listings, it can’t get the products; then it can’t rank the products high, and so the circle closes.
How would you run pre-rendering on the server?
It depends on the prerendering service you want to use, but generally, your developers shouldn’t struggle with it.
Here is how to implement pre-rendering using Prerender.io.
“The Prerender.io middleware that you install on your server will check each request to see if it’s a request from a crawler. If it is a request from a crawler, the middleware will send a request to Prerender.io for the static HTML of that page. If not, the request will continue on to your normal server routes. The crawler never knows that you are using Prerender.io since the response always goes through your server.”
In the Prerender.io documentation you can find information on how to install the aforementioned middleware for Apache and Nginx.
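The core of what any such middleware does can be sketched in a few lines of Node.js. This is a simplified illustration, not Prerender.io’s actual code: the user-agent list below is deliberately short, whereas the real middleware maintains a much longer one.

```javascript
// Minimal sketch of the middleware's decision: serve pre-rendered HTML to
// crawlers, the normal client-side app to everyone else.
// This bot list is illustrative, not exhaustive.
const BOT_AGENTS = ['googlebot', 'bingbot', 'yandex', 'baiduspider'];

function shouldPrerender(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return BOT_AGENTS.some((bot) => ua.includes(bot));
}

// In an Express app, the check would sit in front of the normal routes:
//   if (shouldPrerender(req.headers['user-agent'])) {
//     // fetch the static HTML for this URL from the pre-rendering service
//   } else {
//     next(); // regular client-side rendered response
//   }
```

Because the decision happens on your server, the crawler never knows a pre-rendering service is involved, which matches the quoted description above.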
https://groceries.asda.com/ thoughts? Interesting to see no performance in Bing despite the presence of HTML on pages. I’d be interested to see the reasoning behind how Google and Bing crawl the DOM and which is crawled first.
Asda.com is a pretty popular website; many people use it in the UK (500k users daily according to Alexa.com). I can easily imagine how much money they are losing because they are not accessible to Bing users.
The problem is even bigger in the US market where Bing handles 25% of search queries.
Can you give some insights on the impact of inline and external JS on SEO?
My experiments performed on five test websites showed Google was not following links if they were injected by an external JS file.
So if you’re creating a new JS-based website, you may fall into the same “trap”.
Need an example? Type the following command into Google: site:https://www.wayfair.com “Loading today’s best sellers”. Looking at the massive number of Wayfair’s pages, it’s clear that Google was not able to pick up the product listings.
Want More Like This?