
Google Webmaster Hangout Notes: October 30th 2018


Notes from the Google Webmaster Hangout on the 30th of October 2018. During this Hangout, John was joined by Martin Splitt, Developer Advocate at Google, who answered questions around JavaScript.


Large Sites with Frequently Changing Content Should Use Dynamic Rendering Rather Than Client-side

Large websites and websites with fast-changing content are recommended to implement dynamic rendering rather than client-side rendering. Client-side rendering can delay indexing and can also cause UX issues, especially on mobile.
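The core of dynamic rendering is a routing decision based on the requesting user agent: crawlers receive pre-rendered HTML, while regular visitors receive the client-side app. A minimal sketch of that decision is below; the bot pattern is an illustrative assumption, not an exhaustive list.

```javascript
// Dynamic rendering sketch: decide whether a request should get
// pre-rendered HTML (crawlers) or the client-side app (users).
// The bot pattern below is illustrative, not exhaustive.
const BOT_PATTERN = /googlebot|bingbot|facebookexternalhit|twitterbot/i;

function shouldServePrerendered(userAgent) {
  // Crawlers matching the pattern get server-rendered HTML so they
  // don't need to execute JavaScript before seeing content.
  return BOT_PATTERN.test(userAgent || "");
}
```

In a real setup this check would sit in front of the rendering layer (e.g. as middleware), routing matched requests to a pre-renderer.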

If you want to check to see how your website renders for search engines, find out about DeepCrawl’s JavaScript rendering capabilities.


Client-side Rendering Doesn’t Work for Facebook’s & Twitter’s Crawlers

Be mindful that other crawlers, specifically the ones used by Facebook and Twitter, don’t support client-side rendering. Any Open Graph Tags or Twitter Cards implemented with JavaScript need to be server-side rendered or dynamically rendered.
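In practice this means the social meta tags must already be present in the HTML the server sends. A hedged sketch of emitting them server-side is below; the `page` object and its fields are illustrative assumptions, not a real API.

```javascript
// Sketch: render Open Graph and Twitter Card tags on the server so
// social crawlers see them without executing any JavaScript.
// The `page` object and its fields are illustrative assumptions.
function socialMetaTags(page) {
  return [
    `<meta property="og:title" content="${page.title}">`,
    `<meta property="og:description" content="${page.description}">`,
    `<meta name="twitter:card" content="summary">`,
  ].join("\n");
}
```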


Google Can Process JavaScript Redirects as Long as it Can Crawl Them

JavaScript redirects don’t usually cause any problems for Google as long as it can crawl them, and they are treated as regular redirects. Make sure these redirects aren’t disallowed, however, as Google won’t be able to process them.
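A minimal sketch of such a redirect is below. The `typeof window` guard and the return value are there only so the snippet runs outside a browser; in a page, the `window.location.replace` call performs the navigation.

```javascript
// A JavaScript redirect that Google treats like a regular redirect,
// provided the script performing it isn't disallowed in robots.txt.
function redirectTo(url) {
  // replace() doesn't leave the old URL in the browser history, which
  // is the closest client-side equivalent to a server-side redirect.
  if (typeof window !== "undefined") {
    window.location.replace(url);
  }
  return url; // returned so the behaviour is observable outside a browser
}
```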


Scroll Events Shouldn’t be Used in Isolation to Execute Lazy-loading

Scroll events aren’t always the best trigger for lazy-loading: they are expensive to handle, desktop users may resize their window to reveal more content without ever scrolling, and Googlebot doesn’t scroll. Use the Intersection Observer API to trigger lazy-loading instead, and test that lazy-loaded content is being picked up by using Fetch & Render.
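A sketch of the Intersection Observer pattern is below, with an eager fallback where the API is unavailable. The `data-src` attribute and element shapes are illustrative assumptions.

```javascript
// Lazy-loading via IntersectionObserver rather than scroll events.
// Images carry their real URL in data-src (an illustrative convention)
// until they are about to enter the viewport.
function loadImage(img) {
  // Promote the real URL from data-src into src when the image is needed.
  if (img.dataset && img.dataset.src) {
    img.src = img.dataset.src;
  }
  return img;
}

function observeLazyImages(images) {
  // Where IntersectionObserver is missing (e.g. very old browsers),
  // fall back to loading everything eagerly so no content is lost.
  if (typeof IntersectionObserver === "undefined") {
    images.forEach(loadImage);
    return;
  }
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        loadImage(entry.target);
        observer.unobserve(entry.target); // load each image only once
      }
    });
  });
  images.forEach((img) => observer.observe(img));
}
```

The eager fallback also matters for crawlability: content that only appears on an event that never fires is content a crawler never sees.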


Website Owners Don’t Need to Specify What Google Should & Shouldn’t Render

Website owners don’t need to implement anything on their sites that would tell Google what is unnecessary to render, as it should be Google’s job to figure this out. Being selective and not rendering particular elements can also cause problems for a website.


Don’t Serve Critical JavaScript in the Head as This Can Block Rendering

Any JavaScript deemed critical will most likely have a significant file size, so it shouldn’t be served in the head, as this can delay rendering and make users wait longer before seeing any content. Where possible, serve priority content to users as quickly as possible without relying on JavaScript.
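One common way to keep the head render-friendly is to mark non-critical scripts with `defer`. The helper below is an illustrative sketch, not a real templating API; it just shows the attribute that makes the difference.

```javascript
// Sketch: emit a script tag with `defer` so the script downloads in
// parallel but only executes after the document has been parsed,
// keeping first paint unblocked. Illustrative helper, not a real API.
function scriptTag(src, { defer = true } = {}) {
  return `<script src="${src}"${defer ? " defer" : ""}></script>`;
}
```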


Using HTTP/2 Doesn’t Reduce Resource Cost for Google Rendering JavaScript

Serving JavaScript via HTTP/2 doesn’t reduce its cost, because Google still has to parse, compile and execute the JavaScript after the initial transfer, and these are the three most expensive parts of rendering.

To learn more, read the recap of our webinar on HTTP/2 with Tom Anthony of Distilled.


Google Won’t Fetch Third-party Scripts that Aren’t Deemed to be Useful

Google is getting better at recognising third-party scripts that aren’t useful for it to fetch and render, so will avoid fetching those where it can.


Google Can Discover URLs for Crawling if They Are Included in Full as Links within JavaScript

JavaScript links aren’t the same as HTML links, but if you include a full URL within a JavaScript link then Google will try to follow it.
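A minimal sketch of the distinction: when navigation is generated in JavaScript, emitting the complete URL inside a real anchor lets Google discover it, whereas a click handler alone may not. The helper name is an illustrative assumption.

```javascript
// Sketch: JavaScript-generated navigation that remains crawlable
// because the full URL appears in the href of a real anchor element.
function buildLink(url, text) {
  // A complete URL in the markup is discoverable by Google even when
  // the surrounding behaviour is driven by JavaScript.
  return `<a href="${url}">${text}</a>`;
}
```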


Localising URLs Doesn’t Cause a Problem for Google

If you decide to translate and localise the words in your URLs for different language versions of a site, then this won’t break anything from an SEO perspective.

Make sure you read DeepCrawl’s Ultimate Guide to International SEO to learn more about localisation and internationalisation strategies.


Geotargeting Can’t be Applied to Language Versions Which Are Split Out Using Parameters

Google is unable to automatically detect and apply geotargeting if different language or country versions of a site are separated out using parameters.


When Using Subfolders for International Site Structure, Have Country Before Language

If you are using subfolders for different country and language variations, these need to be verified separately in Google Search Console. Having the country subfolder before the language subfolder (e.g. example.com/ch/de/ rather than example.com/de/ch/) will make things much easier.


Include Content That Matches Old Website Versions When Merging Sites

When merging multiple sites into one, John recommends including some content on the destination domain that matches the original domains that have been merged.


Merging Internal or External Pages on One Topic Will Result in Higher Rankings

If separate sites that already rank well and focus on the same topic or service are merged, the combined site should see an increase in rankings, as Google sees an even stronger page than before.


When Users Set Up a Google Analytics Account, Google Will Automatically Set Up a Search Console Account

Google will automatically add a Search Console account for you when you set up a Google Analytics account.


The Search Console Update Filters Out Very Specific Queries for User Privacy

The apparent decrease in performance data in Search Console is because it was updated to filter out very specific user queries, and the performance data associated with them, for user privacy.

Learn more about how to combine Google Search Console data with crawl data in our guide to leverage more insights into your site’s performance.



Rachel Costello

SEO & Content Manager

Rachel Costello is a former Technical SEO & Content Manager at Lumar. You'll most often find her writing and speaking about all things SEO.

