
Google Webmaster Hangout Notes: March 22nd 2019


Notes from the Google Webmaster Hangout on the 22nd of March 2019.

A Small Proportion of Thin Pages Is Not an Issue

Thin content is a normal occurrence on websites and shouldn’t be considered a critical issue if it only impacts a small proportion of pages. For example, large news publishers may have some shorter articles that still provide unique content.


Valid Structured Markup Is Not Necessarily Sufficient for Rich Results

Rich results won’t necessarily appear just because Google’s testing tools report that the structured markup on a page is valid. John recommends reading the relevant developer documentation to find out about the additional requirements for specific rich results.


Personalization Is Fine for Google, but the US Version Will Be Indexed

It is fine to personalize content to your users, but it is important to be aware that Googlebot crawls from the US and will index the content it crawls from the US version of the page. John recommends having a sizeable amount of content that is consistent across all versions of the page if possible.


Google Is OK with Domains Served from Different IP Addresses

Google is fine if domains aren’t always reachable from the same IP address. This is common with many sites that use CDNs where users from different locations are being dynamically allocated to different servers.


Soft 404s Cannot Be Passed to Other Pages via Redirects or Canonicals

A soft 404 cannot be passed on to another page via a canonical or redirect. Google ignores the content on a page if it detects that it is a 404 or soft 404.


Google Will Use Other Canonicalization Factors If the Canonical Is Noindex

Google would receive conflicting signals if a canonical points to a noindex page. John suggested that Google would rely on other canonicalization factors in this scenario to decide which page should be indexed, such as internal links.


Google No Longer Uses Rel=Next/Prev to Understand Pagination

Google no longer uses rel=next/prev to understand pagination. John stated that there are no changes that SEOs need to make if pagination is working properly in search, as long as internal linking is in place.


Google Doesn’t Treat Paginated Pages Differently From Any Other Pages

John explained that Google treats paginated pages in the same way as any other page on a website. While Google does try to understand how pages fit into the context of a website as a whole, it does not apply an attribute to pages to indicate that they are part of a paginated set.


Rel=next/prev May Still Be Useful Even Though No Longer Used by Google

Although rel=next/prev is no longer used by Google, John doesn’t recommend removing it from a site as it is still used by some browsers for prefetching and by other search engines.
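Because the annotations are still consumed by other tools, it can be worth auditing whether they are present and consistent. As a rough illustration, using Python's standard-library html.parser and a hypothetical paginated category page, rel=next/prev links can be extracted from a page's head like this:

```python
from html.parser import HTMLParser

class PaginationLinkParser(HTMLParser):
    """Collects rel="next"/rel="prev" link targets from a page."""
    def __init__(self):
        super().__init__()
        self.links = {}

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") in ("next", "prev"):
                self.links[attrs["rel"]] = attrs.get("href")

# Hypothetical HTML for a paginated category page
html = '''
<head>
  <link rel="prev" href="https://example.com/category?page=1">
  <link rel="next" href="https://example.com/category?page=3">
</head>
'''

parser = PaginationLinkParser()
parser.feed(html)
print(parser.links)
# {'prev': 'https://example.com/category?page=1', 'next': 'https://example.com/category?page=3'}
```

Running a check like this across a crawl makes it easy to spot pagination chains where one of the two annotations is missing or points to the wrong URL.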


Use Crawlers Like DeepCrawl to Understand Which Pages Can be Crawled

John recommends using crawlers like DeepCrawl and Screaming Frog to understand which product pages Google can crawl on an ecommerce site.


Google Can Index Pages Blocked by Robots.txt

Google can index pages blocked in robots.txt if they have internal links pointing to them. In a scenario like this, Google will likely use a title from some of the internal links pointing to the page, but the page will rarely be shown in search because Google has very little information about it.
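The underlying distinction is that robots.txt governs crawling, not indexing, and the crawling side is easy to verify locally. A minimal sketch using Python's standard-library urllib.robotparser, with hypothetical robots.txt rules and URLs:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration
rules = """
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Crawling /private/ is disallowed...
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
# ...but other paths remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```

Note that a `False` here only means the page cannot be crawled; as described above, the URL can still end up indexed if links point to it. Keeping a page out of the index entirely requires a crawlable page with a noindex directive instead.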


Sites Ranking for Unrelated Queries May Be Hacked

If a site starts ranking for completely unrelated queries that it doesn’t have content for, it is worth checking what content is being shown to Google, via the URL Inspection tool, to see if the site has been hacked.


Reduce the Number of JavaScript Files if Googlebot Is Executing JavaScript on Server-side Rendered Sites

If Googlebot is crawling a lot of JavaScript on a server-side rendered site, it is worth checking whether the JavaScript is referenced on the page and whether Googlebot is executing it. John recommends using fewer JavaScript files, caching them, and rendering the page completely so there are no references to unnecessary JavaScript.


Google Displays Publish Dates & Times Incorrectly if Receives Inconsistent Data

Google may display the published date and time incorrectly in search results if inconsistent data is provided on the page and in structured markup. Commonly, websites provide a time in the structured data that refers to a different timezone from the one shown on the page.
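One way to avoid this mismatch is to derive both the on-page date and the structured-data timestamp from the same timezone-aware value, emitted in ISO 8601 with an explicit offset. A minimal sketch in Python's standard library (the publish time and offset are hypothetical):

```python
from datetime import datetime, timezone, timedelta

# Hypothetical publish time: 9:00 local time in a UTC-5 timezone
local_tz = timezone(timedelta(hours=-5))
published = datetime(2019, 3, 22, 9, 0, tzinfo=local_tz)

# ISO 8601 with an explicit offset removes timezone ambiguity;
# both representations below describe the same instant.
print(published.isoformat())
# 2019-03-22T09:00:00-05:00
print(published.astimezone(timezone.utc).isoformat())
# 2019-03-22T14:00:00+00:00
```

Whichever form is used, the key point is that the visible date and the structured-data `datePublished` value should resolve to the same instant rather than being entered separately.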


Sam Marsden

SEO & Content Manager

Sam Marsden is Lumar's former SEO & Content Manager and currently Head of SEO at Busuu. Sam speaks regularly at marketing conferences, like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.

