Google Webmaster Hangout Notes: Friday 1st July 2016

Notes from the Google Webmaster Hangout on 1st July 2016, in which John Mueller discusses a wide range of SEO topics.

 

Genuine Links From Unrelated Sites are OK

A genuine recommendation link from a website whose content is unrelated to yours is fine, and doesn't need to be nofollowed.

 

External Links Don’t Affect Rankings

External linking to relevant or useful sites doesn't have any impact on search rankings.

 

Internal Linking of Detail Pages is Helpful

Internal cross-linking between your detail pages, such as product pages or blog posts, helps search engines find pages and understand the relationships between them.

 

Authorship Markup Doesn’t Need to be Removed

Google don’t use authorship markup at all, but you don’t need to remove it until it’s convenient.

 

Googlebot Doesn’t See Robots Meta Tags on Redirected URLs

If a URL redirects, Google won't see any robots meta tags on that page, although it can still see a noindex in the HTTP headers (via X-Robots-Tag).
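
To illustrate, here's a minimal Python sketch (the URL and function name are illustrative, not from the hangout) that follows a URL's redirect chain and prints the X-Robots-Tag header on each hop, since an HTTP header is the only place a noindex can be picked up on a redirecting URL:

```python
import requests

def check_noindex_signals(url: str) -> None:
    """Follow a URL's redirect chain and report noindex signals.

    A robots meta tag in the HTML of a redirecting URL is never
    seen by Google, but an X-Robots-Tag HTTP header on any hop
    in the chain is.
    """
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds each redirecting response, in order.
    for hop in response.history + [response]:
        x_robots = hop.headers.get("X-Robots-Tag", "(none)")
        print(f"{hop.status_code}  {hop.url}  X-Robots-Tag: {x_robots}")

check_noindex_signals("https://example.com/old-page")
```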

 

Some Search Console Reports are Sampled

Some Search Console reports, such as the structured data and AMP reports, are based on a significant sample of the primary URLs and won't include every possible URL.

 

A JavaScript Modified Head can Break Meta Tags

If things like hreflang tags in the head are not being picked up, it might be due to a problem with the head section, perhaps one introduced by JavaScript modifying it. You can use the Inspect Element tool in Chrome to see the rendered page and validate that the head section is correct.
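
As a starting point, the sketch below (URL and helper name are hypothetical) pulls the hreflang annotations out of the unrendered HTML head; if they look correct here but Google isn't picking them up, compare against the rendered DOM in Chrome's Inspect Element, since JavaScript may be modifying the head at render time:

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def raw_hreflang_tags(url: str) -> list[tuple[str, str]]:
    """Extract hreflang annotations from the unrendered HTML head.

    This only sees the raw source; a JavaScript-modified head can
    still break these tags at render time, so cross-check against
    the rendered DOM in the browser.
    """
    html = requests.get(url, timeout=10).text
    head = BeautifulSoup(html, "html.parser").head
    if head is None:
        return []
    return [
        (link.get("hreflang", ""), link.get("href", ""))
        for link in head.find_all("link", rel="alternate")
        if link.get("hreflang")
    ]

for lang, href in raw_hreflang_tags("https://example.com/"):
    print(lang, href)
```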

 

Repeated Descriptions Will Result in Generated Snippets

You can use the same meta description on multiple pages, but Google is then more likely to generate a snippet from the page's content instead.
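
If you want to see where this applies on your own site, a rough sketch (URLs and function name are hypothetical) that groups pages by their meta description to surface repeats might look like this:

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def find_shared_descriptions(urls: list[str]) -> dict[str, list[str]]:
    """Group URLs by meta description to spot repeated ones."""
    by_description = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text,
                             "html.parser")
        tag = soup.find("meta", attrs={"name": "description"})
        if tag and tag.get("content"):
            by_description[tag["content"].strip()].append(url)
    # Descriptions used on more than one page: Google may ignore
    # these and generate snippets from page content instead.
    return {d: u for d, u in by_description.items() if len(u) > 1}

print(find_shared_descriptions([
    "https://example.com/page-a",
    "https://example.com/page-b",
]))
```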

 

No Penalty for Links to Duplicate Content

Linking to sites with duplicate content won't usually result in a penalty.

 

No Good Solution for Reactivating Pages

If you have pages which expire but are reactivated after a period of time, there isn't really a good solution. You can, however, use a sitemap to tell Google about URLs which have become active again, and use the unavailable-after meta tag to flag when a page will expire.
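
For the reactivation side, a minimal sketch (URLs and dates are hypothetical) that writes a sitemap whose lastmod values flag the reactivated URLs as recently changed might look like this:

```python
from datetime import date
from xml.etree.ElementTree import Element, SubElement, ElementTree

# URLs that have just been reactivated, with the date they changed.
reactivated = {
    "https://example.com/product/123": date(2016, 7, 1),
    "https://example.com/product/456": date(2016, 6, 28),
}

# A recent <lastmod> tells Google the URL has changed and is worth
# recrawling now that it's active again.
urlset = Element("urlset",
                 xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url, changed in reactivated.items():
    entry = SubElement(urlset, "url")
    SubElement(entry, "loc").text = url
    SubElement(entry, "lastmod").text = changed.isoformat()

ElementTree(urlset).write("sitemap-reactivated.xml",
                          encoding="utf-8", xml_declaration=True)
```

On the expiry side, the unavailable-after rule goes in the page itself, e.g. `<meta name="robots" content="unavailable_after: 1-Sep-2016 00:00:00 GMT">` (check Google's documentation for the accepted date formats).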

 

Don’t Deliberately Block Pop-ups for Googlebot

If you have a pop-up which doesn't show in Fetch and Render, then it probably isn't seen by Googlebot. But if you are using a technique to block it deliberately, this could be interpreted as cloaking and, in extreme situations, result in a manual penalty.

If Googlebot is able to see the pop-up, the content in the pop-up might be given more weight than the content on the page itself.
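
One rough way to check for accidental user-agent based serving is to fetch the page with and without Googlebot's user-agent string and compare what comes back (a sketch with a placeholder URL; it won't catch IP-based serving, so Fetch and Render remains the authoritative check):

```python
import requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def compare_responses(url: str) -> None:
    """Fetch a URL as a regular client and with Googlebot's UA.

    A large difference in the HTML returned can indicate user-agent
    based serving, which risks being treated as cloaking if content
    is deliberately hidden from Googlebot.
    """
    as_browser = requests.get(url, timeout=10).text
    as_googlebot = requests.get(
        url, timeout=10, headers={"User-Agent": GOOGLEBOT_UA}
    ).text
    print(f"default UA: {len(as_browser)} chars, "
          f"Googlebot UA: {len(as_googlebot)} chars")

compare_responses("https://example.com/")
```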

Sam Marsden

SEO & Content Manager

Sam Marsden is Lumar's former SEO & Content Manager and currently Head of SEO at Busuu. Sam speaks regularly at marketing conferences like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.
