
Google Webmaster Hangout Notes: September 5th 2017


Notes from the Google Webmaster Hangout on the 5th of September, 2017.

 

Wikipedia and MyBusiness Entries Help Google Understand Brands

It can be tricky for Google to identify brand names from individual words. Making sure a brand is listed in places like a Wikipedia page and Google My Business can help Google understand that a search term is a brand name rather than individual words.

 

SERPs Usually Restored After Manual Action is Removed

When an onsite manual action has been removed, the site is reprocessed and is usually shown in the same position as it was before. However, if the site was previously ranking because of something that goes against Google’s policies (e.g. spammy links), then it might rank differently.

 

Sites Removed From Index After Manual Action Can Take Two Weeks to be Reindexed

When Google removes a manual action that had completely removed a site from the index, it can take a couple of weeks for the site to be reindexed, as the whole site has to be recrawled.

 

Algorithms Exist(ed) to Check That App and Webpage Content is Equivalent

Google had (or has) a number of algorithms set up to double-check that app content is equivalent to the corresponding webpage content. John recommends using Firebase for app indexing, where you can submit support questions.

 

Aggregator Content Can Outrank Original Source if Latter is Lower Quality

The original source of content can be outranked by aggregator sites if the original is seen as lower in quality by Google. In this case, John recommends working on improving the overall quality of the site to prevent this from happening.

 

Google Folds Together Sites With Same Server, Content & URL Paths

If Google finds sites that use the same server and have the same content and URL paths, they will likely be seen as identical and folded together in search.

 

GSC Crawl Error Priority Ranked According to User Relevancy

Crawl error priority ranks errors based on how relevant they are for users, e.g. URLs that users are more likely to find on the site.

 

Google Validates Sitemap Files Immediately After Submission

The 50k URL limit for sitemaps is based on the number of entries or elements in the sitemap file (including alternate linked URLs), and this is validated immediately after the file is submitted. So if there are too many URLs in the sitemap file, you will be made aware of that straight away.
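As a rough illustration of how that entry count adds up before submission, here is a minimal Python sketch (not an official tool; the file name is a placeholder) that counts both the <url> entries and any xhtml:link alternates in a sitemap file:

```python
# Minimal sketch: count <url> entries plus xhtml:link alternates in a sitemap
# file to compare against the 50,000-entry limit. "sitemap.xml" is a placeholder.
import xml.etree.ElementTree as ET

NAMESPACES = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "xhtml": "http://www.w3.org/1999/xhtml",
}

def count_sitemap_entries(path):
    root = ET.parse(path).getroot()
    urls = root.findall("sm:url", NAMESPACES)
    alternates = root.findall("sm:url/xhtml:link", NAMESPACES)
    return len(urls) + len(alternates)

total = count_sitemap_entries("sitemap.xml")
print(f"{total} entries (limit per sitemap file: 50,000)")
```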

 

Increase in GSC 403 Errors Could be Due to CMS Bot or Denial of Service Protection

An influx of 403 Forbidden errors in Search Console could be due to bot protection or denial of service protection built in by a site’s host or CMS. This might be triggered if Google fetches a lot of pages in a short space of time. John recommends checking server logs to get a better understanding of what is happening. Fetch & Render can be used to check if Googlebot is blocked from specific URLs.
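If you want to do that log check yourself, a rough sketch along these lines can surface which URLs are returning 403s to requests identifying as Googlebot. The log path and the combined access-log format are assumptions and will vary by server setup:

```python
# Rough sketch: count 403 responses served to requests identifying as Googlebot
# in a combined-format access log. Log path and format are assumptions; note
# that the user-agent string can be spoofed, so verify genuine Googlebot separately.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

forbidden = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        if match.group("status") == "403" and "Googlebot" in match.group("agent"):
            forbidden[match.group("path")] += 1

# Show the ten URLs most often blocked for Googlebot
for path, hits in forbidden.most_common(10):
    print(f"{hits:>5}  {path}")
```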

 

Google Spot Checks Parameters Applied in GSC

Google tries to apply parameters set in the URL Parameters section of Search Console, like “Doesn’t affect page content”. However, Google also does spot checks on these URLs, stripping away the parameter and crawling the simplified URL to check that the setting isn’t wrong.
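To illustrate what “stripping away the parameter” amounts to, here is a small sketch of removing one query parameter from a URL to produce the simplified version. The parameter name “sessionid” and the example URL are just assumptions for illustration:

```python
# Small sketch: strip one query parameter from a URL to get the simplified
# version. The parameter name "sessionid" and the URL are example values.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_parameter(url, param):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True) if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

original = "https://www.example.com/products?id=42&sessionid=abc123"
print(strip_parameter(original, "sessionid"))
# -> https://www.example.com/products?id=42
```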

 

No Need to Nofollow Links Between Same Company on Different Domains

Links between separate parts of the same business on different domains don’t need to be nofollowed e.g. a link from your blog to your e-commerce site on a different domain.

Sam Marsden

SEO & Content Manager

Sam Marsden is Lumar's former SEO & Content Manager and currently Head of SEO at Busuu. Sam speaks regularly at marketing conferences, like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.
