
Google Search Console Tips

Google Search Console (previously called Google Webmaster Tools) is a free tool provided by Google that lets website owners monitor their site's search performance and traffic. GSC also offers search engine optimization recommendations and fixes. Our SEO Office Hours recaps below cover a range of advice from Google to help you better understand and get the most out of this fundamental SEO tool.

Google May Crawl Parameter URLs Set to Crawl: None in the Parameter Handling Tool in GSC

Google may still crawl parameter URLs, even if you set the parameter to crawl: none in GSC. If you want to ensure the URLs never get crawled, John recommends using the robots.txt file instead.
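
For example, robots.txt rules along the following lines would stop Googlebot from crawling URLs carrying a given parameter. This is only a sketch with a hypothetical "sessionid" parameter; adjust the patterns to your own parameter names and test them in GSC's robots.txt testing tool before deploying.

```
# Hypothetical example: block crawling of URLs with a "sessionid" parameter,
# whether it appears as the first or a later query parameter.
User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```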

1 Nov 2019

“Discovered Not Indexed” Pages May Show in GSC When Only Included in the Sitemap

Pages may show as “Discovered Not Indexed” in GSC if they have been submitted in a sitemap but aren’t linked to within the site itself.
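
As an illustration (with hypothetical URLs), a sitemap entry like the one below only tells Google that the URL exists; an ordinary internal link from an already-indexed page gives Google a crawl path and some context, which can help the page move beyond “Discovered”.

```xml
<!-- Sitemap entry alone: Google learns the URL exists, but finds no internal links to it. -->
<url>
  <loc>https://example.com/new-page/</loc>
</url>
```

```html
<!-- An internal link from a relevant, indexed page gives Google a crawl path and context. -->
<a href="/new-page/">Read about the new page</a>
```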

29 Oct 2019

GSC Registers Impressions Based on Results Served, Not What is Displayed in the Viewport

Search Console registers an impression when a result is served on the search results page, regardless of whether or not it is actually visible in the searcher’s viewport.

29 Oct 2019

Use Testing Tools to Identify if Your Site Has Hacked Content That is Being Cloaked

If you suspect your site contains hacked content that is being cloaked, John recommends using testing tools, including the Inspect URL tool in GSC, to identify whether Google is finding this content.
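
Outside of GSC itself, a rough command-line check is to fetch the page once with a normal browser user agent and once with a Googlebot user agent and compare the responses. This is only a sketch with a hypothetical URL, and it is not conclusive: hacked cloaking often keys off IP addresses as well, which is why the Inspect URL tool (showing what Googlebot actually fetched) remains the more reliable check.

```
# Fetch the page as a regular browser and as Googlebot, then compare the responses.
# Unexpected injected links or keywords in the Googlebot version can point to cloaked hacked content.
curl -s -A "Mozilla/5.0" "https://example.com/some-page/" -o as-browser.html
curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
     "https://example.com/some-page/" -o as-googlebot.html
diff as-browser.html as-googlebot.html
```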

18 Oct 2019

Only One Version of the Same Content on Different Country Sites Will Be Indexed & Appear in GSC Performance Reports

If you have the same content across multiple country or language versions of a site, Google will pick one version to index but will use hreflang annotations to swap in the appropriate version of the page based on the searcher’s location. However, only the version that has been chosen to be indexed, and used as the canonical, will be displayed in the GSC performance report.
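
As a sketch with hypothetical URLs, this is the kind of hreflang markup involved: each country version points to the others, Google indexes one of them as the canonical, and that canonical URL is the one that accumulates data in the GSC performance report, even when a localised alternate is the URL actually shown to searchers.

```html
<!-- Hypothetical hreflang annotations for US and UK versions of the same page. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/page/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```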

18 Oct 2019

Google Has a Separate User Agent For Crawling Sitemaps & For GSC Verification

Google uses a separate user agent to fetch sitemap files, and another for GSC site verification. John recommends making sure you are not blocking either of these.

1 Oct 2019

The New Fresh Data Seen in GSC will Differ Slightly Until Things Have Settled Down

The new fresher data now available in Search Console is calculated in a slightly different way, so that it can appear in Search Console as quickly as possible. John noted that, as a result, the counts may differ slightly between when the data is fresh and after it has settled down.

27 Sep 2019

When Displaying Rankings for Pages, GSC Will Count the Topmost Position on That Search Results Page

If a page is shown more than once in search results, for example as a featured snippet and again in the fourth position, GSC will report the topmost position for it; in this example, the page would be reported at position one. However, this also depends on whether you are looking at the data at a URL level or a query level.

20 Sep 2019

When Displaying Page Positions, GSC Will Start at Position One

If your page is being shown as a featured snippet in position zero, it will appear as position one in Google Search Console, as the numbering they use starts at one.

20 Sep 2019

If Content is Only Loaded After an Interstitial is Accepted, Google Will Not Be Able to Index the Page

Googlebot does not interact with interstitials, e.g. a GDPR consent notice; it simply tries to crawl and render the page as it is served. If the interstitial is an overlay on top of the HTML content, Google will still be able to see the rest of the page. However, if the content is only loaded after the interstitial is accepted, Google will only see the interstitial and will try to index that, rather than the actual content.
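
The difference can be illustrated with two simplified, hypothetical patterns. In the first, the content is already in the HTML and the interstitial is just an overlay, so Google can index the page; in the second, the content is only fetched after the visitor clicks “Accept”, so a crawler that never clicks only ever sees the interstitial.

```html
<!-- Pattern Google can index: the content is already in the HTML and the
     interstitial is just an overlay that a script dismisses. -->
<div id="consent-overlay">
  We use cookies. <button onclick="dismiss()">Accept</button>
</div>
<main>Actual page content, present in the initial HTML.</main>

<!-- Pattern Google cannot index: the page ships only the interstitial, and the
     content is fetched after the visitor clicks Accept, which Googlebot never does. -->
<div id="consent-gate">
  We use cookies. <button onclick="loadContent()">Accept</button>
</div>
<main id="content"></main>
<script>
  function dismiss() {
    document.getElementById('consent-overlay').remove();
  }
  function loadContent() {
    // Hypothetical endpoint holding the real page content.
    fetch('/page-content.html')
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.getElementById('content').innerHTML = html;
        document.getElementById('consent-gate').remove();
      });
  }
</script>
```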

20 Sep 2019
