Deepcrawl is now Lumar.

Google Webmaster Hangout Notes: April 3rd 2020

Lumar - SEO and Digital Marketing Industry News

Notes from the Google Webmaster Hangout on the 3rd of April 2020.


The Disavow File Is Processed on a Per-URL Basis

When Google recrawls individual links on a site, if they're mentioned in the disavow file, then Google will essentially drop the link. It's not something that site owners need to wait for; it happens incrementally.
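For context, a disavow file is a plain text file uploaded through Search Console, with one entry per line: either a single URL or a whole domain. The domains below are purely illustrative:

```
# Disavow individual spammy links
https://spam.example.com/paid-links.html
https://spam.example.com/widget-footer.html

# Disavow every link from an entire domain
domain:link-farm.example.net
```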


If You Need to Temporarily Shut Down Your Website Due to Covid-19, Ensure You Are Not Returning a 503 Code

An important thing to note from an SEO standpoint is to make sure you're not taking the pages or the whole website down, and not returning a 503 status code for an extended period. Either of these will cause Google to drop the pages from the search results and potentially from the index completely.
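For context on what a temporary-outage response looks like at the HTTP level, here is a minimal sketch using only Python's standard library and a throwaway local server (the handler, port, and Retry-After value are all illustrative):

```python
import http.server
import threading
import urllib.request
import urllib.error

class MaintenanceHandler(http.server.BaseHTTPRequestHandler):
    """Illustrative handler that answers every request with a temporary 503."""

    def do_GET(self):
        # 503 signals a *temporary* outage to crawlers
        self.send_response(503)
        # Retry-After hints when the crawler should come back (seconds)
        self.send_header("Retry-After", "86400")
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

# Start the demo server on an ephemeral port in a background thread
server = http.server.HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
try:
    urllib.request.urlopen(url)
except urllib.error.HTTPError as e:
    status, retry_after = e.code, e.headers["Retry-After"]

server.shutdown()
print(status, retry_after)  # 503 86400
```

The point of the note above is that this kind of response is fine only for short outages; left in place for an extended period, it leads to pages being dropped.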


Google Isn’t Specifically Filtering Out Sites Around Health Terms

John isn’t aware of Google filtering out sites specifically around health terms. Users have high expectations for the content listed in the search results, so it’s not about filtering things out, but about making sure the algorithms are working as expected to get the most relevant content in front of users.


Google Is Able To Discover and Crawl JavaScript Links

If a link is injected by JavaScript as a normal anchor element with an href attribute, Google can discover and crawl it. If you are using something else, like a button or an onclick handler, then Google cannot follow those.
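As a minimal sketch of that distinction (URLs and labels are illustrative):

```html
<!-- Crawlable: JavaScript injects a normal anchor with a real URL in href -->
<script>
  const a = document.createElement('a');
  a.href = '/sale-items';        // Googlebot can follow this
  a.textContent = 'Sale items';
  document.body.appendChild(a);
</script>

<!-- Not crawlable: no href, navigation only happens inside an event handler -->
<span onclick="location.href='/sale-items'">Sale items</span>
```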


If Your Site Has Issues With Keyword Cannibalization, John Recommends Focusing Efforts on One Page

When dealing with issues around keyword cannibalization, John recommends focusing efforts on one page. The more pages there are covering the same topic, the more difficult it is for Google to pick one of them, and the harder it is for users to figure out which page they should be looking at. It is also more difficult to maintain multiple resources on the same topic on your site.


Google Will Always Be Clear When Googlebot Is Crawling a Site

It is possible that a Google employee could visit a site via a browser; in this case, it wouldn’t show up as Googlebot. However, when the bot is crawling, it always declares itself as Googlebot, because Google wants to be really clear when it visits and indexes pages.


Anchor Text Helps Provide More Context To A Page

Google can combine a link’s anchor text with the surrounding page content to get a better understanding of the link in general, and of the page it points to.


Google Is Able To Understand Videos In a Number of Formats

Google is good at understanding video on a page; any format should be fine as long as Google can access the video. There is also schema markup which you can use to explicitly tell Google about the video content.
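The schema markup in question is schema.org VideoObject structured data, typically embedded as JSON-LD. A minimal sketch, where every name, date, and URL is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example product demo",
  "description": "A short demo of the product.",
  "thumbnailUrl": "https://www.example.com/thumb.jpg",
  "uploadDate": "2020-04-03",
  "contentUrl": "https://www.example.com/demo.mp4"
}
</script>
```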


Ensure Your Meta Descriptions Are Unique and Match The Page’s Intent

There are two main instances where Google will overwrite a meta description. Firstly, Google tries to identify when there is an irrelevant meta description on the page. Alternatively, the description may not match what the user is searching for at all, while the page content does match. John recommends making sure descriptions are unique, short, and match both the page content and the searcher’s intent.
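A meta description of that kind sits in the page head; this fragment is purely illustrative:

```html
<head>
  <title>Blue Widgets – Acme Store</title>
  <!-- Short, unique, and matching what the page actually offers -->
  <meta name="description" content="Compare Acme's range of blue widgets, with prices, sizes, and shipping options.">
</head>
```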


Ruth Everett

Technical SEO

Ruth Everett is a data & insights manager at Code First Girls, and a former technical SEO analyst at Lumar. You'll most often find her helping clients improve their technical SEO, writing about all things SEO, and watching videos of dogs.

