
Web Spam

Web spam is a tactic used by webmasters to manipulate search engines in order to perform better within search results. It may take the form of spam links, hacked content, or malware from third-party tags, among many other unnatural methods. Our SEO Office Hours notes below cover how Google handles what it deems spam, along with advice for avoiding it.

Learn more about SEO best practices for website content in Lumar’s Website Intelligence Academy

Spammy backlinks to 404 pages are ignored by default

When asked how to deal with thousands of spammy backlinks, John was keen to reassure users that low-quality sites linking to 404 pages won’t impact your site negatively. Because the target is a 404, Google essentially treats the link as pointing to nothing and ignores it (for the same reason, it’s worth reviewing and redirecting links from valuable sources that currently point to a 404 page). If only a handful of sites are providing spammy backlinks to non-404 pages, the recommendation is to set up a domain-level disavow.
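A domain-level disavow is submitted as a plain-text file through Google Search Console’s disavow links tool. As a rough sketch (the domains and URL below are hypothetical placeholders), the file looks like this:

```text
# Lines beginning with # are comments and are ignored by Google.
# "domain:" entries disavow every link from that domain.
domain:spammy-links-example.com
domain:another-spam-example.net
# Individual URLs can also be listed:
http://spam-example.org/bad-page.html
```

Uploading a new file replaces the previously submitted list for that property, so the file should always contain the complete set of links you want disavowed.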

13 May 2022

Google Wants to Automatically Ignore Unnatural Content

Instead of applying manual penalties for unnatural content, Google wants to develop automatic systems that ignore anything unnatural, as it already does for unnatural linking, so the content won’t harm you and you won’t have to take any action. In any situation where a penalty would be applied, however, a reviewer would probably take the time to look at the site.

20 Mar 2020

Reconsideration Requests Can Take a Month to Process

It can take Google up to a month to respond to reconsideration requests, particularly for link-related issues. Google doesn’t send warnings first because it wants to take immediate action when it finds problematic content.

20 Mar 2020

Reconsideration Requests Are Reviewed in Batches & Grouped by Issue Type & Country

The team reviewing reconsideration requests works through them in batches, and may group requests by issue type, country, and other factors. Once they have finished one batch, they move on to the next, and so on.

7 Jan 2020

Submitting Another Reconsideration Request Won’t Affect Site’s Existing Place in Queue

Submitting a second reconsideration request for a website while the original is still waiting to be reviewed won’t move your site up in the queue, nor will it move your site to the bottom of the queue. The best course of action is to wait for the original request to be reviewed, which can take time as queues often form.

12 Nov 2019

Use Testing Tools to Identify if Your Site Has Hacked Content That is Being Cloaked

If you suspect your site has hacked content that is being cloaked so that search engines can’t see it, John recommends using testing tools, including the URL Inspection tool in GSC, to identify whether Google is finding this content.

18 Oct 2019

Auto-generated Content is Against Webmaster Guidelines

Using auto-generated content, such as spun content, to create text-based pages is against Google’s webmaster guidelines. This is particularly true if the content provides no value for users or is similar to content already available elsewhere on the web.

6 Sep 2019

The Request Review Option in GSC Is The Best Way to Inform Google That Content is Legitimate

If Google Search Console is flagging that content appears to be hacked, the request review approach is the best way to inform Google that the content is legitimate.

12 Jul 2019

Rich Snippets Spam Report Tells Google About Manipulated Structured Data

Use the rich snippets spam report form to inform Google about instances of manipulated structured data.

31 May 2019

Web Spam Team Can Issue Targeted Manual Actions Against Pages With Unnatural Linking

The Web Spam team can take targeted manual action against websites with unnatural linking by choosing to disregard links to individual or small groups of pages.

31 May 2019
