Web Spam
Web spam is a tactic used by webmasters to manipulate search engines in order to rank better in search results. This may take the form of spam links, hacked content, or malware from third-party tags, among many other unnatural methods. Our SEO Office Hours notes below cover how Google handles what it deems spam, along with advice for avoiding it.
Learn more about SEO best practices for website content in Lumar’s Website Intelligence Academy.
Spammy backlinks to 404 pages are ignored by default
When asked how to deal with thousands of spammy backlinks, John was keen to reassure users that low-quality sites linking to 404 pages won't impact your site negatively. Because the target is a 404, Google essentially reads the link as not connecting to anything and ignores it (for the same reason, it's important to review links from valuable sources that point to 404 pages and redirect them to live URLs). If it's just a handful of sites providing spammy backlinks to non-404 pages, the recommendation is to set up a domain-level disavow.
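For reference, a disavow file is a plain-text file submitted through Google's Disavow Links tool, with one domain or URL per line. A minimal sketch, using made-up domain names:

```
# Example disavow file (all domains and URLs here are hypothetical).
# Lines starting with "#" are comments and are ignored by Google.

# Domain-level disavow: ignore every link from these domains.
domain:spammy-example-one.com
domain:spammy-example-two.net

# Individual URLs can also be disavowed one at a time.
http://low-quality-directory-example.com/spammy-page.html
```

A domain-level entry (`domain:`) covers all links from that domain and its subdomains, which is usually more practical than listing individual URLs when an entire site is spammy.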
Google Wants to Automatically Ignore Unnatural Content
Instead of applying manual penalties for unnatural content, Google wants to develop automatic solutions that ignore anything unnatural, as they already do for unnatural linking, so that it won't harm you and you won't have to take any action. However, in any situation where a penalty would be applied, a reviewer would probably take the time to look at the site first.
Reconsideration Requests Can Take a Month to Process
It can take Google up to a month to respond to reconsideration requests, particularly for link-related issues. Google doesn't send warnings first because they want to take immediate action when they find problematic content.
Reconsideration Requests Are Reviewed in Batches & Grouped by Issue Type & Country
The team reviewing reconsideration requests works through them in batches, and may group requests by issue type, country, and other factors. Once they have finished one batch, they move on to the next, and so on.
Submitting Another Reconsideration Request Won’t Affect Site’s Existing Place in Queue
Submitting a second reconsideration request for a website while the original is still waiting to be reviewed won’t move your site up in the queue, nor will it move your site to the bottom of the queue. The best course of action is to wait for the original request to be reviewed, which can take time as queues often form.
Use Testing Tools to Identify if Your Site Has Hacked Content That is Being Cloaked
If you suspect your site contains hacked content that is being cloaked (served to search engines but hidden from normal visitors), John recommends using testing tools, including the URL Inspection tool in Google Search Console, to identify whether Google is finding this content.
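As a rough supplementary check, you can fetch the same URL with a browser user agent and with Googlebot's user agent and compare the responses. This is only a sketch: sophisticated cloaking keys off Googlebot's IP range or reverse DNS rather than the user-agent string, so the URL Inspection tool remains the authoritative view of what Google sees. The URL below is a placeholder.

```python
# Crude check for user-agent-based cloaking: fetch one URL twice,
# once as a browser and once as Googlebot, then diff the responses.
import difflib
import urllib.request

URL = "https://www.example.com/suspect-page"  # placeholder URL

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"),
}

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL with the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

pages = {name: fetch(URL, ua) for name, ua in USER_AGENTS.items()}

# Print the first differing lines; large diffs suggest the server is
# serving different content to Googlebot than to regular visitors.
diff = difflib.unified_diff(
    pages["browser"].splitlines(),
    pages["googlebot"].splitlines(),
    fromfile="browser",
    tofile="googlebot",
    lineterm="",
)
for i, line in enumerate(diff):
    if i >= 40:  # keep the output readable
        break
    print(line)
```

If the two fetches differ substantially, and especially if the Googlebot version contains links or text you don't recognize, that is a strong signal of injected, cloaked content worth investigating further in GSC.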
Auto-generated Content is Against Webmaster Guidelines
Using auto-generated content, such as spun content, to create text-based pages is against Google's Webmaster Guidelines. This is particularly true if the content has no value for users or is similar to content already available elsewhere on the web.
The Request Review Option in GSC Is The Best Way to Inform Google That Content is Legitimate
If Google Search Console is flagging content as apparently hacked, using the Request Review option is the best way to inform Google that the content is legitimate.
Rich Snippets Spam Report Tells Google About Manipulated Structured Data
Use the rich snippets spam report form to inform Google about instances of manipulated structured data.
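To illustrate what "manipulated structured data" can look like, here is a hypothetical JSON-LD snippet claiming a review rating that does not exist anywhere on the visible page; the product name and figures are invented for the example.

```html
<!-- Hypothetical example of manipulated structured data: rating markup
     with no corresponding reviews on the page itself. This mismatch
     between markup and visible content is what the rich snippets spam
     report form is intended to cover. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "5.0",
    "reviewCount": "9999"
  }
}
</script>
```

Structured data is only legitimate when it reflects content users can actually see on the page; markup invented purely to win rich results is what the report form targets.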
Web Spam Team Can Issue Targeted Manual Actions Against Pages With Unnatural Linking
The Web Spam team can take targeted manual action against websites with unnatural linking by choosing to disregard links to individual pages or small groups of pages.