Web Spam
Web spam is a tactic used by webmasters to manipulate search engines in order to perform better in search results. This may involve spam links, hacked content, or malware from third-party tags, among many other unnatural methods. Our SEO Office Hours notes below cover how Google handles what it deems spam, along with advice for avoiding it.
Learn more about SEO best practices for website content in Lumar’s Website Intelligence Academy.
Report Sites Scraping Your Content to Google on Page-By-Page Basis
If your site has been scraped you can submit a DMCA takedown to the website’s hosting service, and to Google, whose legal team can investigate. This must be done at the page level and cannot be done at a site level.
‘Doorway’ Pages May Result in a Manual Penalty
A large number of thin pages, with boilerplate content and nothing unique except for a few changed keywords, may be considered doorway pages which could result in a manual penalty from the spam team.
Report Incorrect Structured Data Usage to Google
The web spam report tool should be used to report cases where structured data is being used improperly. John reiterated that structured data has no direct impact on ranking positions, only on the way Google displays information in search.
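For reference, structured data is typically embedded as JSON-LD in a `<script type="application/ld+json">` tag. The sketch below builds a minimal, hypothetical Article example (all values are placeholders); improper usage would mean marking up information that doesn’t actually appear on the page:

```python
import json

# Hypothetical example values; structured data should only describe
# content that genuinely appears on the page it is attached to.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example Article Title",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-01-01",
}

# Serialise to the JSON-LD string that would be embedded in the page.
json_ld = json.dumps(article, indent=2)
print(json_ld)
```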
Manual Action Penalties can be Applied to Thin, Spun or Aggregated Content
Thin content penalties can be applied to sites manually by the web spam team where the entire site seems to be thin, ‘spun’, or aggregated from other sources without any unique additional value.
Unlinked Landing Pages are Not Spam
Landing pages which aren’t linked from the main site are not spam, but they may take longer to be crawled, and without links Google will have trouble understanding their context, so John recommends linking them internally.
Check Hosted Executables for Malware
If you are hosting executable files on your website, you need to be careful to ensure they are not harmful and don’t include malware, using tools such as www.virustotal.com.
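One quick way to check a hosted file is to compute its SHA-256 hash and search for that digest on VirusTotal, which returns existing scan reports for known files. A minimal sketch (the file path is a placeholder):

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large executables don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Paste the resulting digest into VirusTotal's search box (or its API)
# to look up scan results for the file you are hosting, e.g.:
# print(sha256_of_file("downloads/installer.exe"))
```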
Descriptions Rewritten by Google Don’t Affect Rankings
If Google is replacing your description tags, it won’t affect your rankings, as descriptions are not used for ranking, but it does suggest Google doesn’t think your descriptions are relevant for the page. The reasons may be that they contain spammy keywords, are too vague, are too promotional, or don’t match the content on the page.
Google Recognises Separate Sites on Subdomains
Google has a process to recognise when separate sites are hosted on subdomains, and when the subdomains are part of a single website. It will then apply things like penalties and malware detection per site, but these can affect the entire domain if Google hasn’t recognised the subdomains as different sites.
Google Uses Spam Reports to Develop Their Algorithms
Google doesn’t respond to individual spam reports, but will use them to assess new algorithms which can be applied to all websites.
Pages May Rank Despite Spam Signals
Google might ignore some spammy elements of a page but continue to rank it well based on other signals. Don’t assume a spammy technique is always helping.