
10 SEO Tools To Help Websites Recover From Google Algorithm Updates


Since the latest Google algorithm update (Penguin 2.0) was launched on 22nd May, a growing number of affected website owners have been desperately searching for ways to clean up their sites.

If your website has been impacted by a Google algorithm update, check out these 10 SEO tools, which can help restore your website’s credibility and monitor it daily to prevent future penalties.

UNDERSTANDING LINK PROFILES

One of the best ways to recover from a Google algorithm update is to closely monitor your website’s link profile. There are several clever platforms available which generate in-depth reports, such as backlink reports, anchor text reports and inbound link analysis, giving owners a full picture of their website’s link profile. Using these platforms regularly ensures that problem areas are identified quickly, so owners can solve issues before an algorithm update hits.

Screaming Frog

Screaming Frog has to be one of the best tools for identifying on-page structural duplications. After a swift audit, the tool will fetch crucial on-page SEO elements and identify anything that looks a little suspicious. It crawls the site and sorts SEO issues into categories including titles, meta descriptions, header tags and page duplications. These issues can be exported to Microsoft Excel, making it easy to manage any changes that need to be made to the site.

There are two packages available. The standard free service offers a limited number of crawled URLs and does not include the full set of configuration options.

The paid service costs £99 per year and allows users to take full advantage of the tool’s configuration options and custom source code feature.

Majestic SEO

Majestic SEO describes itself as being ‘by far the largest index of its kind publicly available.’ The platform provides fresh data, updating its index several times a day.

Majestic’s SEO tools on offer are:

  • Backlink History Checker
  • Site Explorer
  • Link Map Reports
  • Link Intelligence API

A full list of tools is available on the Majestic SEO website.

Ahrefs

Ahrefs was first launched in 2011 and since then the service has grown hugely, crawling up to 6 billion pages a day! Ahrefs has the fastest index of live backlinks, updating every 15 minutes.

The platform has several tools including:

  • Site Explorer & Backlink Checker
  • SEO Reports
  • Mentions Tracker
  • Backlink Reports
  • Batch Analysis

Learn more at Ahrefs.com.

SEOMoz Campaign Tool

SEOMoz is probably the largest SEO community around, providing a subscription service which enables website owners to monitor their site closely on a daily basis. The SEOMoz Campaign Tool is an excellent resource which allows users to identify crawl errors, track ranking changes, view organic traffic data and export on-page optimisation reports.

FINDING DUPLICATE CONTENT

Google is really cracking down on duplicate content, especially content that has been stolen outright. Google can see which sites are passing content off as their own and will punish them accordingly. Luckily, there are several tools available that can identify duplicate content across the web.

Copyscape

This tool identifies where a site’s content is duplicated elsewhere on the web, allowing owners to click through to the exact place where the content is duplicated.

There is a free basic service; however, for users looking for more advanced features, there are two paid options available. Copyscape Premium allows users to submit content to the site to check its originality before making it live. Copysentry constantly monitors the owner’s content, flagging up any websites that are passing it off as their own.

Copyscape is unique in that it not only identifies duplicate content but also allows users to check their own content for originality before publishing it online.

To resolve duplicate content properly, the owner can either rewrite the affected content or contact the offending site’s owner and ask them to remove or replace it.

Siteliner

Copyscape has launched a new free service called Siteliner. Simply by submitting a URL, Siteliner will generate a report which detects duplicate content, broken links and much more within the site’s structure.

Siteliner can crawl up to 500 prominent pages of a website in a short space of time, requesting four pages at a time. However, owners can only submit their site for inspection once every 30 days.

One unique feature Siteliner offers is its ability to report the ‘Page Power’ of each page. Page Power measures how prominently a single page is linked from other prominent pages within the site’s structure, with a maximum value of 100.

PlagSpotter

Yet another duplicate content finder, PlagSpotter provides users with a comprehensive list of URLs containing content that matches their own. The tool provides an in-depth analysis via a report page which shows users how much of the content on each page is duplicated. Users can check their whole website for duplication and have it monitored on a daily, weekly or monthly basis, ensuring that any duplication is flagged up instantly.

PlagSpotter offers a free unlimited service for individual URL checks as well as three paid subscription services for continuous scanning and monitoring of web content for duplication.

Plagium

Plagium is a plagiarism checking tool which flags up sources of plagiarism and traces content back to its origin. The tool continues to find innovative new ways to help website owners monitor the originality of their content. Plagium is completely free to use and is now available in several languages.

GOOGLE’S POWERFUL TOOLS

Google Webmaster Tools (GWT) is the most valuable tool to have under your belt. Aside from being completely free, the resource can troubleshoot potential problems, help diagnose them and make websites more Google-friendly. Here are just a few teasers of what it has to offer:

Spam Warnings – GWT will email the website owner should a major issue arise with their site, and Google may give an indication of how it will deal with the situation.

Blocked URLs – Sometimes areas of a site will be blocked when they shouldn’t be. GWT generates reports allowing users to see if any URLs are blocked by robots.txt.
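As an illustration, a single overly broad rule in a robots.txt file can block pages that should stay indexed. The paths below are hypothetical:

    User-agent: *
    # Intentional: keep the admin area out of the index
    Disallow: /admin/
    # Too broad: this also blocks /products/ and /pricing/
    Disallow: /p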

Fetch as Google – This feature allows the user to verify whether a page is accessible by retrieving it exactly as Google would. It is particularly beneficial for large sites, where errors can occur without users knowing about them.

Index Status – GWT can create a report which shows a website owner how many URLs are indexed out of all those Google can find on the website. Inconsistencies in the report can reveal issues such as duplicate content or misconfigured canonical URLs.

Malware – Checking the site’s malware report regularly is crucial. If a visitor acts suspiciously, for example by inserting code into a comment page or blog, Google may flag the site as hosting malware, a potential threat to your site. When that happens, users visiting your site may see a message warning them that the website is not safe.

Links to your site – GWT can export all of the links pointing to a site. Users are most likely to use this report to identify whether or not they need to disavow any links pointing to them. For constant evaluation, this is the tool to use.

Internal Links – Internal links are extremely valuable, and if internal linking is done properly it can really improve a website’s rankings. The more internal links a page has pointing to it, the greater the authority it will receive from Google. As with most strategies, though, internal linking should not be overdone; embed internal links within reason to avoid overuse.
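For example, a descriptive internal link is just a standard anchor pointing at another page on the same site; the URL and anchor text here are made up:

    <p>Before disavowing anything, run a full
    <a href="/blog/link-audit-guide">link audit</a> on your site.</p>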

Sitemaps – Check for sitemap errors – very high priority!
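For reference, a minimal sitemap follows the sitemaps.org XML protocol; the URL and date below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2013-05-22</lastmod>
      </url>
    </urlset>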

HTML Improvements – This is a nice little tool which allows owners to identify duplicate content on their site through the eyes of Google. Owners will be able to see a list of pages which have identical title tags, meta descriptions and so on, which can also help spot pages that should share a canonical URL.
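Where duplicate pages are unavoidable, a canonical tag in the page’s head tells Google which version to index; the URL below is a placeholder:

    <link rel="canonical" href="http://www.example.com/preferred-page/" />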

Disavow Tool

Google has made it very clear that dumping a load of URLs into the disavow ‘sin bin’ isn’t going to clean up a website’s spammy link profile.

In a video on the subject, Matt Cutts insists that before using the disavow tool, a website owner must complete several manual link removal requests. The disavow tool should only be used when just a small fraction of links remains to be removed.

Matt Cutts recommends completing a link audit using GWT’s ‘Links to your site’ tool. Owners will then be able to document which links may be damaging their site.

Next, it is a case of sending personalised emails to each site owner asking them to remove the link pointing to your site. Be warned, though: some webmasters are now charging for link removal. Never pay for this; simply leave those sites for the disavow tool.
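For reference, a disavow file is a plain text file with one URL or domain per line, where lines beginning with # are comments. The domains below are made-up examples:

    # Removal requests sent on 1st June; no response received
    domain:spammy-directory.example.com
    http://article-farm.example.net/paid-link-page.html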

To use the disavow tool, owners need to sign into GWT and upload a file listing all the remaining links that couldn’t be removed manually.

So there you have it: 10 incredible SEO tools to help websites recover from those pesky Panda and Penguin updates. We are a team of Bristol SEO specialists dedicated to closely monitoring and implementing innovative digital marketing campaigns for all our clients. If you have any questions, fire them over to [email protected] or call us on 0117 923 2021.
