No one wants to have their website banned from search results, or even just knocked down in the rankings, and lose precious traffic. But Google is king, and if you do not play by their rules you will get the wrong end of the stick.
In this article I will describe some of the common practices that get penalized by Google, and ways to recover from them. If you would like a more personal review of your site, I am happy to provide individual feedback, so contact me here.
Before we go on to the most common penalties and ways to fix them, you first need to understand the system Google uses. There are two kinds of penalty your site can receive: the algorithmic penalty and the manual penalty.
The algorithmic penalty is by far the most common, and it is usually the one connected to a drop in rankings. It is applied by Googlebot's algorithms and is completely automated. It comes in the form of updates named after various animals, such as Panda and Penguin, which Google launches to improve its spam detection and keep us on our toes.
The manual penalty is exactly what the term implies: a manual review of your site by Google staff, which can result in a lower ranking or a complete removal from Google's index.
Now that you know the basics of the penalties Google uses, let's get on with some of the most common reasons sites get penalized in the first place.
Unnatural Links

Unnatural links are the main reason people lose rankings, or worse. In Google's eyes, having bad or paid-for links is the worst thing you can possibly do, because you have not "earned" them. Trying to manipulate your rankings by paying for links is something you should never do, and if you ever meet an SEO professional who tells you otherwise, do not get tricked into letting them use these "black hat" techniques on your site, because you WILL get penalized. Your links should not only appear natural; they should be natural. Google's head of spam has said numerous times that the idea behind link building should be hard work combined with incredible content that gets people to link to your site naturally.
Recovering from unnatural links is quite straightforward.
First, know which links are unnatural. Once you have a list of all the links pointing to your site, download it as a CSV file and start going through the sites that could potentially hurt your rankings. A good tip here: be wary of sites whose domain names contain some variation of "easyseo", "fastseo" or "paidseo". I say this because there is no such thing as easy or fast SEO.
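A first pass over that CSV can be automated. Here is a minimal sketch that flags any backlink whose URL contains one of those telltale substrings; the token list and the assumption that the URL sits in the first CSV column are mine, so adjust both to match your actual export.

```python
import csv

# Hypothetical substrings that often show up in paid-link / link-farm domains
SUSPICIOUS_TOKENS = ("easyseo", "fastseo", "paidseo")

def flag_suspicious_links(csv_path):
    """Return the links in a backlink CSV whose URL contains a suspicious token."""
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue  # skip blank lines
            url = row[0].strip().lower()
            if any(token in url for token in SUSPICIOUS_TOKENS):
                flagged.append(row[0].strip())
    return flagged
```

This only narrows the list; every flagged link still deserves a manual look before you act on it.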
After you have identified the links that are harmful to your site, ask the owner of each linking site to remove the link pointing to yours. If that gets no results, you can create a disavow file and submit it to Google.
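For reference, a disavow file is a plain text file where each line is either a `domain:` entry (disavowing every link from that site), a single URL, or a `#` comment. The domains below are placeholders:

```text
# Contacted owner of spamdomain.example twice, no response
domain:spamdomain.example

# Disavow a single bad page rather than the whole site
http://anotherspammer.example/paid-links-page.html
```

Only disavow links you are confident are harmful; it is a blunt instrument.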
Thin or Low Quality Content
Something else that can cost you rankings is content. Content is king, and a successful content strategy for your site can mean the difference between a first-page position and a nowhere-to-be-found position.
Recovering from low quality or thin content takes hard work and constant updating.
When I speak about thin content I don't just mean three lines per page. That, of course, is a big factor, but I also refer to content that is unoriginal. You can use tools such as Copyscape to determine whether the content on your page is a duplicate of anything else out there.
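Copyscape checks your text against the wider web. For a rough local comparison between two of your own pages, a sketch like this works; it uses Python's standard-library `difflib`, and the 0.8 threshold is an arbitrary starting point, not an official cutoff.

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Rough 0-to-1 similarity ratio between two pieces of page text."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

def looks_duplicated(text_a, text_b, threshold=0.8):
    """Flag page bodies that are near-identical copies of each other."""
    return similarity(text_a, text_b) >= threshold
```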
Once you have determined whether your content is unique, the constant updating begins. For this I recommend a creative mind, and at least 2-3 hours to spare every couple of days.
Keep track of your rankings and progress with each content update. This way you can determine which updates suit your business and which give the most benefit to your rankings.
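Even a simple dated log makes this tracking manageable. This sketch appends one snapshot per update to a CSV you can later graph or eyeball; the file layout and function names are my own suggestion, not any particular tool's format.

```python
import csv
from datetime import date

def log_ranking(csv_path, page, keyword, position):
    """Append one dated ranking snapshot so updates can be compared over time."""
    with open(csv_path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), page, keyword, position])

def read_log(csv_path):
    """Return all logged snapshots as a list of rows."""
    with open(csv_path, newline="") as f:
        return list(csv.reader(f))
```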
Multiple Pages With Same “Title”
I keep seeing this one. People think that if they create several pages with slightly altered content and the same title, they will have a better chance of ranking for those keywords. You won't; you will have a better chance of getting a lower ranking, though.
Never make multiple pages for the same content/keywords you want to rank for.
Much like with links, fixing this is straightforward.
Identify the pages that are internally duplicated. You can use Webmaster Tools for this, under "HTML Improvements", or you can again use Copyscape.
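If you already have a crawl of your own site, spotting repeated titles is a one-liner away. This sketch assumes you have collected a mapping of URL to `<title>` text yourself (the input format is my assumption) and returns every title used on two or more pages.

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages maps URL -> <title> text; returns titles shared by two or more URLs."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        # Normalize so "Buy Widgets" and "buy widgets " count as the same title
        by_title[title.strip().lower()].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}
```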
You can try several fixes for this problem. The easiest is to add a "noindex" tag to the page with the duplicated content; the next time Googlebot crawls your site it will see the noindex and drop that page from the index, allowing the original page to rank higher. Alternatively, you can add a canonical URL pointing to the original page from any duplicated page on your site. (Note that a "nofollow" tag is not a substitute here: it only tells crawlers not to follow the links on a page, and it will not remove the page from the index.)
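Both fixes live in the `<head>` of the duplicated page. The URLs below are placeholders for your own pages:

```html
<!-- Option 1: keep the duplicate page out of the index -->
<meta name="robots" content="noindex">

<!-- Option 2: point search engines at the original version -->
<link rel="canonical" href="http://example.com/original-page/">
```

Pick one approach per page; use noindex when the duplicate has no value of its own, and a canonical when you want link signals consolidated onto the original.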
Submit all the URLs you have changed to Webmaster Tools using the Fetch as Google tool, located under the Crawl tab.