How to get around Google's spam filters - Part III

Google tries to keep its search results as clean as possible. For that reason, it has built a variety of spam filters into its ranking algorithm that try to remove low-quality web sites.

If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In this article series, we're taking a look at the 15 most common Google spam filters, and we'll tell you how you can get around them.

Google's -30 filter, the Google bomb filter and the page load filter

Google seems to apply the -30 filter to web sites that use spammy SEO methods. If Google finds out that your web site uses invisible text, JavaScript redirects, doorway pages or similar spam techniques, your rankings will drop by 30 positions.
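
If you're not sure whether your pages still contain such elements, a short script can help you find them. Here is a minimal sketch in Python that flags a few common spam signatures; the regex patterns are rough heuristics of our own, not Google's actual detection rules, and "page.html" is a placeholder for your own file:

    # Heuristic scan for spam elements of the kind the -30 filter targets.
    # The patterns are illustrative only -- they are not Google's actual
    # rules, and "page.html" stands in for your own file.
    import re

    SPAM_PATTERNS = {
        "meta refresh redirect": r'<meta[^>]+http-equiv=["\']?refresh',
        "JavaScript redirect": r'(window\.)?location(\.href)?\s*=',
        "invisible text": r'visibility\s*:\s*hidden|display\s*:\s*none|font-size\s*:\s*0',
    }

    with open("page.html", encoding="utf-8") as f:
        html = f.read()

    for name, pattern in SPAM_PATTERNS.items():
        for match in re.finditer(pattern, html, re.IGNORECASE):
            # Report the line number so the element is easy to find and remove.
            line_no = html.count("\n", 0, match.start()) + 1
            print(f"line {line_no}: possible {name}")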

The Google bomb filter seems to be applied to web sites that get too many identical links in a short time period. If a web site gets many links with exactly the same link text, Google will downrank the page, because such unnatural linking behavior indicates a manipulation attempt.

The page load filter is not exactly a filter. Nevertheless, it can affect your Google rankings. If your web site takes too long to load, the search engine spiders will time out and move on to the next web site in their list. That means that your web site won't be indexed and won't appear in Google's result pages.
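
You can get a rough feel for this yourself by timing a page load with a cut-off. The sketch below is a Python illustration; the ten-second timeout and the URL are our own assumptions, since Google doesn't publish its spiders' limits:

    # Time one page load, giving up after a crawler-like timeout.
    # The 10-second limit and the URL are illustrative assumptions,
    # not values published by any search engine.
    import socket
    import time
    import urllib.error
    import urllib.request

    URL = "http://www.example.com/"  # replace with your own page

    start = time.monotonic()
    try:
        with urllib.request.urlopen(URL, timeout=10) as response:
            response.read()
        print(f"loaded in {time.monotonic() - start:.2f} seconds")
    except (socket.timeout, urllib.error.URLError) as exc:
        print(f"failed or timed out: {exc}")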

How to get around these filters

If the -30 filter has been applied to your web site, you must remove the spam elements from your pages. Once they are gone, send a reinclusion request to Google.

It is very important that all spam elements have been removed from your site before contacting Google. Otherwise, the reinclusion request won't work. Use white-hat SEO methods to optimize your web pages.

If the Google bomb filter has been applied to your web site, you also have to file a reinclusion request. It is better, however, to keep Google from applying that filter in the first place. Try to get high-quality inbound links with similar but varying link texts. These links tell search engines that your web site is relevant to a specific topic.
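
A quick way to keep an eye on this is to count how often each anchor text appears in your backlinks. The Python sketch below uses made-up sample data and an arbitrary 50% threshold; in practice you would feed in the anchor texts from your own backlink reports:

    # Spot over-concentrated anchor text in a list of backlink anchors.
    # The sample data and the 50% threshold are made up for illustration.
    from collections import Counter

    anchor_texts = [
        "cheap widgets", "cheap widgets", "cheap widgets",
        "buy widgets online", "widget shop", "cheap widgets",
    ]

    counts = Counter(anchor_texts)
    total = len(anchor_texts)
    for text, n in counts.most_common():
        share = n / total
        flag = "  <-- suspiciously dominant" if share > 0.5 else ""
        print(f"{text!r}: {n}/{total} ({share:.0%}){flag}")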

If you want to keep slow page loads from preventing search engine spiders from indexing your site, make sure that you have a reliable web host. Even a host that promises 99% uptime can leave your web site down for up to 3.65 days per year (1% of 365 days). If search engines repeatedly try to crawl your site while it is down, your pages may be dropped from the index.
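
The arithmetic behind that figure is easy to check, and it's worth comparing a few common uptime guarantees. This small Python sketch converts uptime percentages into allowed downtime per year; the 99.9% and 99.99% rows are added for comparison, not taken from any particular host's offer:

    # Convert an uptime guarantee into allowed downtime per year.
    # At 99% uptime, 1% of the year (about 3.65 days) may be downtime.
    HOURS_PER_YEAR = 365 * 24

    for uptime in (0.99, 0.999, 0.9999):
        downtime_hours = (1 - uptime) * HOURS_PER_YEAR
        print(f"{uptime:.2%} uptime -> {downtime_hours:6.1f} hours "
              f"({downtime_hours / 24:.2f} days) down per year")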

Next week, we'll take a look at three more Google filters that can cause ranking problems for your web site.
