
How to work around Google spam filters - Part V


"Google tries to keep its search results as clean as possible. For that reason, they have a variety of spam filters in their ranking algorithm that try to remove low quality web sites.

If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In this

article series, we're taking a look at the 15 most common Google spam filters and how you can get around them.

Duplicate content, false use of robots.txt and Google bowling
The duplicate content filter is applied to web pages whose content has already been indexed elsewhere. This can happen if you have multiple versions of the same page on your web site or if you reuse content from other sites on your own pages.

If the same content is available on multiple pages, Google will pick only one of them for the search results, so it will be difficult to get high rankings for your own page. Having the same page more than once on your web site can also look like a spamming attempt.

False use of the robots.txt file is not exactly a Google spam filter but it basically has the same effect. While a robots.txt file can help you to direct search engine spiders to the right pages it can also lock out search engines from your web site if you use it incorrectly. Further information about the robots protocol can be found
here.

Google bowling means that competitors use spammy SEO techniques to push your web site out of the search results. They set up doorway pages with JavaScript redirects that point to your site, or they spam blogs and referrer logs with your URL.

Although it was your competitor who set up these spam pages that redirect to your web site, Google might conclude that you are responsible for the spamming attempts and downgrade your web site. Google claims that external factors cannot influence your rankings on Google. Nevertheless, some "black hat" SEOs offer services that claim to harm the rankings of your competitors.

How to get around these filters
If you have multiple versions of the same page on your web site (print version, online version, WAP-version, etc.) then make sure that search engines will index only one of them.

You can exclude special web pages from indexing by using a robots.txt file or the Meta Robots tag. IBP's
web site optimization editor allows you to quickly add Meta Robots tags to your web pages.
Double check the contents of your robots.txt file to make sure that you don't exclude search engines by mistake.
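As a sketch, a robots.txt file that keeps spiders away from print and WAP versions of your pages might look like this (the `/print/` and `/wap/` directories are placeholders for wherever your duplicate versions actually live):

```
# robots.txt - placed in the web site's root directory
# "User-agent: *" addresses all well-behaved spiders.
User-agent: *
Disallow: /print/
Disallow: /wap/
```

Alternatively, a single page can be excluded with the Meta Robots tag `<meta name="robots" content="noindex, follow">` in the page's head section: `noindex` keeps the page out of the index, while `follow` still lets spiders follow its links.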

If your web site has been hit by Google bowling, the only thing you can do is file a reinclusion request.

The best way to get high rankings on Google and other major search engines is to use white-hat SEO methods: optimize the content of your web pages and get high quality inbound links."

How to work around Google spam filters - Part IV

Facts of the week - Axandra newsletter
"Google tries to keep its search results as clean as possible. For that reason, they have a variety of spam filters in their ranking algorithm that try to remove low quality web sites.

If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In this
article series, we're taking a look at the 15 most common Google spam filters and how you can get around them.

Co-citation, too many pages at once and over-optimization
Google's co-citation filter analyzes the web pages that link to your site. It's actually not a filter but an algorithm that tries to put your web site in a themed context.

If the link to your web site is on a web page that links to web sites that deal with gardening equipment then Google thinks that your web site must also be related to gardening equipment. That means that your web site might be put in the wrong context if the other pages on the linking site are not related to yours.

Google's "too many pages at once" filter tries to find web sites with an unnatural growth pattern. If a web site adds too many pages too quickly, this filter will be applied. This usually only happens if a webmaster creates pages by scraping other people's content or by generating keyword-rich pages with cloaking software.

The over-optimization filter is applied to web sites that try to fool Google by stuffing special keywords in their web pages. If the keyword density is too high, Google will downrank the web page for that keyword.

How to get around these filters
To avoid problems with co-citation, make sure that the links to your web site are on related pages that don't link to every Tom, Dick and Harry.
Your links should be on theme related web pages.
Further information on the effect of co-citation on your search engine rankings can be found in
this article.

If you develop your own web pages without scraping other people's content and you don't use cloaking software, the "too many pages at once" filter shouldn't worry you at all; it's very unlikely that your web site will trigger it.

Don't over-optimize your web pages and don't stuff keywords into them. It's important that the keywords for which you want to get high rankings on Google are listed with the right density in the right elements of your web pages."
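As an illustration of how keyword density is measured, here is a minimal sketch in Python. The `keyword_density` helper and the sample sentence are our own invention, and Google has never published an official density threshold; the point is simply that one word dominating a page looks unnatural.

```python
import re

def keyword_density(text, keyword):
    """Share of the words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

# 3 of the 10 words below are "cheap" -> a density of 30%,
# which would look like keyword stuffing to a search engine.
page_text = "Cheap flights and cheap hotels: book cheap travel deals today."
print(f"{keyword_density(page_text, 'cheap'):.0%}")  # prints "30%"
```

In practice you would run such a check against the visible body text, the title tag and the headings separately, since each element has its own natural density.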

How to get around Google's spam filters - Part III

Google tries to keep its search results as clean as possible. For that reason, they have a variety of spam filters in their ranking algorithm that try to remove low quality web sites.

If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In this article series, we're taking a look at the 15 most common Google spam filters and we'll tell you how you can get around them.

Google's -30 filter, the Google bomb filter and the page load filter

Google seems to apply the -30 filter to web sites that use spammy SEO methods. If Google finds out that your web site uses invisible text, JavaScript redirects, doorway pages or similar spam techniques then your rankings will drop by 30 spots.

The Google bomb filter seems to be applied to web sites that get too many identical links in a short time period. If a web site gets many links with exactly the same link text then Google will downrank the page because such an unnatural linking behavior indicates a manipulation attempt.

The page load filter is not exactly a filter. Nevertheless, it can affect your Google rankings. If your web site takes too long to load then the search engine spiders will time out and continue with the next web site in the list. That means that your web site won't be indexed and that it won't appear in Google's result pages.

How to get around these filters

If the -30 filter has been applied to your web site then you must remove the spam elements from your web site. After removing the spam elements from your site, send a reinclusion request to Google.

It is very important that all spam elements have been removed from your site before contacting Google. Otherwise, the reinclusion request won't work. Use white-hat SEO methods to optimize your web pages.

If the Google bomb filter has been applied to your web site then you also have to file a reinclusion request. However, it is better to avoid that Google applies that filter to your site. Try to get high quality inbound links with similar but varying link texts. These links will tell search engines that your web site is relevant to a special topic.

To prevent a slow-loading or unreachable web page from keeping search engine spiders away, make sure that you have a reliable web host. Even a host that guarantees 99% uptime can leave your web site unreachable for about 87 hours, nearly 4 days, per year. If search engines repeatedly try to index your site while it is down, it can be removed from the index.
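The downtime figure follows from simple arithmetic: 1% of a year's 8,760 hours is 87.6 hours, about 3.65 days. A quick sketch of the calculation (the function name is ours):

```python
HOURS_PER_YEAR = 365 * 24  # 8760 hours

def downtime_hours_per_year(uptime_percent):
    """Hours per year a host may be unreachable at a given uptime guarantee."""
    return HOURS_PER_YEAR * (100.0 - uptime_percent) / 100.0

# A 99% guarantee still allows 87.6 hours (about 3.65 days) of downtime per year.
print(downtime_hours_per_year(99.0))
# A 99.9% guarantee cuts that to roughly 8.8 hours.
print(downtime_hours_per_year(99.9))
```

This is why an extra "nine" in an uptime guarantee matters far more than the small difference in the percentage suggests.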

Next week, we'll take a look at three more Google filters that can cause ranking problems for your web site.

15 Google spam filters and how to avoid them - Part II


Google tries to keep its search results as clean as possible. For that reason, they have a variety of spam filters in their ranking algorithm that try to remove low quality web sites.

If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In the next articles, we'll take a look at the 15 most common Google spam filters and we'll tell you how you can get around them.

Google's link farm filter, the broken link filter and the too many links filter

Google heavily relies on inbound links to determine the position of a web page in the search results. To make spamming as difficult as possible, Google also has a variety of link filters to make sure that only the right links are considered.

Participating in a link farm system won't increase your search engine rankings. Actually, you can hurt your rankings if you link to a link farming scheme.

The broken link filter is not actually a filter but the effects are the same. If your web pages contain broken links, or if some pages cannot be reached through links from the rest of your site, Google cannot index all of your pages.

Many broken links also indicate that your web site is not very professional and that it should not be listed in the top results.

The too many links at once filter is applied when your web site gains a large number of links within a short period. A sudden spike in inbound links can lead to problems with all big search engines, not just Google.

How to get around these filters

Don't use link farms to get links to your web site. Instead, build high quality links to your web site through legitimate means. Do not use black-hat techniques or link spamming; otherwise you risk getting caught by Google's too many links at once filter.

To make sure that the broken links filter is not applied to your web site, make sure that all links on your web site are intact and use a
sitemap so that search engines can find all pages on your web site quickly and easily.
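A minimal XML sitemap, in the format defined by the sitemaps.org protocol, could look like the sketch below. The URLs and the date are placeholders; list one `<url>` entry for every page you want spiders to find.

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page; <lastmod> is optional -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-02-12</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/articles/</loc>
  </url>
</urlset>
```

A plain HTML sitemap page that links to all pages of your site serves the same purpose for visitors and spiders alike.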

Next week, we're going to take a look at three more Google filters that might prevent your web site from getting high rankings on Google.

15 Google spam filters and how to avoid them - Part 1

from facts of the week: Axandra newsletter 12 February 2007

"Google tries to keep its search results as clean as possible. For that reason, they have a variety of spam filters in their ranking algorithm that try to remove low quality web sites.

If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In the next articles, we'll take a look at the 15 most common Google spam filters and we'll tell you how you can get around them.

Google's Sandbox, Google's Trust Rank and Google's domain age filter

These three Google filters all take the age of a web site into account. Many spam web sites don't stay online very long. For that reason, Google implemented a filter that prevents new web sites from getting high rankings for competitive search terms. That filter is called Google's sandbox.

Google's TrustRank filter is closely related to that filter. Web sites with a high TrustRank get high rankings on Google. The TrustRank of a web site is determined by the age of a web site, the quality of inbound links and the contents of a web site.

The domain age filter is another filter that considers the age of your web site. Web sites with old domain names are more likely to get high rankings for competitive keywords on Google.
Further information about the sandbox, Google's TrustRank and the domain age filter can be found in earlier articles.

How to get around these filters

It's not easy to get around these filters. As they all consider the age of your web site, you basically have to wait some time until Google releases your web site from the sandbox.
The best thing that you can do is to work on the content of your web site to show Google's web page spider that your web site is a valuable resource for your topic. Make sure that your web pages are relevant to your search terms.

In addition, get good inbound links to increase the TrustRank of your web site. The better the links to your web site, the higher your TrustRank and the higher your web site will rank on Google.

Next week, we're going to take a look at three link filters that Google uses to remove web sites from the search results."