
Thursday 23 June 2011

My website got banned by Google [Part 1]

Hello! I am writing a post after a long time, for several reasons; one of them is that I was busy migrating my server to another company because of poor support. In the meantime, one of my friends, Muhammad Faisal (Karachi), contacted me and told me something that worried me.


He told me that his website was not getting any search engine traffic and had lost about 75% of its traffic. This was very surprising to me, as he had worked hard to get his website indexed in the search engines and had earned a very good position in the SERPs (his website was in the top 3 results). I checked his website in Google using a site: search on his URL and found nothing in the results; after that I tried searching for some phrases related to his website, but the result was the same.


I thought that this might be a temporary ban from Google and that the website would soon be indexed again, but that hope turned out to be nothing but a nightmare. I then started digging into the reasons for this and found a lot of things that can cause the problem. If you want to save your website from getting banned by the search engines, review your website against the points below.


1. Duplicate Content or Websites


I don't know why people copy someone else's data to get their website indexed. Some people literally wake up in the morning, create a website, and then start copying data from other websites. This is the main reason for getting banned by Google, as Google has powerful algorithms to catch this kind of fraud. Google can easily distinguish between original content and a copied one.


If Google finds that multiple web pages have the same content, it may penalize each website for it. Of course, someone may have copied your content, and Google may have banned you even though it was your original content that was taken. Make sure no other site is using your content. You can do this by performing a Google search using some of your text with quotation marks (") around it. If you do find someone using your original copy, visit this page to learn more about copyright infringement: http://www.google.com/dmca.html.
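

To get an intuition for how such duplicate detection can work, here is a minimal Python sketch. It is my own illustration, not Google's actual algorithm: it compares two pieces of text by the overlap of their word shingles, and a high similarity score hints that one page copies the other.


------------------------------

# Minimal duplicate-content check: compare two texts by word-shingle overlap.
# Illustration only -- not Google's actual algorithm.

def shingles(text, n=3):
    """Return the set of n-word shingles in the text."""
    words = [w.strip(".,!?;:\"'").lower() for w in text.split()]
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Google can easily distinguish between original content and a copied one."
copied = "Google can easily distinguish between original content and a copied one, experts say."

print(jaccard_similarity(original, copied))  # high score for near-duplicates

------------------------------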


2. Robots and Meta Tags


Yes, robots and meta tags can also keep your website from being indexed in search engines, because robots.txt can be treated as an oxygen pipe for your website. So it is better to check your robots.txt file, if you have one. Also check whether text like the following appears in your pages:


<meta name="ROBOTS" content="NOINDEX">. If you find something similar to this, then your website is blocked from being indexed by search engines.
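

If you prefer to check for this tag programmatically, here is a minimal Python sketch using only the standard library. The URL is a placeholder and the regex is deliberately simple, so treat it as an illustration rather than a robust HTML parser.


------------------------------

import re
import urllib.request

# Fetch a page and warn if it carries a robots meta tag with NOINDEX.
# Replace the placeholder URL with your own page; illustrative sketch only.
url = "http://example.com/"
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

# Case-insensitive search for <meta name="robots" ... content="...noindex...">
pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
if re.search(pattern, html, re.IGNORECASE):
    print("Warning: this page tells search engines not to index it.")
else:
    print("No NOINDEX meta tag found.")

------------------------------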


You can deny or allow access to search engine bots via robots.txt. Here is an example of each.


The rules given below will allow all robots to crawl your website:


------------------------------

User-agent: *
Disallow:

------------------------------


This example keeps all robots out:


------------------------------

User-agent: *
Disallow: /

------------------------------
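

You can also test a live robots.txt from Python's standard library. The sketch below is an illustration with placeholder URLs; it asks whether a given crawler is allowed to fetch a page under the rules your site publishes.


------------------------------

from urllib import robotparser

# Ask robots.txt whether a crawler may fetch a page.
# Replace the placeholder URLs with your own site; illustrative sketch only.
rp = robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()

page = "http://example.com/some-page.html"
print("Googlebot allowed:", rp.can_fetch("Googlebot", page))
print("All robots allowed:", rp.can_fetch("*", page))

------------------------------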


3. Cloaking


Google says: "The term "cloaking" is used to describe a website that returns altered web pages to search engines crawling the site. In other words, the web server is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they'll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings."


This is one of the biggest frauds committed by webmasters. In this case, the page shown to the user is very different from the one presented to the search engine. This is usually done to rank at a higher position in search engines and to achieve a high PR. Google has the latest technology and can easily detect this kind of fraudulent activity. So, keep your website clean of cloaking.
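

A crude way to spot cloaking on your own site is to fetch the same URL as a normal browser and as a search engine bot and compare the responses. The Python sketch below does exactly that; the URL and user-agent strings are placeholders, and real detection is of course far more sophisticated.


------------------------------

import urllib.request

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req).read()

url = "http://example.com/"  # placeholder URL
browser_page = fetch_as(url, "Mozilla/5.0 (Windows NT 6.1)")
bot_page = fetch_as(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")

# Identical responses are a good sign; very different ones hint at cloaking.
if browser_page == bot_page:
    print("Same content is served to browsers and bots.")
else:
    print("Different content for browsers vs. bots - possible cloaking.")

------------------------------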


4. Hidden Text and/or Links


Search engines hate this kind of fraud, and your website can be banned permanently for it. It is not used very frequently, but it does appear in some cases. Webmasters hide the text of a webpage in different ways, such as by applying a white color to the text while the page has a white background, or by telling the CSS not to display the text. But as we know, search engines are more intelligent than that.
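

To see what such hiding looks like in markup, here is a naive Python sketch; the patterns and the sample HTML are my own made-up examples. Real crawlers render the page and compute actual styles, so this only flags the most obvious tricks.


------------------------------

import re

# Very naive scan for common text-hiding tricks in HTML.
# Illustration only; the patterns and sample markup are made up.
HIDING_PATTERNS = [
    r"display\s*:\s*none",       # CSS removes the element from view
    r"visibility\s*:\s*hidden",  # element is invisible but keeps its space
    r"font-size\s*:\s*0",        # zero-size text
]

sample_html = '<p style="display:none">cheap keywords stuffed here</p>'

for pattern in HIDING_PATTERNS:
    if re.search(pattern, sample_html, re.IGNORECASE):
        print("Possible hidden text, matched pattern:", pattern)

------------------------------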


5. Keyword Spamming


Keyword density must not be overdone, but most webmasters who want to rank higher very quickly adopt this technique. A webpage should contain a reasonable number of keywords, used in natural phrases. If a search engine finds that certain words are repeated excessively on a page, your rank will be reduced and you may get banned.
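

As a rough self-check before a search engine does it for you, you can measure keyword density yourself. In the Python sketch below, the sample text and the 10% threshold are my own assumptions, not a published Google limit.


------------------------------

from collections import Counter

def keyword_density(text, keyword):
    """Return the fraction of words in the text that equal the keyword."""
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

page_text = "Buy shoes online. Cheap shoes, best shoes, shoes for everyone!"
density = keyword_density(page_text, "shoes")
print(f"Density of 'shoes': {density:.0%}")  # 40% for this stuffed sample

# An arbitrary sanity threshold for this sketch -- not an official limit.
if density > 0.10:
    print("This looks like keyword stuffing; rewrite the copy naturally.")

------------------------------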


 


This is the first part of the post; I can't write the second part right now, as I am busy with exams. As soon as I am free, I will write the second part as well. Your comments are welcome.
