DYDAS Professional Website Design

DreamHost Blocks Googlebot, Hillarity Ensues

By Mark Fulton on May 18, 2007

In a move that may rank as the all-time most ridiculous response to server performance by a hosting company, DreamHost has manually blocked Googlebot from many of its high-traffic accounts via .htaccess files. This has been confirmed in an email sent to the owner of Zoso.ro, a popular Romanian blog, as well as in a DigitalPoint forum topic entitled “Is DreamHost stupid or am I missing something?” by user Johan-cr, who explains that he too received the same email.

Here is a transcript of the email sent to DreamHost accounts:

This email is to inform you that a few of your sites were getting hammered by Google bot. This was causing a heavy load on the webserver, and in turn affecting other customers on your shared server. In order to maintain stability on the webserver, I was forced to block Google bot via the .htaccess file.

order allow,deny
deny from 66.249
allow from all

You also want to consider making your files be unsearchable by robots and crawlers, as that usually contributes to high number of hits. If they hit a dynamic file, like php, it can cause high memory usage and consequently high load…
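For context, the .htaccess directives above use Apache's legacy access-control syntax: with `order allow,deny`, Allow directives are evaluated before Deny, and a matching Deny wins, so any request from the 66.249.0.0/16 range (Google's crawler addresses at the time) is refused while everyone else is let through. The gentler approach the email alludes to, discouraging crawlers from hitting heavy dynamic pages rather than blocking them outright, is normally done with a robots.txt file. A minimal sketch (the paths here are hypothetical examples, not anything DreamHost prescribed):

```
# robots.txt - placed in the site's document root
# Keep crawlers away from CPU-heavy dynamic scripts (example paths)
User-agent: *
Disallow: /cgi-bin/
Disallow: /search.php

# Ask well-behaved bots to wait between requests
# (Crawl-delay is honored by some crawlers; Googlebot ignores it,
# and its crawl rate is instead managed via Google's webmaster tools)
Crawl-delay: 10
```

Unlike the .htaccess block, robots.txt is advisory: it reduces load only for crawlers that choose to obey it, which is presumably why DreamHost reached for a hard block instead.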

As of yesterday, it has been reported that DreamHost corrected the issue and removed the block from several .htaccess files, stating that it was no longer necessary. The company most likely had the best of intentions in trying to improve server stability for accounts with excessive CPU usage. Many, myself included, find the episode quite ironic and funny, mainly because of the buzz around SEO among today’s webmasters.


About Mark Fulton

Mark is the Founder of DotSauce Magazine and a full time web developer, domain investor, SEO and online marketing professional residing in North Carolina, USA. Visit MarkFulton.com for information on freelance website development, SEO and consultation services.