Robots.txt

Disabling indexing of (spam) subdomains

Drupal will respond to any random subdomain such as http://dfkasdf.giraffesdoexist.com/ with your normal site. Spammers and scammers sometimes abuse this behaviour, and search engines (Google, Yandex and others) will then index these bogus subdomains, causing confusion, duplicate content and other SEO-unfriendly effects. This can be fixed by serving a modified robots.txt on the fly for such hosts, one that contains a Disallow directive as follows:
User-agent: *
Disallow: / # disable indexing of the whole site
On Apache this is done with two tweaks:
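A minimal sketch of those two tweaks, assuming mod_rewrite is enabled and using the article's example domain (the file name robots-disallow.txt is illustrative). Tweak 1 is a rewrite rule, placed in .htaccess or the vhost config, that serves an alternative robots.txt to any request whose Host header is not the canonical domain:

```apacheconf
# Tweak 1: for any host other than the canonical domain,
# answer requests for robots.txt with robots-disallow.txt
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?giraffesdoexist\.com$ [NC]
RewriteRule ^robots\.txt$ robots-disallow.txt [L]
```

Tweak 2 is the alternative file itself, robots-disallow.txt, placed in the document root next to the normal robots.txt:

```
User-agent: *
Disallow: /
```

Requests for robots.txt on the real domain still get the normal file; crawlers hitting a spam subdomain get the blanket Disallow instead.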