Disabling indexing of (spam) subdomains

Drupal will respond to any random subdomain, such as http://dfkasdf.giraffesdoexist.com/, with your normal site. Spammers and scammers sometimes abuse this behaviour, and Google (or Yandex, along with other search engines) will index these subdomains, causing confusion, duplicate pages and other nasty SEO-unfriendly effects. This can be fixed by modifying the robots.txt file on the fly to include a Disallow directive as follows:
# disable indexing of the whole site
Disallow: /
On Apache this is done with two tweaks:
  1. Add one line to .htaccess:
    AddHandler server-parsed .txt
    
    This tells the server to run the SSI parser on all .txt files, including robots.txt. Note that this requires mod_include to be enabled; depending on your server configuration, you may also need Options +Includes in the same .htaccess file.
  2. In your robots.txt add the following (remember to change the domain name; note that the wildcard also matches legitimate subdomains such as www, so adjust the expression if your main site lives on one):
    <!--#if expr="%{HTTP_HOST} -strmatch '*.giraffesdoexist.com'" -->
    User-agent: *
    Disallow: /
    <!--#else -->
    User-agent: Mediapartners-Google
    Disallow:
    # and continue with your normal contents here
    <!--#endif-->
    
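The -strmatch operator performs shell-style wildcard matching against the Host header. To sanity-check what the condition above will and will not catch, here is a small Python sketch of the equivalent logic using fnmatch (the hostnames and the is_spam_host helper are illustrative, not part of the Apache setup):

```python
# Sketch of the wildcard matching that the SSI condition performs on
# the Host header. fnmatch uses the same shell-style wildcards as
# Apache's -strmatch operator.
from fnmatch import fnmatch

def is_spam_host(host: str, pattern: str = "*.giraffesdoexist.com") -> bool:
    """Return True when the Host header matches the wildcard pattern
    and the restrictive robots.txt branch would be served."""
    return fnmatch(host.lower(), pattern)

print(is_spam_host("dfkasdf.giraffesdoexist.com"))  # matches -> Disallow: /
print(is_spam_host("giraffesdoexist.com"))          # bare domain does not match
print(is_spam_host("www.giraffesdoexist.com"))      # www matches too -- beware
```

As the last line shows, the bare domain falls through to the normal robots.txt, but www and every other legitimate subdomain would get the Disallow rule, so the pattern may need tightening for your setup.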
Voilà! Requesting http://spam-target.giraffesdoexist.com/robots.txt should now return the Disallow rule and prohibit indexing of the bogus subdomains.