Disabling indexing of (spam) subdomains

Drupal will respond to any random subdomain, such as http://dfkasdf.giraffesdoexist.com/, with your normal site. Spammers and scammers sometimes abuse this behaviour, and Google (or Yandex, along with other search engines) will index these subdomains, causing confusion, duplicate pages and other SEO-unfriendly effects. This can be fixed by modifying the robots.txt file on the fly so that bogus hosts receive a Disallow directive as follows:
Disallow: / # disable whole-site indexing
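
You can confirm the problem by requesting the site with an arbitrary Host header; a quick check with curl, assuming your server answers for giraffesdoexist.com (the example domain used throughout; substitute your own):

    # Any made-up subdomain serves the normal Drupal front page.
    curl -s -H "Host: dfkasdf.giraffesdoexist.com" http://giraffesdoexist.com/ | head
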
On Apache the robots.txt trick is done with a few tweaks:
  1. If you have control over your hosting machine, enable mod_include (the module is called simply "include" in Debian's tooling). On Debian/Ubuntu that is something like the command below; for other distributions, see the LoadModule sketch after this list:
    sudo a2enmod include
    
  2. Add a few lines to .htaccess:
    <IfModule mod_include.c>
       Options +Includes
       AddOutputFilter INCLUDES .txt
    </IfModule>
    
    This tells the server to run the SSI parser on all .txt files, including robots.txt. (Note: this requires AllowOverride Options and FileInfo to be permitted for the directory.)
  3. In your robots.txt add the following (remember to change the domain name; the pattern *.giraffesdoexist.com matches every subdomain, including www, so adjust it if your canonical site lives on www; the expression syntax requires Apache 2.4, see the note after this list):
     <!--#if expr="%{HTTP_HOST} -strmatch '*.giraffesdoexist.com'" -->
     User-agent: *
     Disallow: /
     <!--#else -->
     User-agent: Mediapartners-Google
     Disallow:
     # and continue with your normal contents here
     <!--#endif -->
    
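If your distribution does not ship the a2enmod helper mentioned in step 1, the module can be loaded directly in the main server configuration instead; a minimal sketch, assuming a conventional httpd.conf layout (the module path varies by distribution):

    # httpd.conf: load mod_include so SSI directives are parsed
    LoadModule include_module modules/mod_include.so
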
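The %{HTTP_HOST} -strmatch expression in step 3 uses the expression parser introduced in Apache 2.4. On Apache 2.2 the legacy SSI syntax applies instead; a sketch of the equivalent condition (same example domain):

    <!--#if expr="$HTTP_HOST = /\.giraffesdoexist\.com$/" -->

On Apache 2.4 you can also re-enable the old syntax with SSILegacyExprParser on, should you need to keep an existing robots.txt working.
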
Voilà! http://spam-target.giraffesdoexist.com/robots.txt should now prohibit indexing of bogus subdomains, as the check below confirms.
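To verify, fetch robots.txt once with a spoofed Host header and once normally; a quick check with curl, assuming the example hostnames from above:

    # Bogus subdomain: should return the blanket Disallow rules
    curl -s -H "Host: spam-target.giraffesdoexist.com" http://giraffesdoexist.com/robots.txt
    # Canonical domain: should return your normal robots.txt
    curl -s http://giraffesdoexist.com/robots.txt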