
Nov 08

nginx – loading a different robots.txt file for a different subdomain

This is useful when a web site is reachable under more than one domain, for example when one domain is the real/actual domain and another is used for testing or temporary purposes. If the same content is served under multiple domains, web crawlers such as Googlebot may penalize the site for duplicate content.
The simplest solution is to have nginx serve a different robots.txt depending on the requested host name.

server {
## some code ....
        listen 80;
        root /path/to/webroot/folder;
        index index.html index.htm index.php;
        server_name realdomain.com testdomain.com;

        location = /robots.txt {
                default_type text/plain;
                # On the test domain, answer with a generated
                # "disallow everything" robots.txt; on the real
                # domain, fall through and serve the robots.txt
                # file from the web root as usual.
                if ($host = 'testdomain.com') {
                        return 200 "User-agent: *\nDisallow: /\n";
                }
        }

## some code ....
}
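After reloading nginx you can check the behaviour by requesting robots.txt with an explicit Host header. This is a quick sketch: the 127.0.0.1 address assumes you run curl on the server itself; adjust it if you test from elsewhere.

# Test domain: nginx returns the generated "disallow all" rules
curl -H "Host: testdomain.com" http://127.0.0.1/robots.txt

# Real domain: nginx serves robots.txt from the web root, if one exists
curl -H "Host: realdomain.com" http://127.0.0.1/robots.txt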
