Block Crawlers from HTTPS with Robots.txt

If you allow Googlebot and the other search engine crawlers to index both the HTTPS and HTTP versions of your website, you're going to run into SEO problems. Some search engines treat the HTTPS and HTTP versions of a page as two different pages, creating duplicate content issues and diminishing each page's worth because their incoming link strength (etc.) is, in effect, divided by two.

I was recently advised by SEO consultants I was working with on a site to simply block crawler access to the HTTPS version of the site to clear this up. The problem is, there is nowhere in robots.txt to specify different rules based on whether the connection is secure or not. And on my server, both versions were served from the same file path, /export/, so the HTTP and HTTPS sites shared a single robots.txt. So, I had to get creative...

First, I created a file called robots.php in /export/ like the following:

<?php
header("Content-Type: text/plain; charset=utf-8");
if ($_SERVER['SERVER_PORT'] == 443) {
	// HTTPS request: block all crawlers
	echo "User-agent: *\n";
	echo "Disallow: /\n";
} else {
	// HTTP request: allow everything
	echo "User-agent: *\n";
	echo "Disallow: \n";
}
?>
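One caveat with the port-number check: it assumes Apache is serving HTTPS directly on port 443. If your setup is different (a non-standard port, for example), a sketch like the following, which tests `$_SERVER['HTTPS']` instead, may be more robust. The helper name `request_is_https` is my own, not from the original post:

```php
<?php
// Sketch (assumption, not from the original post): detect HTTPS via
// $_SERVER['HTTPS'], which Apache sets to "on" for secure requests,
// instead of hard-coding port 443.
function request_is_https($server)
{
    // Treat a missing key or the literal string "off" as plain HTTP.
    return !empty($server['HTTPS']) && strtolower($server['HTTPS']) !== 'off';
}

// Drop-in replacement for the condition above:
// if (request_is_https($_SERVER)) { ... block crawlers ... }
?>
```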

Then, I modified the .htaccess file so that robots.php would be called instead of robots.txt:

RewriteEngine On
RewriteBase /
RewriteRule ^robots\.txt$ /robots.php [L]

I was leery about putting the disallow-all in there, but I confirmed in Google's help documentation that this was the appropriate way to do it. And, after implementing it, we saw great results: the HTTPS versions disappeared from the rankings, and their HTTP counterparts rose quickly.

Note: Another way of doing this would be to use .htaccess to show another text file (like robots_https.txt) instead when accessed via HTTPS. I like the PHP approach, though, because it offers more flexibility for other needs (like disallowing the site to be crawled on the dev server, but allowing it on the live server).
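For reference, that .htaccess-only variant could look something like the sketch below, using Apache's %{HTTPS} variable (the filename robots_https.txt is just the example name from the note above):

```apache
RewriteEngine On
RewriteBase /
# When the request arrived over HTTPS, serve the blocking robots file instead
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ /robots_https.txt [L]
```

Requests over plain HTTP fall through the RewriteCond and get the normal robots.txt.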



This post was published on December 3rd, 2010 by Robert James Reese in the following categories: .htaccess, PHP, robots.txt, and SEO. Before using any of the code or other content in this post, you must read and agree to our terms of use.