Whatever you do, you do at your own risk. I can only make recommendations and do not claim this is 100% of the solution; much depends on your environment and other settings, which I cannot guess. Additions and bug fixes are welcome.

I have finished setting up the redirects; now let's see how the search robots and the metrics react. Here is what I ended up with.
The task: configure Nginx to add www everywhere and redirect unconditionally to that form of the site address, and, second and most important, to redirect unconditionally to the HTTPS version.

server {
        listen 80;
        server_name <your_site>;
        return 301 https://www.$server_name$request_uri;
}

server {
        listen 443 ssl http2;
        server_name www.<your_site>;
        server_name_in_redirect off;
        # ... certificates, root and the rest of the HTTPS site configuration continue here (see the sketch below)
}
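
The HTTPS block is shown above only up to the directives relevant to the redirect. For orientation, here is a minimal sketch of how it might continue; the certificate paths and web root below are placeholders I am assuming, not values from the original configuration:

server {
        listen 443 ssl http2;
        server_name www.<your_site>;
        server_name_in_redirect off;

        # placeholder paths — point these at your own certificate, key and site root
        ssl_certificate     /etc/nginx/ssl/your_site.crt;
        ssl_certificate_key /etc/nginx/ssl/your_site.key;

        root /var/www/your_site;
}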

How to exclude robots.txt from this redirect can be found here: Открываем доступ к robots.txt в NGINX при HTTPS-only (opening up access to robots.txt in NGINX when the site is HTTPS-only).


There is not so much that is useful in these lines as an attempt to gather in one place all the "hacks" that have to be done.
So, with full HTTPS, a robots.txt file with the correct Host directive must be created and made available to the robots. That by itself is no problem, but if, as in my case, you have closed off access over HTTP completely, how do you get robots.txt to the crawlers?
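
Before getting to that, for reference: the robots.txt in question is just a plain text file. With the Host directive (honoured by Yandex) it might look roughly like this, using a placeholder domain:

User-agent: *
Disallow:
Host: https://www.<your_site>
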
At the moment, I have not found anything simpler than opening up access on port 80 to robots.txt alone. To do that, in the server section that precedes the HTTPS one (the port 80 block) for your domain, add the following (see the combined sketch further below):

location / {
        # exception: let robots.txt be served over plain HTTP
        location = /robots.txt {
                rewrite ^ /robots.txt break;
        }
}
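
Putting the pieces together, the plain-HTTP server block ends up looking roughly like the sketch below. One subtlety: a return 301 written at server level, as in the first block above, is processed before the location is matched, so for the exception to work the redirect has to move inside location /. The server name and web root here are placeholders:

server {
        listen 80;
        server_name <your_site>;

        # root must point at the directory that actually contains robots.txt
        root /var/www/your_site;

        location / {
                # serve robots.txt over plain HTTP instead of redirecting it
                location = /robots.txt {
                        rewrite ^ /robots.txt break;
                }
                # everything else still goes to the https://www version
                return 301 https://www.$server_name$request_uri;
        }
}

With this in place, http://<your_site>/robots.txt is served directly, while every other plain-HTTP request is redirected to HTTPS.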

See also: Redirect 301 from http to https and from no-www to www

If the article helped you or the information was useful — gratitude should know no borders.
