I recently ran into an issue with the robots.txt file and Google Search Console: it was reporting a 404 error. The problem came down to how Nginx interacts with the Yoast SEO plugin. In my setup, robots.txt is virtual, generated on the fly by Yoast SEO. Nginx returns a 404 because it can't find a physical file; the request still reaches WordPress, where Yoast SEO detects it and generates the content, but the 404 status code is already set, and Googlebot stops there.
Basically, what you need to do is add these lines to your Nginx server block:
location = /robots.txt {
    try_files $uri $uri/ /index.php?$args;
    access_log off;
    log_not_found off;
}
With this rule, Nginx first checks for a physical robots.txt file and, if none exists, hands the request to index.php instead of returning a 404. WordPress then serves the virtual robots.txt generated by Yoast SEO with a proper 200 status.
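For context, here is a minimal sketch of where this rule sits inside a typical WordPress server block. The domain, document root, and PHP-FPM socket path are assumptions — adjust them to match your own setup:

```nginx
server {
    listen 80;
    server_name example.com;        # assumption: your domain
    root /var/www/html;             # assumption: your WordPress root

    # Route robots.txt through WordPress so Yoast SEO can generate it
    location = /robots.txt {
        try_files $uri $uri/ /index.php?$args;
        access_log off;
        log_not_found off;
    }

    # Standard WordPress front controller
    location / {
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;  # assumption: your PHP-FPM socket
    }
}
```

After editing, validate and reload the configuration with `nginx -t && nginx -s reload`, then check the response code with `curl -I https://example.com/robots.txt` — it should now return 200 instead of 404.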