My website was being crawled by Google perfectly until this last February, 2023. The website didn't have any robots.txt until now. Suddenly the Page Indexing live test is failing with this error:
Failed: Robots.txt unreachable.
I added a robots.txt to my React website with the following content:
User-agent: *
Allow: /
I am also able to see this robots.txt in my browser after deployment. I am using AWS for the deployment.
myexample.com/robots.txt
But when I run the live Page Indexing test, it throws:
Failed: Server error (5xx)
Why is it throwing a server error after I updated robots.txt?
Note: It was working perfectly until this last February without robots.txt, and I haven't made any development changes since last December. Why is Google indexing suddenly showing these errors?
I encountered a similar issue with my Next.js 13-based website. I kept receiving a Failed: Server error (5xx) message despite my website functioning properly.
After an extensive search for a solution, I discovered that the problem was caused by my approach to handling internationalization (i18n) on my website.
Specifically, the issue was related to how I was retrieving the locale from the "accept-language" header. To resolve this, I followed the Next.js guidelines on handling i18n (NextJS internationalization) and implemented their recommended method, which resolved the problem and restored my website's proper functioning.
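In case it helps, here is a minimal sketch of that pattern in a Next.js 13 middleware, under some assumptions of my own: the locale list ['en', 'de'] and default 'en' are hypothetical, the Accept-Language header is parsed by hand with a safe fallback, and /robots.txt is explicitly passed through so crawlers never hit a locale redirect or an error there.

// middleware.ts (project root) — a sketch, not the exact code from my site
import { NextRequest, NextResponse } from 'next/server'

// Hypothetical locales for illustration; replace with your own.
const locales = ['en', 'de']
const defaultLocale = 'en'

// Pick the first supported language from the Accept-Language header,
// falling back to the default instead of failing on unexpected values.
function getLocale(request: NextRequest): string {
  const header = request.headers.get('accept-language') ?? ''
  const preferred = header
    .split(',')
    .map((part) => part.split(';')[0].trim().toLowerCase())
  for (const lang of preferred) {
    const base = lang.split('-')[0]
    if (locales.includes(base)) return base
  }
  return defaultLocale
}

export function middleware(request: NextRequest) {
  const { pathname } = request.nextUrl

  // Let robots.txt and Next.js internals through untouched; redirecting or
  // erroring on these paths is what can surface as a 5xx in Search Console.
  if (pathname === '/robots.txt' || pathname.startsWith('/_next')) {
    return NextResponse.next()
  }

  // Redirect to a locale-prefixed path only when the prefix is missing.
  const hasLocale = locales.some(
    (locale) => pathname === `/${locale}` || pathname.startsWith(`/${locale}/`)
  )
  if (!hasLocale) {
    const url = request.nextUrl.clone()
    url.pathname = `/${getLocale(request)}${pathname}`
    return NextResponse.redirect(url)
  }

  return NextResponse.next()
}

The key design point is that locale detection only ever falls back to a default and never throws, and that crawler-facing static paths bypass the i18n logic entirely.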
Although I don't know your exact setup, I hope this gives you some useful guidance or an idea of where the issue might be.