I’ve had a basic WordPress blog for more than a decade on various hosts and never had an issue with Google Search Console until switching to HostGator a month ago. Since then, after authorizing the account in Google Search Console, I get a “Couldn’t fetch” error for the sitemap and a “Failed: Robots.txt unreachable” error.
I’ve created the robots.txt and sitemap .xml files both through various plugins and manually, with no luck, and I’ve waited many days between attempts for the changes to take effect.
I nuked the whole install and reinstalled with just the default plugins. Still nothing.
The HostGator chat has been little help.
Could it be a server setting on their end I need to specifically ask about?
Threestorks.com/robots.txt
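In case it helps anyone diagnose this, a quick check you can run from any machine (assuming `curl` is installed; the Googlebot user-agent string here is just illustrative) is to request the files the way a crawler would and look at the HTTP status code. A `200` means the file is reachable; `403`/`5xx` or `000` (no connection) would point to a server-side block, which is worth raising with HostGator support:

```shell
#!/bin/sh
# Fetch a URL the way a crawler might and report only the HTTP status.
# HTTP 000 means no connection could be made at all (DNS/firewall/timeout).

check_url() {
  url="$1"
  # -s: silent, -o /dev/null: discard body, -w: print status code,
  # -A: send a Googlebot-like user agent, -m/--connect-timeout: don't hang
  status=$(curl -s -o /dev/null -w '%{http_code}' \
    -A 'Googlebot/2.1 (+http://www.google.com/bot.html)' \
    --connect-timeout 5 -m 10 "$url")
  echo "$url -> HTTP $status"
}

check_url "https://threestorks.com/robots.txt"
check_url "https://threestorks.com/sitemap.xml"
```

If the plain request returns 200 but Search Console still reports unreachable, that suggests a firewall or security rule filtering Google's crawler IPs specifically, which only the host can confirm.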
PS: this isn’t mission critical. Just a blog I write on here and there.