Sitemap URLs blocked by robots.txt


#1

I uploaded a simple sitemap.txt to my site and submitted it to Google Search Console under Sitemaps. On testing the sitemap, I get “Sitemap contains URLs which are blocked by robots.txt.” Per a FileZilla search of the remote site, a robots.txt file doesn’t seem to exist on my site.


#2

Did you use FileZilla to upload the sitemap.txt? If not, how did you upload it?


#3

Yes, via SFTP.


#4

What’s the URL of your site? Please provide that information so that people here can investigate and give you more meaningful suggestions on how to fix this.


#5

http://www.morningmistquiltstudio.com/


#6

So, the file does exist on your site, and it is (correctly) blocking the WordPress admin interface:

http://www.morningmistquiltstudio.com/robots.txt

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

I can’t check the sitemap because it isn’t at the default location: http://www.morningmistquiltstudio.com/sitemap.xml


#7

I’m using a sitemap.txt because I thought it would be simpler and adequate for our mostly static site.
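For reference, Google’s plain-text sitemap format is just one fully qualified URL per line in a UTF-8 file, with no headers or markup. A minimal sketch (the second entry is a hypothetical page path, purely for illustration):

```
http://www.morningmistquiltstudio.com/
http://www.morningmistquiltstudio.com/about.html
```

That is a perfectly valid choice for a small static site; the file just has to be reachable at the URL you submit to Search Console.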


#8

http://www.morningmistquiltstudio.com/sitemap.txt is still 404 (file not found)…


#9

OK, thanks. I’ll go hunt down that problem.