Sitemap URLs blocked by robots.txt

I uploaded a simple sitemap.txt to my site and submitted it to Google Search Console under Sitemaps. On testing the sitemap, I get “Sitemap contains URLs which are blocked by robots.txt.” However, per a FileZilla search of the remote site, a robots.txt file doesn’t seem to exist on my site.

Did you use FileZilla to upload the sitemap.txt? If not, how did you upload it?

Yes, via SFTP.

What’s the URL of your site? Please provide that information so that people here can investigate and give you more meaningful suggestions on how to fix this.

http://www.morningmistquiltstudio.com/

So, the file does exist on your site, and it is (correctly) blocking the admin interface of WordPress:

http://www.morningmistquiltstudio.com/robots.txt

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Those rules only block the admin area, so they shouldn’t affect normal page URLs. I can’t check the sitemap itself because it’s not in the default place: http://www.morningmistquiltstudio.com/sitemap.xml

I’m using a sitemap.txt because I thought it would be simpler and adequate for our mostly static site.
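
As I understand it, a plain-text sitemap is just one full URL per line (UTF-8, nothing else), so mine looks along these lines (the paths here are only illustrative, not my actual pages):

http://www.morningmistquiltstudio.com/
http://www.morningmistquiltstudio.com/about/
http://www.morningmistquiltstudio.com/gallery/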

http://www.morningmistquiltstudio.com/sitemap.txt is still 404 (file not found)…
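
If you have shell access, you can confirm whether the upload actually landed where you think it did, assuming curl is available:

curl -I http://www.morningmistquiltstudio.com/sitemap.txt

A 200 response means the file is reachable at that URL; a 404 means it isn’t at the path you submitted to Search Console (for a root-level sitemap.txt, that’s the same directory robots.txt lives in).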

Ok, thanks. I’ll go hunt down that problem.