Robots.txt file global /disallow
We set up this plugin on a number of new domains and after a few days noticed that none of them were being indexed. On closer inspection we found that the generated /robots.txt files had been populated as:
User-agent: *
Disallow: /

The robots.txt file is not available to edit via FTP, nor was there any setting in the plugin directing it to block the entire site.
Perhaps it's a permissions issue where the plugin can't write to the server? We've manually added a robots.txt file and it appears to be okay now, but the Robots Meta Settings options are still not being applied.
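For anyone hitting the same thing: `Disallow: /` under `User-agent: *` tells every crawler to skip the whole site, while an empty `Disallow:` allows everything. If you want to verify how crawlers will read a generated robots.txt, Python's standard `urllib.robotparser` can check it locally (the example.com URLs below are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# The file the plugin generated -- blocks the entire site for all crawlers
broken = RobotFileParser()
broken.parse(["User-agent: *", "Disallow: /"])
print(broken.can_fetch("*", "https://example.com/some-page"))  # False

# A permissive file -- an empty Disallow allows everything
fixed = RobotFileParser()
fixed.parse(["User-agent: *", "Disallow:"])
print(fixed.can_fetch("*", "https://example.com/some-page"))  # True
```

Running this against the plugin's output is a quick way to confirm whether indexing is being blocked by robots.txt itself or by something else (e.g. a noindex meta tag).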
Viewing 3 replies - 1 through 3 (of 3 total)
The topic ‘Robots.txt file global /disallow’ is closed to new replies.