Cosmos-Server
[FEAT]: robots.txt per route
What happened?
I’m attempting to set a custom robots.txt on a Ghost server running on my domain, but nothing works, and I can’t figure out why. No matter what I do, it remains:

```
User-agent: *
Disallow: /
```
I would like this server to be accessible to search engines, but I cannot set it. I see a recent change to robots.txt was made to Cosmos that produces the same output and want to confirm this is not the cause.
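For reference, under standard robots.txt conventions (generic syntax, nothing specific to Cosmos or Ghost), a file that permits all crawlers uses an empty `Disallow` rule:

```
User-agent: *
Disallow:
```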
What should have happened?
When I uploaded a robots.txt file to my Ghost theme, it should have been served, but it was not. This may be related to Cosmos rather than Ghost.
How to reproduce the bug?
- Load a Ghost server
- Upload a theme with a dedicated robots.txt file
- Load robots.txt
- See the `Disallow: /` message
Relevant log output
No response
Other details
I see that a recent feature added a robots.txt to keep the Cosmos server out of search engines, but even after clearing the site’s cache and removing cookies, the old file is still served.
System details
Client:
- OS: Nobara Linux
- Browser: Vivaldi
Server:
- OS: Pop!_OS
- Version: 15.0
Update: I turned on the “Allow search engines to index your server” option in Cosmos and cleared my Cloudflare cache, and the correct robots.txt appeared. When I turned the option back off and cleared the cache again, the Disallow rule returned.
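As a quick sanity check, the two variants can be compared with Python’s standard `urllib.robotparser`. This is a generic illustration, not part of Cosmos; the two content strings mirror the files described above:

```python
from urllib import robotparser

def allows_root(robots_body: str) -> bool:
    # Parse a robots.txt body and ask whether any crawler may fetch "/".
    rp = robotparser.RobotFileParser()
    rp.parse(robots_body.splitlines())
    return rp.can_fetch("*", "/")

# What Cosmos serves while indexing is disallowed:
blocked = "User-agent: *\nDisallow: /\n"
# What is served once "Allow search engines to index your server" is on:
open_file = "User-agent: *\nDisallow:\n"

print(allows_root(blocked))    # False: all paths are blocked
print(allows_root(open_file))  # True: empty Disallow permits everything
```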
This really should be a per-domain feature, because there are some sites I want indexed and others I don’t.
I will probably add a per-domain checkbox, yes
Good call—thank you!