django-robots
A Django app for managing robots.txt files following the robots exclusion protocol
Confirm support for Django 5.0
updates:
- [github.com/pre-commit/pre-commit-hooks: v4.5.0 → v4.6.0](https://github.com/pre-commit/pre-commit-hooks/compare/v4.5.0...v4.6.0)
- [github.com/pycqa/isort: 5.12.0 → 5.13.2](https://github.com/pycqa/isort/compare/5.12.0...5.13.2)
- [github.com/PyCQA/flake8: 6.1.0 → 7.1.1](https://github.com/PyCQA/flake8/compare/6.1.0...7.1.1)
- [github.com/psf/black: 23.11.0 → 24.8.0](https://github.com/psf/black/compare/23.11.0...24.8.0)
Users need a way to manage a `Clean-param` section. Since this directive is supported only by [Yandex](https://yandex.ru/support/webmaster/robot-workings/clean-param.html?ysclid=luu17zl6mg204390020&lang=en), it was not possible to find a working solution for the administration of this...
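For context, a `Clean-param` group as documented by Yandex lists the query parameters to ignore, optionally followed by a path prefix. The parameter names and path below are purely illustrative:

```
User-agent: Yandex
Clean-param: ref&utm_source /catalog/
```

This tells Yandex's crawler to treat `/catalog/` URLs that differ only in `ref` or `utm_source` as the same page.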
This was only needed for Django < 3.2, which is now EOL. With newer versions of Django it issues a RemovedInDjango41Warning.
django-robots version 5.0: I am using the above version of django-robots. When the database is SQLite, things work fine, but when the database is MySQL, it gives the following error while running...
It would be a lovely addition if there were no longer a need for the Django Sites package. The current Sitemaps package has a brilliant fallback: I think you might...
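The fallback being referenced is the one in `django.contrib.sites.shortcuts.get_current_site`, which uses a `RequestSite` built from the request when the sites framework is not installed. A minimal, framework-free sketch of that idea (all classes here are illustrative stand-ins, not django-robots API):

```python
# Sketch of the "fall back to the request" behaviour the Sitemaps
# framework gets from get_current_site(). Stand-in classes are used
# so the sketch runs without Django installed.

class RequestSite:
    """Stand-in for django.contrib.sites.requests.RequestSite."""
    def __init__(self, request):
        self.domain = self.name = request.get_host()

def get_current_site(request, configured_site=None):
    # With the sites framework available, Django would return the
    # configured Site object; without it, derive one from the request.
    if configured_site is not None:
        return configured_site
    return RequestSite(request)

class FakeRequest:
    """Illustrative request exposing only get_host()."""
    def get_host(self):
        return "example.com"

site = get_current_site(FakeRequest())
print(site.domain)  # example.com
```

django-robots could adopt the same pattern so that a `Site` row is no longer mandatory.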
I've been parsing my own robots.txt file with Python and found an interesting compatibility scenario: If you create multiple Robot records with the same user-agent, they are spaced apart by...
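The compatibility concern can be reproduced with the standard library's parser: when a robots.txt contains two separate groups for the same user-agent (as can happen with multiple Robot records), `urllib.robotparser` honours only the first matching group. The paths below are illustrative:

```python
import urllib.robotparser

# Two groups for the same user-agent, as django-robots can emit when
# multiple Robot records share a user-agent.
content = """\
User-agent: *
Disallow: /private/

User-agent: *
Disallow: /tmp/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(content.splitlines())

# The first "*" group wins; the second group's rules are ignored,
# so /tmp/ is still considered fetchable.
print(rp.can_fetch("*", "/private/page"))  # False
print(rp.can_fetch("*", "/tmp/page"))      # True
```

Other crawlers may merge duplicate groups instead, which is exactly the kind of inconsistency worth avoiding by emitting a single group per user-agent.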
Google and other search engines look for the robots.txt file only in the root directory. > http://example.com/folder/robots.txt > is not a valid robots.txt location. Crawlers don't check for robots.txt files in...
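This root-level requirement is why django-robots is normally wired at the top of the URLconf. A typical hookup might look like the following, assuming `robots` is in `INSTALLED_APPS` and the app's URLconf module is `robots.urls` (shown as a sketch, not copied from the project docs):

```python
# urls.py -- serve robots.txt at the site root, where crawlers look for it.
from django.urls import include, re_path

urlpatterns = [
    re_path(r"^robots\.txt", include("robots.urls")),
]
```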
By default, django-robots generates the following:
```
User-agent: *
Disallow:
Sitemap: https://mysite.com/sitemap.xml
```
Bing's robots.txt tester reports an error on line 2 (the empty `Disallow:`).
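Per RFC 9309, an empty `Disallow:` is valid and simply means nothing is disallowed, but some validators still flag it. Assuming the intent is to allow all crawling, an equivalent form that testers generally accept replaces the empty directive with an explicit allow (sitemap URL kept from the report above):

```
User-agent: *
Allow: /

Sitemap: https://mysite.com/sitemap.xml
```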
It would be nice if django-robots included a decorator to block robots from views based on User-agent (like `robots.txt`). It would help Django apps outright prevent robots - even misbehaving...
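Such a decorator is not part of django-robots; a hypothetical sketch of the idea follows. The agent list is illustrative, and a minimal stand-in replaces `django.http.HttpResponseForbidden` so the sketch runs without Django installed:

```python
from functools import wraps

# Illustrative blocklist; a real implementation might read this
# from settings or from django-robots' Robot records.
BLOCKED_AGENTS = ("googlebot", "bingbot", "yandex")

class HttpResponseForbidden:
    """Stand-in for django.http.HttpResponseForbidden."""
    status_code = 403
    def __init__(self, content=""):
        self.content = content

def block_robots(view_func):
    """Reject requests whose User-Agent matches a known crawler."""
    @wraps(view_func)
    def wrapper(request, *args, **kwargs):
        ua = request.META.get("HTTP_USER_AGENT", "").lower()
        if any(bot in ua for bot in BLOCKED_AGENTS):
            return HttpResponseForbidden("robots not allowed")
        return view_func(request, *args, **kwargs)
    return wrapper

class FakeRequest:
    """Tiny fake request carrying only the META dict."""
    def __init__(self, ua):
        self.META = {"HTTP_USER_AGENT": ua}

@block_robots
def my_view(request):
    return "ok"

print(my_view(FakeRequest("Mozilla/5.0")))                # ok
print(my_view(FakeRequest("Googlebot/2.1")).status_code)  # 403
```

Note that this only deters robots that send an honest User-Agent header; a misbehaving crawler can spoof it, so the decorator is a complement to, not a replacement for, robots.txt.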