URL blacklist to prevent wasting tokens
Duplicates
- [X] I have searched the existing issues
Summary 💡
As Auto-GPT is a work in progress, it is common to start over again and again. In this process, Google keeps returning the same sites, which in some cases provide no real value. I would appreciate it if such URLs could be blacklisted manually. This would also help to prevent loops in some cases.
In theory, automatically adding sites to such a list would also make sense, but it would need to be used carefully. For example, a setting could enable automatically blacklisting sites that return HTTP errors (4xx, 5xx).
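A minimal sketch of how this could work, assuming a plain-text blacklist file and a wrapper around page fetching. The names `UrlBlacklist`, `BLACKLIST_FILE`, and `fetch` are hypothetical and not part of Auto-GPT's existing code:

```python
import urllib.error
import urllib.request

# Hypothetical location of the manual/automatic blacklist file.
BLACKLIST_FILE = "url_blacklist.txt"


class UrlBlacklist:
    def __init__(self, path: str = BLACKLIST_FILE):
        self.path = path
        try:
            with open(path) as f:
                # One URL per line; blank lines and '#' comments are skipped,
                # so users can maintain the list by hand.
                self.urls = {
                    line.strip() for line in f
                    if line.strip() and not line.startswith("#")
                }
        except FileNotFoundError:
            self.urls = set()

    def is_blocked(self, url: str) -> bool:
        return url in self.urls

    def add(self, url: str) -> None:
        """Add a URL (manually or automatically) and persist it."""
        if url not in self.urls:
            self.urls.add(url)
            with open(self.path, "a") as f:
                f.write(url + "\n")


def fetch(url: str, blacklist: UrlBlacklist) -> str | None:
    """Skip blacklisted URLs; optionally auto-blacklist 4xx/5xx responses."""
    if blacklist.is_blocked(url):
        return None  # don't waste tokens re-reading known-bad sites
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as e:
        if 400 <= e.code < 600:
            # Automatic addition; in practice this would be guarded by
            # a configuration setting, as suggested above.
            blacklist.add(url)
        return None
```

The blacklist check would run before every browse/search result fetch, so a site that once returned a 404 is filtered out of all later attempts within and across runs.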
Examples 🌈
In some cases, sites are accessed that return a 404 or cover a completely different topic. It makes no sense to try to access such sites again.
Motivation 🔦
I love Auto-GPT, but I hate to see it run over the same sites again and again, providing no value and costing tokens. Support for a URL blacklist would increase the efficiency and effectiveness of Auto-GPT.