Add an option to prevent publishing incomplete indices
Describe the problem
We enjoy using the `maxLostRecordsPercentage` option in the `safetyChecks`. But sometimes the crawler fails to fetch one or more of our pages, resulting in a number of lost records that stays below this threshold. The crawler then happily proceeds to publish the incomplete index.
We'd love to be looser about losing records to allow for changes in our content, but have zero tolerance for failed URLs.
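For context, a minimal sketch of the kind of configuration in question, assuming the `beforeIndexPublishing` placement described in the Crawler docs (the threshold value is illustrative):

```ts
// Illustrative Crawler configuration excerpt (values are examples).
const config = {
  safetyChecks: {
    beforeIndexPublishing: {
      // Abort publishing if more than 10% of records would be lost...
      maxLostRecordsPercentage: 10,
      // ...but a few failed pages can still slip under this threshold.
    },
  },
};
```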
Describe the solution
We'd love to be able to set something like a `maxFailedUrls` option in the `safetyChecks` that blocks the crawler from publishing the index if more than that number of URLs failed to be crawled.
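Hypothetically, the configuration could look like the sketch below; the option name is the proposal, not an existing API, and the `beforeIndexPublishing` placement mirrors where `maxLostRecordsPercentage` lives today:

```ts
// Proposed option, not an existing API at the time of writing.
const config = {
  safetyChecks: {
    beforeIndexPublishing: {
      maxLostRecordsPercentage: 10, // stay tolerant of content changes
      maxFailedUrls: 0,             // but refuse to publish if any URL failed
    },
  },
};
```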
Additionally, it would be helpful if the crawler could retry fetching failed URLs with some backoff pattern, as sketched below.
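A minimal sketch of what such a retry policy could look like, using exponential backoff with jitter; the helper and its parameters are illustrative, not part of the Crawler API:

```ts
// Illustrative retry helper; not part of the Crawler API.
async function fetchWithBackoff(
  url: string,
  maxAttempts = 4,
  baseDelayMs = 500,
): Promise<Response> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      const res = await fetch(url);
      if (res.ok) return res;
      throw new Error(`HTTP ${res.status} for ${url}`);
    } catch (err) {
      // Give up after the final attempt.
      if (attempt === maxAttempts - 1) throw err;
      // Exponential backoff with jitter: 500ms, 1s, 2s, ... plus up to 250ms.
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw new Error("unreachable");
}
```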
Alternatives you've considered
@Janpot, thanks, this is in the team's backlog.
@Janpot We have recently added a `maxFailedUrls` safety check. You can now enable it in your crawler configuration and set the threshold for failed URLs. You can find the documentation [here](https://www.algolia.com/doc/tools/crawler/getting-started/crawler-configuration/#safety-checks).
Thank you. Started trying it out on our setup.