docsearch

Add an option to prevent publishing incomplete indices

Open · Janpot opened this issue 11 months ago

Describe the problem

We enjoy using the `maxLostRecordsPercentage` option in the `safetyChecks`. But sometimes the crawler fails to fetch one or more of our pages, and the resulting number of lost records stays below this threshold, so the crawler happily proceeds to publish the incomplete index. We'd love to stay loose about losing records, to allow for changes in our content, while having zero tolerance for failed URLs.
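For context, this is roughly how we configure the existing check in an Algolia Crawler configuration (a minimal sketch; the threshold value and the elided fields are illustrative):

```js
// Crawler configuration (sketch) – only the safety-check part shown
{
  // ...appId, apiKey, startUrls, actions, etc. elided...
  safetyChecks: {
    beforeIndexPublishing: {
      // Abort publishing if the new index would lose more than
      // this percentage of records compared to the previous one.
      maxLostRecordsPercentage: 10,
    },
  },
}
```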

Describe the solution

We'd love to be able to set something like a `maxFailedUrls` option in the `safetyChecks` that blocks the crawler from publishing the index if any URLs failed to be crawled. Additionally, it would be helpful if the crawler could retry fetching failed URLs with some backoff pattern, as sketched below.
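Something along these lines (purely hypothetical at the time of this request; `maxFailedUrls` and the `retry` block are invented names, not existing options):

```js
{
  safetyChecks: {
    beforeIndexPublishing: {
      maxLostRecordsPercentage: 10, // stay loose about content churn
      maxFailedUrls: 0,             // hypothetical: block publishing if any URL failed
    },
  },
  // Hypothetical retry policy for failed fetches
  retry: {
    maxAttempts: 3,
    backoff: 'exponential', // e.g. 1s, 2s, 4s between attempts
  },
}
```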

Alternatives you've considered

Janpot avatar Mar 21 '24 15:03 Janpot

@Janpot, thanks, this is in the team's backlog.

randombeeper avatar Jul 11 '24 19:07 randombeeper

@Janpot We have recently added a `maxFailedUrls` safety check. You can now enable it in your crawler configuration and set the maximum number of failed URLs you're willing to tolerate. You can find the documentation [here](https://www.algolia.com/doc/tools/crawler/getting-started/crawler-configuration/#safety-checks).
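For anyone landing here later, enabling the new check looks roughly like this (a sketch based on the linked docs; check them for the exact option placement and defaults):

```js
{
  safetyChecks: {
    beforeIndexPublishing: {
      maxLostRecordsPercentage: 10,
      // Block publishing if more than this many URLs failed to be crawled;
      // 0 gives the zero-tolerance behavior requested above.
      maxFailedUrls: 0,
    },
  },
}
```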

TatyanaHerman avatar Dec 04 '24 21:12 TatyanaHerman

Thank you. Started trying it out on our setup.

Janpot avatar Dec 05 '24 12:12 Janpot