
Option To Not Retry Failed Jobs?

Open RangerVinven opened this issue 6 months ago • 1 comment

Hi, I'm building an automation program that uses the scraper. Speed is quite important to me, but when the scraper has failed jobs, it retries them very slowly. For example, I get the following output:

{"level":"info","component":"scrapemate","numOfJobsCompleted":98,"numOfJobsFailed":6,"lastActivityAt":"2025-06-30T09:36:20.868298826Z","speed":"98.00 jobs/min","time":"2025-06-30T09:36:26.66053663Z","message":"scrapemate stats"}
{"level":"info","component":"scrapemate","numOfJobsCompleted":98,"numOfJobsFailed":6,"lastActivityAt":"2025-06-30T09:36:20.868298826Z","speed":"49.00 jobs/min","time":"2025-06-30T09:37:26.660376592Z","message":"scrapemate stats"}

The "jobs/min" keeps dropping, and the program usually hangs like this for at least a couple of minutes. I've worked around it by lowering the -exit-on-inactivity flag, but I'd rather not have to wait out that duration at all. I think it'd be a good feature to have a flag like "-dont-retry-failed" that simply tells the program to exit without retrying the failed jobs.

RangerVinven avatar Jun 30 '25 09:06 RangerVinven

The max retries is currently hardcoded. I will add a configuration option to customize it.

gosom avatar Aug 23 '25 09:08 gosom