
uniqueId causes the job to never finish...

Open · vFire opened this issue 4 years ago • 3 comments

When I set the uniqueId in the queue job class, the job never finishes anymore... the queue gets blocked...
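For context, the setup being described here is roughly the following, a minimal sketch assuming a crawl wrapped in a unique Laravel job (the class name, constructor, and crawl body are illustrative, not the actual code from this issue). Note that Laravel holds the unique lock for `uniqueId()` in the application's cache store (e.g. Redis), so a stale or shared lock key can silently block further dispatches.

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class CrawlSiteJob implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public string $siteUrl;

    public function __construct(string $siteUrl)
    {
        $this->siteUrl = $siteUrl;
    }

    // The unique lock key. While a lock for this id exists in the cache,
    // no second instance of the job will be dispatched.
    public function uniqueId(): string
    {
        return $this->siteUrl;
    }

    public function handle(): void
    {
        // crawl $this->siteUrl here ...
    }
}
```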

vFire · Feb 22 '21 10:02

Hello @vFire

Could you provide an example of your code for this? Which versions are you running?

Cheers, Peter

spekulatius · Feb 22 '21 15:02

Hi Spekulatius,

Thanks for your quick response. It turned out to be caused by my local .env file and app config: running through the browser and through the local command line would sometimes mix up the keys in Redis. I've fixed it now. I also realized that once the queue is running, even if I define everything in a Laravel job, the queue keeps running independently, so if I want to stop it I have to kill the worker directly. My workaround is to give each queue run just 100 URLs to process and schedule the job to run every minute (roughly the pattern sketched below). That works for me in production, although I think it will still need some tuning to improve efficiency as more and more sites need to be crawled in parallel.

If you know of a better solution, please let me know, much appreciated!
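The scheduling pattern described above could look roughly like this, a minimal sketch assuming a batch job that picks up at most 100 pending URLs per run (`CrawlUrlBatchJob` is an illustrative name, not code from this issue):

```php
<?php

// app/Console/Kernel.php

namespace App\Console;

use App\Jobs\CrawlUrlBatchJob;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // Dispatch one batch job per minute; each run crawls at most 100 URLs.
        $schedule->job(new CrawlUrlBatchJob(100))->everyMinute();
    }
}
```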

vFire · Feb 24 '21 08:02

Hey @vFire

Sounds like you found a solution. Maybe you could also ensure the job isn't run twice in parallel? https://laravel.com/docs/8.x/queues#preventing-job-overlaps might help.
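For reference, the "preventing job overlaps" approach from the linked docs uses the `WithoutOverlapping` job middleware in Laravel 8.x. A minimal sketch, reusing the illustrative job class from above:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;
use Illuminate\Queue\SerializesModels;

class CrawlSiteJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public string $siteUrl;

    public function __construct(string $siteUrl)
    {
        $this->siteUrl = $siteUrl;
    }

    // WithoutOverlapping releases the job back onto the queue if another
    // instance holding the same lock key (here: the site URL) is still running.
    public function middleware()
    {
        return [new WithoutOverlapping($this->siteUrl)];
    }

    public function handle(): void
    {
        // crawl $this->siteUrl here ...
    }
}
```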

Cheers, Peter

spekulatius · Feb 24 '21 16:02