
Cannot connect to Redis issue

Open · dchun opened this issue 2 years ago · 1 comment

Every time Heroku runs its regular maintenance for Redis, I get a timeout issue. It used to happen for a day or so and then go away, but now the error won't stop for some reason.

It seems like sidekiq-throttled is trying to run a job with the old URL from before Heroku updated the Redis URL as part of its upgrade, and I believe that is what is causing the issue.

I tried to kill the jobs by adding sidekiq-status and setting an expiration on Sidekiq jobs, but I get the same problem. How would I kill the jobs that keep hitting the timeout?
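For reference, one way to kill jobs that keep failing is to walk Sidekiq's retry and scheduled sets from a console. This is only a sketch; the class-name filter is a hypothetical placeholder, not something from this app:

require "sidekiq/api"

# Delete everything waiting in the retry and scheduled sets
# (optionally filter by worker class before deleting).
[Sidekiq::RetrySet.new, Sidekiq::ScheduledSet.new].each do |set|
  set.each do |job|
    job.delete # e.g. job.delete if job.klass == "SomeThrottledWorker"
  end
end

# Jobs that already exhausted their retries land in the dead set.
Sidekiq::DeadSet.new.clear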

This is my log:

2022-01-22T17:16:29.482888+00:00 app[worker.1]: pid=8 tid=ouainxnc4 WARN: Redis::CannotConnectError: Error connecting to Redis on ec2-44-194-160-154.compute-1.amazonaws.com:8449 (Redis::TimeoutError)
2022-01-22T17:16:29.482914+00:00 app[worker.1]: pid=8 tid=ouainxnc4 WARN: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:362:in `rescue in establish_connection'
2022-01-22T17:16:29.482915+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:343:in `establish_connection'
2022-01-22T17:16:29.482915+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:106:in `block in connect'
2022-01-22T17:16:29.482916+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:306:in `with_reconnect'
2022-01-22T17:16:29.482916+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:105:in `connect'
2022-01-22T17:16:29.482916+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/newrelic_rpm-6.7.0.359/lib/new_relic/agent/instrumentation/redis.rb:120:in `connect'
2022-01-22T17:16:29.482916+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:381:in `ensure_connected'
2022-01-22T17:16:29.482917+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:231:in `block in process'
2022-01-22T17:16:29.482917+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:319:in `logging'
2022-01-22T17:16:29.482918+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:230:in `process'
2022-01-22T17:16:29.482918+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis/client.rb:125:in `call'
2022-01-22T17:16:29.482918+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/newrelic_rpm-6.7.0.359/lib/new_relic/agent/instrumentation/redis.rb:74:in `call'
2022-01-22T17:16:29.482919+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis.rb:1956:in `block in zrangebyscore'
2022-01-22T17:16:29.482919+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis.rb:52:in `block in synchronize'
2022-01-22T17:16:29.482920+00:00 app[worker.1]: /app/vendor/ruby-2.6.3/lib/ruby/2.6.0/monitor.rb:230:in `mon_synchronize'
2022-01-22T17:16:29.482920+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis.rb:52:in `synchronize'
2022-01-22T17:16:29.482920+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/redis-4.1.3/lib/redis.rb:1955:in `zrangebyscore'
2022-01-22T17:16:29.482920+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/scheduled.rb:18:in `block (2 levels) in enqueue_jobs'
2022-01-22T17:16:29.482921+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/scheduled.rb:16:in `each'
2022-01-22T17:16:29.482921+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/scheduled.rb:16:in `block in enqueue_jobs'
2022-01-22T17:16:29.482921+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq.rb:97:in `block in redis'
2022-01-22T17:16:29.482921+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:65:in `block (2 levels) in with'
2022-01-22T17:16:29.482922+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `handle_interrupt'
2022-01-22T17:16:29.482922+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `block in with'
2022-01-22T17:16:29.482922+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `handle_interrupt'
2022-01-22T17:16:29.482922+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `with'
2022-01-22T17:16:29.482922+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq.rb:94:in `redis'
2022-01-22T17:16:29.482923+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/scheduled.rb:15:in `enqueue_jobs'
2022-01-22T17:16:29.482923+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/scheduled.rb:78:in `enqueue'
2022-01-22T17:16:29.482923+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/scheduled.rb:70:in `block in start'
2022-01-22T17:16:29.482923+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/util.rb:15:in `watchdog'
2022-01-22T17:16:29.482923+00:00 app[worker.1]: /app/vendor/bundle/ruby/2.6.0/gems/sidekiq-6.0.3/lib/sidekiq/util.rb:24:in `block in safe_thread'

And this is my sidekiq.rb config file:

require "sidekiq/throttled"
Sidekiq::Throttled.setup!

require 'sidekiq-status'
Sidekiq.configure_client do |config|
  config.client_middleware do |chain|
    chain.add Sidekiq::Status::ClientMiddleware, expiration: 30.minutes
  end
end
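If the stale-URL theory is right, one thing that may help is pointing Sidekiq at the Redis URL explicitly at boot, so a dyno restart after Heroku's maintenance picks up the rotated endpoint. A minimal sketch, assuming the connection comes from Heroku's REDIS_URL config var:

# config/initializers/sidekiq.rb (sketch)
redis_config = { url: ENV.fetch("REDIS_URL") }

Sidekiq.configure_server do |config|
  config.redis = redis_config
end

Sidekiq.configure_client do |config|
  config.redis = redis_config
end

Heroku restarts dynos when a config var changes, so the new URL should normally already be in ENV after the maintenance window; a manual restart of the worker dyno also forces the processes to reconnect.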

dchun commented on Jan 22 '22 18:01

It seems the Redis URL is being cached somewhere, and when the URL is updated, throttled jobs are still trying to retrieve items in the queue from the old URL. Not sure where I can find this.
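One way to check which endpoint the workers are actually hitting is to ask the live connection from a console on the worker dyno and compare it with REDIS_URL. A sketch, relying on Redis#connection from the redis-rb 4.x shown in the stack trace:

# From a Rails/Sidekiq console on the worker dyno:
Sidekiq.redis { |conn| puts conn.connection.inspect } # => { host:, port:, db:, id:, location: }
puts ENV["REDIS_URL"]

If the host and port printed by the first line differ from REDIS_URL, the process is still holding a connection built from the old value and needs a restart.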

dchun commented on Jan 25 '22 06:01

I have refactored the fetch class implementation and removed the IPC based on pub/sub. That should eliminate sidekiq-throttled from the equation in the above problem. But I will be happy to debug further; I will need a few more details though.

ixti commented on May 30 '23 02:05