feat: non-blocking tasks
Non-blocking sleep via a temporary, user-requested increase in the number of running tasks. Here are some use cases:
- Some task A is waiting for the result of another task B. The user can request a temporary increase in the task limit so that task A does not block the execution of task B because of `max_async_tasks`. See the example in the tests with `max_async_tasks = 1`.
- I have a real-world scenario where very long-running tasks sleep for a very long time and need to keep their scope, so I increase the maximum number of running tasks while they sleep.
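The deadlock the first use case describes can be sketched with a plain `asyncio.Semaphore` standing in for the worker's task limit (a hypothetical illustration, not taskiq's actual code; `run_with_limit`, `task_a`, and `task_b` are made-up names):

```python
import asyncio


async def run_with_limit(sem: asyncio.Semaphore, coro):
    # Stand-in for a worker that caps concurrently running tasks,
    # like max_async_tasks does.
    async with sem:
        return await coro


async def main():
    sem = asyncio.Semaphore(1)  # like max_async_tasks = 1
    b_done = asyncio.Event()

    async def task_b():
        b_done.set()
        return "B result"

    async def task_a():
        # Give the slot back while waiting for B, mimicking the
        # proposed "temporary increase" of the limit. Without this,
        # B can never start and the worker deadlocks.
        sem.release()
        try:
            await b_done.wait()
        finally:
            await sem.acquire()
        return "A result"

    a = asyncio.create_task(run_with_limit(sem, task_a()))
    b = asyncio.create_task(run_with_limit(sem, task_b()))
    return await asyncio.gather(a, b)


print(asyncio.run(main()))  # -> ['A result', 'B result']
```

Commenting out the `sem.release()` line reproduces the hang: A holds the only slot while waiting for B, and B waits forever for a slot.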
Codecov Report
Merging #128 (a460da4) into develop (e66f3aa) will increase coverage by 4.98%. The diff coverage is 82.40%.
```diff
@@            Coverage Diff             @@
##           develop     #128      +/-   ##
===========================================
+ Coverage    67.62%   72.60%   +4.98%
===========================================
  Files           37       43       +6
  Lines          942     1387     +445
===========================================
+ Hits           637     1007     +370
- Misses         305      380      +75
```
| Impacted Files | Coverage Δ |
|---|---|
| taskiq/abc/middleware.py | 100.00% <ø> (ø) |
| taskiq/abc/result_backend.py | 100.00% <ø> (ø) |
| taskiq/cli/scheduler/run.py | 0.00% <ø> (ø) |
| taskiq/cli/watcher.py | 0.00% <ø> (ø) |
| taskiq/cli/worker/log_collector.py | 100.00% <ø> (ø) |
| taskiq/cli/worker/process_manager.py | 0.00% <0.00%> (ø) |
| taskiq/cli/worker/run.py | 0.00% <0.00%> (ø) |
| taskiq/events.py | 100.00% <ø> (ø) |
| taskiq/receiver/params_parser.py | 96.87% <ø> (ø) |
| taskiq/result_backends/dummy.py | 100.00% <ø> (ø) |
| ... and 24 more | |
... and 1 file with indirect coverage changes
@s3rius Are there plans to have this merged, or implemented differently? We have a use case as well that triggers this, which we noticed once we started using the `max_async_tasks` option.
We will work around it for now by separating the tasks into two different workers.
There are plans to implement it differently; there have been some attempts.
The main idea of the other approach is to provide not just a sleep function, but the ability to decrease the running-task counter for the duration of a function's execution.
I've created an issue for that problem here: #258.
The problem with this approach is that in most cases we don't call sleep directly, but await something else.
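The "decrease the counter for the duration of a call" idea can be sketched as an async context manager around a plain `asyncio.Semaphore` (a hypothetical illustration under that assumption, not taskiq's actual internals; `idle`, `long_task`, and `short_task` are made-up names):

```python
import asyncio
from contextlib import asynccontextmanager


@asynccontextmanager
async def idle(sem: asyncio.Semaphore):
    # Free a worker slot: this task no longer counts toward the limit
    # while the body of the `async with` is awaited.
    sem.release()
    try:
        yield
    finally:
        # Take the slot back before the task resumes normal work.
        await sem.acquire()


async def main():
    sem = asyncio.Semaphore(1)  # like max_async_tasks = 1
    order = []

    async def long_task():
        async with sem:
            async with idle(sem):
                # Any awaited call, not just sleep: e.g. waiting on
                # another service or another task's result.
                await asyncio.sleep(0.01)
            order.append("long")

    async def short_task():
        async with sem:
            order.append("short")

    await asyncio.gather(long_task(), short_task())
    return order


print(asyncio.run(main()))  # -> ['short', 'long']
```

Because the long task releases its slot while it awaits, the short task can run to completion in the meantime despite the limit of one.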
Hello there, @s3rius! I changed the approach to dealing with idle tasks. Now you can do something like this:

```python
async def task(idle: TaskIdler = Depends()):
    async with idle():
        await smth()
```

Please take a look.
P.S.: there is an endless CI action still running; please stop it.
Hello, @s3rius. Please take a look at the new implementation.
@s3rius A friendly reminder to check this PR :D
Hello there, @s3rius! Please give it a try =)