batched-fn
🦀 Rust server plugin for deploying deep learning models with batched prediction
Hi, I'm trying to avoid writing `config = { max_batch_size: 32, };` and use `config = { max_batch_size: max_batch_size, };` instead so that I can adjust...
If I have 2 concurrent requesters which are batched together, how does this know which response to send to which requester?
Hello! Thanks for this library 😄 I'm having trouble executing async tasks from within the closure. From what I can tell, the closure cannot be an async function, and the...
Updates the requirements on [flume](https://github.com/zesterer/flume) to permit the latest version. Changelog Sourced from flume's changelog. [0.11.0] - 2023-08-16 Added WeakSender, a sender that doesn't keep the channel open Sender/Receiver::sender_count/receiver_count, a...
Is there any way to make the context of batch_fn not be static? I want to load the models after the server has been initialized with a configuration of a list...