batched-fn
Loading models in the runtime
Is there any way to make the context of `batched_fn!` not be static? I want to load the models after the server has been initialized with a configuration that lists which models to load, so I'm trying to do something like this:
```rust
let batched_generate = batched_fn! {
    handler = |batch: Vec<Vec<String>>, models: &HashMap<String, SentenceEmbeddingsModel>| -> Vec<Result<Vec<Vec<f32>>, RustBertError>> {
        let mut batched_result = Vec::with_capacity(batch.len());
        for input in batch {
            // Pick a model from the map; how to choose one is application-specific.
            let model = models.values().next().expect("no models loaded");
            let result = model.encode(&input);
            batched_result.push(result);
        }
        batched_result
    };
    config = {
        max_batch_size: 1,
        max_delay: 100,
        channel_cap: Some(20),
    };
    context = {
        models: &self.loaded_models,
    };
};
```
Here `self.loaded_models` is created in the struct's constructor, but it looks like the context needs to be static. Any thoughts on how to accomplish this?
Hey @diptanu, the objects defined in `context` are loaded lazily upon the first request to the batched function, so maybe you could set `models` in `context` to call a function that checks the configuration (wherever that may be) and then loads the models accordingly?
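For example, here is a minimal sketch of that approach. It assumes the server stores its list of configured model names in a global `OnceLock` during startup; `MODEL_CONFIG`, `load_model`, `load_models_from_config`, and the way a model is chosen inside the handler are all placeholders for whatever the real application does.

```rust
use std::collections::HashMap;
use std::sync::OnceLock;

use batched_fn::batched_fn;
use rust_bert::pipelines::sentence_embeddings::{
    SentenceEmbeddingsBuilder, SentenceEmbeddingsModel, SentenceEmbeddingsModelType,
};
use rust_bert::RustBertError;

// Placeholder global slot for the server configuration. The server fills this in
// once during startup, before it starts accepting requests.
static MODEL_CONFIG: OnceLock<Vec<String>> = OnceLock::new();

// Placeholder loader for a single model. A real server would map the configured
// name to a model type or local weights instead of always loading the same one.
fn load_model(_name: &str) -> SentenceEmbeddingsModel {
    SentenceEmbeddingsBuilder::remote(SentenceEmbeddingsModelType::AllMiniLmL12V2)
        .create_model()
        .expect("failed to load model")
}

// Builds the whole model map from the configuration. Because this is only called
// from the batched-fn `context`, it runs lazily on the first request, after the
// configuration has been set.
fn load_models_from_config() -> HashMap<String, SentenceEmbeddingsModel> {
    MODEL_CONFIG
        .get()
        .expect("configuration not initialized")
        .iter()
        .map(|name| (name.clone(), load_model(name)))
        .collect()
}

async fn embed(input: Vec<String>) -> Result<Vec<Vec<f32>>, RustBertError> {
    let batched_embed = batched_fn! {
        handler = |batch: Vec<Vec<String>>, models: &HashMap<String, SentenceEmbeddingsModel>| -> Vec<Result<Vec<Vec<f32>>, RustBertError>> {
            // Model selection is application-specific; for the sketch, just use
            // the first configured model for every request.
            let model = models.values().next().expect("no models configured");
            batch.iter().map(|input| model.encode(input)).collect()
        };
        config = {
            max_batch_size: 1,
            max_delay: 100,
            channel_cap: Some(20),
        };
        context = {
            // Evaluated lazily on the first call, not at program start, so the
            // configuration stored during server startup is available here.
            models: load_models_from_config(),
        };
    };
    batched_embed(input).await.expect("batched-fn channel error")
}
```

The key point is that `load_models_from_config()` inside `context` isn't evaluated until the first call to the batched function, by which time the server has already stored its configuration, so the context owns its models outright and nothing needs to be borrowed from `self`.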