
The easiest way to serve AI apps and models - Build reliable Inference APIs, LLM apps, Multi-model chains, RAG services, and much more!

Results: 312 BentoML issues

### Feature request The default scheduling strategy implementation schedules as many runner instances (for runners that support `nvidia.com/gpu`) as there are available GPUs. If multiple types of runners are present...

feature

### Feature request It would be great to add native support for DICOM inputs; there are many ML applications in medical imaging nowadays. Currently you would need to... (a workaround sketch follows the labels below)

enhancement
io-descriptor
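
A minimal sketch of the workaround the issue hints at, assuming the BentoML 1.x Service API and pydicom: accept a raw file upload through the generic `File` descriptor and parse it inside the endpoint. The service and endpoint names are illustrative.

```python
import bentoml
import numpy as np
import pydicom
from bentoml.io import File, NumpyNdarray

svc = bentoml.Service("dicom_demo")

@svc.api(input=File(), output=NumpyNdarray())
def pixel_data(dicom_file) -> np.ndarray:
    # Parse the uploaded DICOM file and return its pixel array;
    # a real service would feed this array into a model runner.
    ds = pydicom.dcmread(dicom_file)
    return ds.pixel_array.astype(np.float32)
```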

### Feature request External modules are currently not pickled with the model by default (a possible workaround is sketched below). ### Motivation _No response_ ### Other _No response_

feature
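
A possible workaround, sketched under the assumption that BentoML's pickle-based saving goes through cloudpickle (>= 2.0): register the helper module by value so its code travels with the model. `my_feature_utils` is a hypothetical user module.

```python
import bentoml
import cloudpickle

import my_feature_utils  # hypothetical local module the model depends on

# Embed the module's source in the pickle payload instead of referencing it
# by import path, so it does not need to be installed at load time.
cloudpickle.register_pickle_by_value(my_feature_utils)

model = my_feature_utils.build_model()  # hypothetical constructor
bentoml.picklable_model.save_model("model_with_helpers", model)
```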

Currently there isn't a way to fully support torch hub. Issues like https://github.com/bentoml/BentoML/issues/2602 often come up due to the different torch hub import implementations. *Proposal*: provide a `bentoml.torchhub` module that creates interaction... (a workaround sketch follows the labels below)

Relevant discussions in https://github.com/bentoml/BentoML/issues/666

help-wanted
enhancement
framework
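
A minimal sketch of the current workaround, assuming a recent torchvision and the existing `bentoml.pytorch` integration; the proposed `bentoml.torchhub` module does not exist yet, and the names below are illustrative.

```python
import bentoml
import torch

# Load a model definition through torch hub (untrained weights to keep the
# example light); a dedicated bentoml.torchhub API would presumably also
# record the hub repo and entrypoint used here.
model = torch.hub.load("pytorch/vision", "resnet18", weights=None)
model.eval()

# Persist it with the existing PyTorch integration for now.
tag = bentoml.pytorch.save_model("resnet18_from_hub", model)
print(tag)
```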

- Health checking (https://github.com/bentoml/BentoML/issues/2630)
- Start/stop hooks

documentation

[`darts`](https://unit8co.github.io/darts/) is a great library for time series prediction. It would be great if BentoML supported the darts library. Note: darts wraps a number of existing (time series) libraries such... (a workaround sketch follows the labels below)

feature
help-wanted
good-first-issue
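
A minimal sketch of what works today via the generic pickle-based model API, assuming a fitted darts model pickles cleanly; a native `bentoml.darts` module does not exist yet and the names are illustrative.

```python
import bentoml
import numpy as np
from darts import TimeSeries
from darts.models import ExponentialSmoothing

# Fit a simple darts model on a toy series.
series = TimeSeries.from_values(np.sin(np.linspace(0, 10, 100)))
model = ExponentialSmoothing()
model.fit(series)

# Save it with BentoML's generic pickle-based model API for now.
bentoml.picklable_model.save_model("darts_exp_smoothing", model)

# Load it back and forecast the next 12 points.
loaded = bentoml.picklable_model.load_model("darts_exp_smoothing:latest")
forecast = loaded.predict(12)
```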

Adding support for using models trained with PyTorch Ignite in BentoML:
* sample notebook showing how the integration could work
* verify that the current `bentoml.pytorch` module can adapt to... (see the sketch after the labels below)

help-wanted
good-first-issue
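
A minimal sketch of the second bullet, assuming that an Ignite-trained model is an ordinary `torch.nn.Module` and can therefore be saved with the current `bentoml.pytorch` module; data and model below are toy placeholders.

```python
import bentoml
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from ignite.engine import create_supervised_trainer

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

# Toy regression data and an Ignite training engine.
data = TensorDataset(torch.randn(64, 4), torch.randn(64, 1))
trainer = create_supervised_trainer(model, optimizer, loss_fn)
trainer.run(DataLoader(data, batch_size=16), max_epochs=3)

# The trained model is a plain nn.Module, so the existing PyTorch
# integration can persist it unchanged.
bentoml.pytorch.save_model("ignite_linear", model)
```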