kserve
What is the best way to do multi-model serving on local storage?
I ran KServe successfully with my custom PyTorch model (a custom TorchServe image) and local storage (PVC).
Now I am trying to run multi-model serving in the same environment, but I found that the current ModelMesh only supports S3-based storage.
- Is there currently any way to use ModelMesh with local storage?
- If not, what is the best way to do multi-model serving on local storage (without ModelMesh, e.g. by using many InferenceServices)?
These are my settings:
- Container: custom container based on torchserve-kfs:0.5.3-gpu
- Storage: PVC
- Model: MAR file from my custom pytorch model
- KServe==0.8.0
I need your help! Thanks!
For this I'd recommend setting up local storage that is S3-compatible, like MinIO 👍🏼
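A minimal sketch of that approach: run MinIO inside the cluster with the existing model PVC mounted as its data directory, so the MAR files become reachable over an S3 API without copying them off local storage. All names here (`minio-models`, `model-pvc`, the credentials) are illustrative placeholders, and in a real setup the credentials should come from a Secret. The `serving.kserve.io/s3-endpoint` and `serving.kserve.io/s3-usehttps` annotations are the documented way to point KServe's S3 credential Secret at a custom endpoint.

```yaml
# Illustrative only: expose the existing PVC through MinIO so ModelMesh /
# KServe can consume the models via an S3 URI (e.g. s3://<bucket>/<model>).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: minio-models
spec:
  replicas: 1
  selector:
    matchLabels: {app: minio-models}
  template:
    metadata:
      labels: {app: minio-models}
    spec:
      containers:
      - name: minio
        image: minio/minio
        args: ["server", "/data"]
        env:
        - name: MINIO_ROOT_USER
          value: minioadmin        # placeholder; use a Secret in practice
        - name: MINIO_ROOT_PASSWORD
          value: minioadmin        # placeholder; use a Secret in practice
        ports:
        - containerPort: 9000
        volumeMounts:
        - name: models
          mountPath: /data         # the PVC already holding the MAR files
      volumes:
      - name: models
        persistentVolumeClaim:
          claimName: model-pvc     # placeholder PVC name
---
apiVersion: v1
kind: Service
metadata:
  name: minio-models
spec:
  selector: {app: minio-models}
  ports:
  - port: 9000
---
# S3 credentials Secret for KServe, pointing at the in-cluster MinIO endpoint.
apiVersion: v1
kind: Secret
metadata:
  name: minio-s3-secret
  annotations:
    serving.kserve.io/s3-endpoint: minio-models.default.svc.cluster.local:9000
    serving.kserve.io/s3-usehttps: "0"
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: minioadmin       # must match the MinIO credentials above
  AWS_SECRET_ACCESS_KEY: minioadmin
```

Attach the Secret to the service account used by the predictor, and the existing MAR files served by MinIO can then be referenced with an `s3://` `storageUri`, which is the storage scheme ModelMesh expects.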