`build`: more formats to export your model
MLEM could be a powerful tool if you need to distribute your model through different channels and use it in different circumstances (or easily switch between them). This could be part of our value proposition. So I think MLEM should be able to convert/build/export an MLEM model to any widely used format and back, including:
- [x] Python package
- [x] Docker image
- [ ] MLflow format
- [ ] Seldon-compatible format
- [ ] Pickle
- [ ] .pt
- [ ] .h5
- [ ] Apache TVM
- [ ] ONNX
- [ ] NVIDIA TensorRT
- [ ] Explore integration with Ivy to export to ML frameworks other than the one the model was trained with
This list will be updated. Please feel free to post a comment, or upvote an existing one, if you need something we don't support yet :)
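For the ONNX item, here is a minimal sketch of what such a builder could wrap under the hood. This is hypothetical and not part of MLEM today; it assumes a scikit-learn model and the `skl2onnx` converter package:

```python
# Hypothetical sketch of an ONNX export path, not part of MLEM today.
# Assumes scikit-learn and the skl2onnx converter are installed.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from skl2onnx import to_onnx

X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=10).fit(X, y)

# skl2onnx infers the ONNX input signature from a sample batch.
onx = to_onnx(model, X[:1].astype(np.float32))
with open("model.onnx", "wb") as f:
    f.write(onx.SerializeToString())
```

An MLEM builder for this would presumably dispatch to the appropriate converter (skl2onnx, torch.onnx.export, tf2onnx, etc.) based on the framework recorded in the saved model's metadata.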
Would love https://onnx.ai/
And https://developer.nvidia.com/tensorrt :)
BentoML-compatible format? Alternatively, it might be good to contribute an MLEM runner to BentoML?
The last comment is related to https://github.com/iterative/mlem/issues/265