modelmesh-serving
InferenceService must have either the storageUri or the storage.path
Describe the bug
Deploying an InferenceService whose model is already bundled in the serving image fails with the error "InferenceService must have either the storageUri or the storage.path".
To Reproduce
Steps to reproduce the behavior: apply the following InferenceService manifest:
apiVersion: "serving.kserve.io/v1beta1"
kind: "InferenceService"
metadata:
  name: "sklearn-iris-zibai"
  namespace: modelmesh-serving
  annotations:
    serving.kserve.io/deploymentMode: ModelMesh
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      image: kube-ai-registry.cn-shanghai.cr.aliyuncs.com/ai-sample/kserve-sklearn-server:v0.12.0
      command:
        - sh
        - -c
        - "python -m sklearnserver --model_name=sklearn-iris --model_dir=/models --http_port=8080"
      resources:
        limits:
          memory: 2Gi
        requests:
          memory: 100Mi
Errors in the modelmesh-controller logs:
"error":"failed to fetch CR from kubebuilder cache for predictor sklearn-iris-zibai: the InferenceService modelmesh-serving/sklearn-iris-zibai must have either the storageUri or the storage.path"
The image already contains the model data, so there should be no need to pull it from external storage again.
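For comparison, the controller's check passes once the predictor carries a storage reference. The sketch below adds a placeholder storageUri purely to show what the validation expects; the URI is hypothetical and not part of this setup:

apiVersion: "serving.kserve.io/v1beta1"
kind: "InferenceService"
metadata:
  name: "sklearn-iris-zibai"
  namespace: modelmesh-serving
  annotations:
    serving.kserve.io/deploymentMode: ModelMesh
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      # Hypothetical URI added only to satisfy the storageUri/storage.path check;
      # the model itself is already baked into the image under /models.
      storageUri: "s3://example-bucket/sklearn-iris"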
Expected behavior
No errors; the InferenceService should deploy and serve the model bundled in the image.