seldon-core
An MLOps framework to package, deploy, monitor and manage thousands of production machine learning models
Follow-up to [Slack channel question](https://seldondev.slack.com/archives/C03DQFTFXMX/p1659621260141289) with @adriangonz. As part of a project, I have a set of interconnected nodes (each node is a Triton server), in which each...
## Describe the bug
We built a custom inference image that can be deployed successfully using SeldonDeployment. Unfortunately, the image did not pass our security check:
> ** DISPUTED **...
## Describe the bug
The `httpPort` and `grpcPort` settings in `.spec.predictors[].graph.endpoint` do not take effect.
## To reproduce
```yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  name: sklearn-iris
spec:
  predictors:
  - name: default...
```
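One way to check whether the configured ports actually took effect is to read back the ports on the Service that the operator creates for the graph node. A minimal sketch using the Kubernetes Python client; the namespace and Service name below are hypothetical placeholders for whatever was created in your cluster.

```python
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in-cluster
v1 = client.CoreV1Api()

NAMESPACE = "default"                        # hypothetical namespace of the SeldonDeployment
SERVICE = "sklearn-iris-default-classifier"  # hypothetical Service created for the graph node

svc = v1.read_namespaced_service(SERVICE, NAMESPACE)
for port in svc.spec.ports:
    # If httpPort/grpcPort under .spec.predictors[].graph.endpoint were honoured,
    # these values would match what was set in the manifest.
    print(port.name, port.port, port.target_port)
```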
**version** Seldon Core 1.13.1
**questions** When deploying a SeldonDeployment on Kubernetes:
1. How do I set a timeout period, rather than having it stay in the Creating state for a long time?
2. Undo...
## Describe the bug
As described in the [Custom pre-processors with the V2 protocol](https://docs.seldon.io/projects/seldon-core/en/latest/examples/transformers-v2-protocol.html) notebook, the model is adapted from the [Pretrained GPT2 Model Deployment Example](https://docs.seldon.io/projects/seldon-core/en/latest/examples/triton_gpt2_example.html) notebook. However, I tried to...
> describe: after updating the model name and redeploying, the old svc is not deleted.

Seldon Core version 1.13.1

### First deployment, model name is v1
```
kubectl apply -f -
```
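One way to confirm the leftover Service after renaming the model and re-applying is to list the Services in the deployment's namespace and look for the old name. A minimal sketch using the Kubernetes Python client; the namespace is a placeholder.

```python
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

NAMESPACE = "default"  # hypothetical namespace of the SeldonDeployment
for svc in v1.list_namespaced_service(NAMESPACE).items:
    # After renaming the model and re-applying, the Service created for the old
    # name would normally be garbage-collected; with this issue it lingers.
    print(svc.metadata.name)
```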
## Describe the bug
In prepackaged multi-model Triton servers, only one of the deployed models' V2 metadata endpoints is accessible under `v2/models/${MODEL_NAME}`. This is because...
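A quick way to see the symptom is to query the V2 metadata endpoint for each deployed model and compare the status codes. A minimal sketch, assuming the deployment is reachable through an ingress at a hypothetical host and that it uses the usual Seldon URL prefix; the model names are placeholders for whatever is packaged in the repository.

```python
import requests

INGRESS_HOST = "http://localhost:8003"            # hypothetical ingress address
PREFIX = "/seldon/seldon/multi-model-triton"      # hypothetical /seldon/<namespace>/<deployment> prefix

for model_name in ["model-a", "model-b"]:         # hypothetical models in the Triton repository
    resp = requests.get(f"{INGRESS_HOST}{PREFIX}/v2/models/{model_name}")
    # With the behaviour described above, only one of these returns 200;
    # the others come back as not found even though Triton has loaded them.
    print(model_name, resp.status_code)
```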
## Describe the bug
The [Triton version policy](https://github.com/triton-inference-server/server/blob/main/docs/model_configuration.md#version-policy) is not implemented for access through Seldon Core in Seldon prepackaged servers. The [V2 protocol](https://kserve.github.io/website/modelserving/inference_api/) defines the endpoint `POST v2/models/${MODEL_NAME}[/versions/${MODEL_VERSION}]/infer`; however, it seems...
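For reference, a V2 inference call with an explicit version segment looks roughly like the sketch below; the endpoint, model name, version, tensor name, shape, and datatype are all hypothetical, and the request is sent straight to a Triton/V2 endpoint rather than through the Seldon engine.

```python
import requests

BASE = "http://localhost:8000"        # hypothetical V2-compatible endpoint
MODEL_NAME, MODEL_VERSION = "simple", "2"

payload = {
    "inputs": [
        {
            "name": "INPUT0",          # hypothetical input tensor
            "shape": [1, 4],
            "datatype": "FP32",
            "data": [0.1, 0.2, 0.3, 0.4],
        }
    ]
}
# The /versions/${MODEL_VERSION} segment is the part that is not currently
# reachable when routing through Seldon prepackaged servers.
resp = requests.post(
    f"{BASE}/v2/models/{MODEL_NAME}/versions/{MODEL_VERSION}/infer",
    json=payload,
)
print(resp.status_code, resp.json())
```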
Implementation of Triton's model management endpoints. This is partially supported by bypassing the engine, as in https://github.com/SeldonIO/seldon-core/pull/4216#issuecomment-1192918478. Native support through the Seldon engine would be helpful.
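The endpoints in question are Triton's model repository extension. Calling them directly on the Triton container (i.e. bypassing the engine, as in the linked comment) looks roughly like this sketch; the host and model name are placeholders, and load/unload only work when Triton runs in explicit model-control mode.

```python
import requests

TRITON = "http://localhost:8000"  # hypothetical direct address of the Triton container
MODEL = "my-model"                # hypothetical model in the repository

# List what the model repository currently contains.
index = requests.post(f"{TRITON}/v2/repository/index")
print(index.json())

# Explicitly load and unload a model (requires --model-control-mode=explicit).
requests.post(f"{TRITON}/v2/repository/models/{MODEL}/load").raise_for_status()
requests.post(f"{TRITON}/v2/repository/models/{MODEL}/unload").raise_for_status()
```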
Hello, I have been trying to configure certificates for an SSL endpoint over gRPC transport, following the documentation at https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_client.html. However, it appears the linked tutorial (https://istio.io/latest/docs/tasks/traffic-management/ingress/secure-ingress/) has been updated...
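In case it helps others hitting the same docs gap, the gRPC side of this (independent of the Istio secure-ingress setup) is a standard TLS channel. A minimal sketch, with the host, port, and CA certificate path as placeholder assumptions:

```python
import grpc

HOST, PORT = "istio-ingressgateway.example.com", 443  # hypothetical TLS ingress
CA_CERT_PATH = "certs/ca.crt"                          # hypothetical CA used to sign the gateway cert

with open(CA_CERT_PATH, "rb") as f:
    creds = grpc.ssl_channel_credentials(root_certificates=f.read())

channel = grpc.secure_channel(f"{HOST}:{PORT}", creds)
# The channel can then be handed to the generated prediction stubs
# (e.g. the Seldon protobuf stubs) to call Predict over TLS.
```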