eleyzhu

1 issue opened by eleyzhu

When deploying the LLM with TGI in a k8s cluster with a Pod like the one below:

apiVersion: v1
kind: Pod
metadata:
  name: text-generation-inference
  labels:
    run: text-generation-inference
spec:
  containers:
    - name: text-generation-inference
      image: ...
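
For context, a Pod spec of this shape typically continues with the TGI container image, the model to serve, and GPU/shared-memory settings. The sketch below is only an illustration of that usual continuation, not the manifest from the issue itself: the model ID, image tag, port, and resource sizes are assumptions.

# Illustrative sketch only -- not the manifest from the truncated issue above.
# Model ID, image tag, port, and resource sizes are assumptions.
apiVersion: v1
kind: Pod
metadata:
  name: text-generation-inference
  labels:
    run: text-generation-inference
spec:
  containers:
    - name: text-generation-inference
      image: ghcr.io/huggingface/text-generation-inference:latest   # official TGI image
      args:
        - "--model-id"
        - "bigscience/bloom-560m"        # hypothetical model; replace with the model actually served
        - "--port"
        - "8080"
      ports:
        - containerPort: 8080
      resources:
        limits:
          nvidia.com/gpu: 1              # assumes a GPU node with the NVIDIA device plugin installed
      volumeMounts:
        - name: shm
          mountPath: /dev/shm            # TGI benefits from a larger shared-memory segment
        - name: model-cache
          mountPath: /data               # Hugging Face cache dir inside the TGI image; avoids re-downloading weights
  volumes:
    - name: shm
      emptyDir:
        medium: Memory
        sizeLimit: 1Gi
    - name: model-cache
      emptyDir: {}

In practice a bare Pod like this is usually wrapped in a Deployment and exposed through a Service so clients can reach TGI's /generate endpoint.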