k8sgpt-operator
[Bug]: Result CR not showing any details under Specs.Details section
Checklist
- [X] I've searched for similar issues and couldn't find anything matching
- [X] I've included steps to reproduce the behavior
Affected Components
- [ ] K8sGPT (CLI)
- [X] K8sGPT Operator
K8sGPT Version
v0.3.29
Kubernetes Version
v1.29.3+rke2r1
Host OS and its Version
SLES15 SP 4
Steps to reproduce
- Deploy K8sGPT following the README.md
- Verify the scan result using "kubectl describe result" (see the sketch below)
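For reference, a minimal verification sketch (the Result name "jarvis" and the namespace "k8sgpt" are taken from the output further below, not from the README):

# List the Result CRs created by the operator, then inspect one of them.
kubectl get results -n k8sgpt
kubectl describe result jarvis -n k8sgpt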
Manifest
apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-sample
  namespace: k8sgpt
spec:
  ai:
    enabled: true
    model: gpt-3.5-turbo
    backend: openai
    secret:
      name: opemai-secret
      key: openai-api-key
    # anonymized: false
    # language: english
  noCache: false
  repository: ghcr.io/k8sgpt-ai/k8sgpt
  version: v0.3.29
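The manifest references an API-key Secret by name; the report does not show how that Secret was created, but a minimal sketch (assuming a plain Opaque Secret whose name and key match the manifest) would be:

# Assumed: the secret name and key must match the manifest above ("opemai-secret" / "openai-api-key").
kubectl create secret generic opemai-secret -n k8sgpt \
  --from-literal=openai-api-key="$OPENAI_API_KEY"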
Expected behaviour
The Spec.Details section should have been populated with details queried from the OpenAI backend.
Actual behaviour
The Spec.Details section is empty.
Output from kubectl describe result jarvis -n k8sgpt
Name:         jarvis
Namespace:    k8sgpt
Labels:       k8sgpts.k8sgpt.ai/backend=openai
              k8sgpts.k8sgpt.ai/name=k8sgpt-sample
              k8sgpts.k8sgpt.ai/namespace=k8sgpt
Annotations:  <none>
API Version:  core.k8sgpt.ai/v1alpha1
Kind:         Result
Metadata:
  Creation Timestamp:  2024-04-03T12:49:47Z
  Generation:          1
  Resource Version:    3369563
  UID:                 5345ff82-5e7b-4874-9fbe-218e8c6e1762
Spec:
  Backend:  openai
  Details:
  Error:
    Sensitive:
      Masked:    KlkvdlVI
      Unmasked:  jarvis
    Text:        jarvis has condition of type EtcdIsVoter, reason MemberNotLearner: Node is a voting member of the etcd cluster
  Kind:           Node
  Name:           jarvis
  Parent Object:
Status:
  Lifecycle:  historical
Events:       <none>
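To confirm the field is empty without reading the full describe output, spec.details can be queried directly (an added convenience command, not part of the original report):

# Prints only spec.details of the Result CR; empty output means no details were stored.
kubectl get result jarvis -n k8sgpt -o jsonpath='{.spec.details}'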
Additional Information
Logs from the k8sgpt pod:
{"level":"info","ts":1712149239.235297,"caller":"server/log.go:50","msg":"request completed","duration_ms":1009,"method":"/schema.v1.ServerService/Analyze","request":"backend:"openai" anonymize:true language:"english" max_concurrency:10 output:"json"","remote_addr":"...:59408"} {"level":"info","ts":1712149275.4206443,"caller":"server/log.go:50","msg":"request completed","duration_ms":1007,"method":"/schema.v1.ServerService/Analyze","request":"backend:"openai" anonymize:true language:"english" max_concurrency:10 output:"json"","remote_addr":".**..:49692"} {"level":"info","ts":1712149311.6114714,"caller":"server/log.go:50","msg":"request completed","duration_ms":1007,"method":"/schema.v1.ServerService/Analyze","request":"backend:"openai" anonymize:true language:"english" max_concurrency:10 output:"json"","remote_addr":"..*.:59432"}
Same issue here.
Checklist
- [x] I've searched for similar issues and couldn't find anything matching
- [x] I've included steps to reproduce the behavior
Affected Components
- [ ] K8sGPT (CLI)
- [x] K8sGPT Operator
K8sGPT Version: v0.3.29
Kubernetes Version: v1.27.12+rke2r1
Host OS and its Version: Ubuntu 22.04.4
LocalAI Version: v2.11.0-aio-cpu
Description
In my Result.example.yaml, the Spec.Details field does not show any details from my LocalAI backend. Also, in the localai pod's debug log I do not see any HTTP requests while running k8sgpt:v0.3.29 with localai:v2.11.0-aio-cpu and the ggml-gpt4all-j model; both localai and k8sgpt are in the same namespace.
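One hedged way to watch for incoming requests on the LocalAI side (the workload name deploy/local-ai is an assumption based on the Service name; adjust to the actual Deployment or StatefulSet):

# Follow LocalAI logs and watch for /v1/chat/completions requests arriving from k8sgpt.
kubectl logs -n k8sgpt-operator-system deploy/local-ai -f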
Expected behavior
Spec.Details in the Result CR should have been populated with responses from the LocalAI backend.
Steps to reproduce
I deployed the k8sgpt operator via its Helm chart and applied this CR (apply and verification commands are sketched after the manifest):
apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-resource
  namespace: k8sgpt-operator-system
spec:
  ai:
    enabled: true
    backend: localai
    model: ggml-gpt4all-j_f5d8f27287d3
    baseUrl: http://local-ai.k8sgpt-operator-system.svc.cluster.local/v1
    anonymized: true
    language: english
  noCache: false
  version: v0.3.29
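A sketch of the apply and verification steps (the file name k8sgpt-resource.yaml is assumed, not given in the report):

# Apply the CR above, then check whether the operator creates any Result CRs.
kubectl apply -f k8sgpt-resource.yaml
kubectl get results -n k8sgpt-operator-system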
Actual behavior
In the Result.example.yaml CR manifest, the Spec.Details field is empty.
Additional Information
However, when I curl localai directly, it works just fine: the localai pod logs the request in debug mode, and I get a response from my localai model ggml-gpt4all-j_f5d8f27287d3:
curl http://local-ai.k8sgpt-operator-system.svc.cluster.local/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{ "model": "ggml-gpt4all-j_f5d8f27287d3", "messages": [{"role": "user", "content": "How are you doing?", "temperature": 0.1}] }'
{"created":1712108557,"object":"chat.completion","id":"c264e40f-98a4-4ff3-9acf-9bad12aebd08","model":"ggml-gpt4all-j_f5d8f27287d3","choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":"As an AI language model, I am doing well. Thank you for asking!"}}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}
My log files are attached: kube-rbac-proxy.log k8sgpt.log k8sgpt-manager.log
I just noticed this in the k8sgpt-operator-controller-manager pod logs, from the manager container:
failed to call Analyze RPC: rpc error: code = Unknown desc = failed while calling AI provider localai: error, status code: 520, message: invalid character '<' looking for beginning of value"}
Finished Reconciling k8sGPT with error: failed to call Analyze RPC: rpc error: code = Unavailable desc = error reading from server: EOF
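The "invalid character '<'" error suggests the provider returned HTML (for example an error page) instead of JSON. A hedged way to reproduce what the k8sgpt pod sees is to send the same request from a throwaway pod inside the cluster (the pod name and curl image are assumptions, not from the report):

# Temporary pod in the same namespace, hitting the chat-completions path behind the configured baseUrl.
kubectl run curl-debug --rm -it --restart=Never -n k8sgpt-operator-system \
  --image=curlimages/curl --command -- \
  curl -sv http://local-ai.k8sgpt-operator-system.svc.cluster.local/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"ggml-gpt4all-j_f5d8f27287d3","messages":[{"role":"user","content":"test"}]}'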
I have since deployed k8sgpt-operator v0.1.3, and now my Result manifests are no longer created at all.