feat: add support for Ollama backend & bump golang to 1.22
Closes #1064
📑 Description
Ollama can make it easier for users to interact with K8sGPT. This PR adds an Ollama backend built on the Ollama API. Because the Ollama API requires Go v1.22, Go is upgraded to v1.22.
Usage:

```sh
# ./bin/k8sgpt auth add -b ollama -m llama3 -u http://localhost:11434
ollama added to the AI backend provider list

# ./bin/k8sgpt analyze --explain -b ollama
AI Provider: ollama

0: Service haha/dao-2048()
- Error: Service has not ready endpoints, pods: [Pod/dao-2048-69696bf664-kd997], expected 1
Error: Service has not ready endpoints, pods: [Pod/dao-2048-69696bf664-kd997], expected 1.
Solution:
1. Check the pod's status using `kubectl get pod <pod_name> -o yaml`.
2. Verify if the container is running and its logs are showing any errors.
3. If the container is not running, try restarting it with `kubectl exec <pod_name> -- restart`.
4. If the issue persists, check the service's configuration to ensure it's correctly pointing to the pod's port.
```
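For context, the Ollama REST API the backend talks to is a plain HTTP endpoint. Below is a minimal sketch in Go (standard library only) of building a request body for Ollama's `POST /api/generate` endpoint; the struct and function names are hypothetical, not the PR's actual code:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// generateRequest mirrors the basic fields of Ollama's POST /api/generate
// body (model, prompt, stream). Struct names here are illustrative only.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

// buildGenerateBody marshals a non-streaming generate request.
func buildGenerateBody(model, prompt string) ([]byte, error) {
	return json.Marshal(generateRequest{Model: model, Prompt: prompt, Stream: false})
}

func main() {
	body, err := buildGenerateBody("llama3", "Explain this Kubernetes error")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))

	// Actually sending the request needs a running Ollama server, e.g.:
	//   resp, err := http.Post("http://localhost:11434/api/generate",
	//       "application/json", bytes.NewReader(body))
	_ = http.Post       // referenced only so this sketch compiles as-is
	_ = bytes.NewReader // without a live server
}
```

This is only a sketch of the wire format; the PR itself was later switched to the official Ollama Go client (see below in the thread).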
Next TODOs:
- [ ] add docs at: https://github.com/k8sgpt-ai/docs/pull/100
- [ ] add the feature to https://github.com/k8sgpt-ai/k8sgpt-operator
✅ Checks
- [x] My pull request adheres to the code style of this project
- [x] My code requires changes to the documentation
- [ ] I have updated the documentation as required
- [x] All the tests have passed
ℹ Additional Information
@yankay this looks similar to localai's backend, which utilizes openai's API 🤔
Hi @arbreezy
Like OpenAI and Azure OpenAI, they are similar but different projects: https://hyscaler.com/insights/ollama-vs-localai-open-source-local-llm-apis/
So it needs to be implemented as 2 providers.
ref: LocalAI: https://localai.io/ Ollama: https://github.com/ollama/ollama
What do you think about that? :-)
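The "two providers" argument is easiest to see against the backend-registry pattern: each backend satisfies a common interface and registers under its own name (as passed to `-b`). A minimal sketch with hypothetical names, not k8sgpt's actual interface:

```go
package main

import (
	"errors"
	"fmt"
)

// Provider is a hypothetical stand-in for k8sgpt's AI backend interface.
type Provider interface {
	GetName() string
	GetCompletion(prompt string) (string, error)
}

type ollamaProvider struct{}

func (o *ollamaProvider) GetName() string { return "ollama" }
func (o *ollamaProvider) GetCompletion(prompt string) (string, error) {
	return "", errors.New("not implemented in this sketch")
}

type localAIProvider struct{}

func (l *localAIProvider) GetName() string { return "localai" }
func (l *localAIProvider) GetCompletion(prompt string) (string, error) {
	return "", errors.New("not implemented in this sketch")
}

// registry maps backend names to implementations.
var registry = map[string]Provider{}

func register(p Provider) { registry[p.GetName()] = p }

func main() {
	// Although both serve local models, they are distinct entries:
	// LocalAI reuses the OpenAI wire format, Ollama has its own API.
	register(&ollamaProvider{})
	register(&localAIProvider{})
	for name := range registry {
		fmt.Println(name)
	}
}
```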
Azure OpenAI is slightly different, but I get your argument.
I don't have a strong opinion on adding another file for Ollama identical to localai's; ideally we would have a generic 'local' backend which supports the OpenAI APIs.
Any thoughts on that @AlexsJones @matthisholleville ?
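The appeal of a generic 'local' backend is that any OpenAI-compatible server differs only in its base URL, while the request path stays the same; Ollama's native API uses different paths, which is why it would still need its own provider. A small illustrative sketch (names are hypothetical):

```go
package main

import (
	"fmt"
	"net/url"
)

// openAICompatibleEndpoint shows why one generic "local" backend could
// cover every OpenAI-compatible server: only the base URL changes, the
// chat-completions path stays the same.
func openAICompatibleEndpoint(baseURL string) (string, error) {
	u, err := url.Parse(baseURL)
	if err != nil {
		return "", err
	}
	u.Path = "/v1/chat/completions"
	return u.String(), nil
}

func main() {
	for _, base := range []string{
		"https://api.openai.com", // hosted OpenAI
		"http://localhost:8080",  // LocalAI, OpenAI-compatible
	} {
		ep, _ := openAICompatibleEndpoint(base)
		fmt.Println(ep)
	}
	// Ollama's native API instead exposes paths like /api/generate and
	// /api/chat, so it does not fit the same generic backend.
}
```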
Thanks @arbreezy
Ollama has an official Go client: https://github.com/ollama/ollama/blob/main/api/client.go
If the maintainers agree, I can change the code to use it :-)
@yankay I think this makes more sense, any thoughts on that @AlexsJones @matthisholleville ?
I agree, thanks
Thanks @AlexsJones @arbreezy
It has been changed to use the official Ollama Go client: https://github.com/ollama/ollama/blob/main/api/client.go
Would you please help review it? :-)
Thanks @AlexsJones @JuHyung-Son for the PR review :-)