
[Bug]: Unable to use oco with ollama running mistral

Open jagadish-k opened this issue 1 year ago • 13 comments

Opencommit Version

3.0.11

Node Version

18.15.0

NPM Version

9.5.0

What OS are you seeing the problem on?

Mac

What happened?

I am unable to use opencommit to generate a commit message for my staged files using a locally running Ollama.

I get the following error:

 ✖ local model issues. details: connect ECONNREFUSED ::1:11434

Expected Behavior

I expect opencommit to work with a locally running Ollama.

Current Behavior

I am running Ollama with Mistral in one terminal (ollama run mistral). In the other terminal, where I have staged files, I was able to curl it:

curl http://127.0.0.1:11434
Ollama is running%
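
For comparison (a probe added here for illustration): ::1 is the IPv6 loopback address from the error above, and since Ollama binds only the IPv4 loopback by default, hitting it directly would be expected to fail with a connection-refused error:

# -g passes the IPv6 bracket literal through without URL globbing
curl -g 'http://[::1]:11434'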

This is the config I have:

OCO_OPENAI_API_KEY=undefined
OCO_TOKENS_MAX_INPUT=undefined
OCO_TOKENS_MAX_OUTPUT=undefined
OCO_OPENAI_BASE_PATH=undefined
OCO_DESCRIPTION=false
OCO_EMOJI=false
OCO_MODEL=gpt-3.5-turbo-16k
OCO_LANGUAGE=en
OCO_MESSAGE_TEMPLATE_PLACEHOLDER=$msg
OCO_PROMPT_MODULE=conventional-commit
OCO_AI_PROVIDER=ollama

Possible Solution

No response

Steps to Reproduce

No response

Relevant log output

> OCO_AI_PROVIDER='ollama' opencommit
┌  open-commit
│
◇  30 staged files:
...
◇  📝 Commit message generated
│
└  ✖ local model issues. details: connect ECONNREFUSED ::1:11434

jagadish-k avatar Mar 14 '24 04:03 jagadish-k

@jaroslaw-weber hi man, could you take a look please?

di-sukharev avatar Mar 15 '24 07:03 di-sukharev

@jagadish-k could you try another OCO_MODEL config? Right now it's set to gpt-3.5-turbo-16k, which is not mistral.

di-sukharev avatar Mar 18 '24 04:03 di-sukharev

Same for me, even with OCO_MODEL set to mistral:

❯ OCO_AI_PROVIDER="ollama" OCO_MODEL=mistral opencommit
┌  open-commit
│
◇  2 staged files:
  .zshrc
  Makefile
│
◇  📝 Commit message generated
│
└  ✖ local model issues. details: connect ECONNREFUSED ::1:11434

SebastienElet avatar Mar 27 '24 10:03 SebastienElet

Hi there, I am also getting this same error. I think the docs should add more details on setting up an Ollama model with open-commit.

I have this bare-minimum config, and Ollama is running fine on port 11434 on localhost:

OCO_AI_PROVIDER='ollama'
OCO_DESCRIPTION=false
OCO_EMOJI=false

Abir-Tx avatar Apr 09 '24 09:04 Abir-Tx

Guys, I have found a fix for this. oco is requesting the Ollama API on the IPv6 loopback address (::1), and by default Ollama does not listen on IPv6.

So to fix the issue you can run Ollama on IPv6 too, or, as in my case, configure Ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' env variable.
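
For example, a minimal sketch assuming Ollama is started from a shell with ollama serve (if Ollama runs as the macOS desktop app instead, the variable has to reach that process some other way):

# Make Ollama listen on all interfaces instead of only the IPv4 loopback,
# then start the server in this shell:
export OLLAMA_HOST='0.0.0.0'
ollama serve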

@di-sukharev I can enhance the documentation by adding a guide for setting this up; if you like, I will submit a PR.

I hope this helps, thanks!

Abir-Tx avatar Apr 09 '24 10:04 Abir-Tx

@Abir-Tx please do enhance the documentation when you have time for a PR and thank you for the help ❤️

di-sukharev avatar Aug 18 '24 11:08 di-sukharev

> Guys I have found a fix for this. [...] I have configured ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' env variable. (quoting @Abir-Tx)

I'm currently trying to get opencommit to run in a GitHub Actions workflow; where would I set this variable?

Paficent avatar Aug 22 '24 14:08 Paficent
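
A hedged sketch for the workflow question above (not from the thread): in GitHub Actions, one way to expose a variable to later steps of a job is appending to the $GITHUB_ENV file from a run step; whether this actually reaches Ollama depends on how Ollama is started in that workflow.

# Inside a `run:` step, before the step that launches Ollama (assumed setup):
echo "OLLAMA_HOST=0.0.0.0" >> "$GITHUB_ENV"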

> Guys I have found a fix for this. [...] I have configured ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' env variable. (quoting @Abir-Tx)

I added export OLLAMA_HOST='0.0.0.0', then closed Ollama, re-opened it, and it worked.

victorbiga avatar Aug 26 '24 19:08 victorbiga
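
A related note for macOS (where this issue was reported): if Ollama runs as the desktop app rather than via ollama serve in a shell, an exported shell variable may not reach it; Ollama's FAQ suggests setting it with launchctl and then restarting the app.

# Set the variable for launchd-managed apps on macOS, then restart Ollama:
launchctl setenv OLLAMA_HOST "0.0.0.0"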

@victorbiga @Abir-Tx thank you guys, do you want to make a PR for the fix?

di-sukharev avatar Aug 27 '24 07:08 di-sukharev

> @victorbiga @Abir-Tx thank you guys, do you want to make a PR for the fix? (quoting @di-sukharev)

Yeah, I would love to. I will try to submit a PR with enhanced documentation as soon as I get some time. Thank you!

Abir-Tx avatar Aug 27 '24 11:08 Abir-Tx

@Abir-Tx nice, let me know if you need any help ❤️

di-sukharev avatar Sep 06 '24 11:09 di-sukharev

Thank you @di-sukharev. Give me some time; I will submit the PR soon. I'm a bit busy right now.

Abir-Tx avatar Sep 08 '24 08:09 Abir-Tx

Problem: I am unable to use opencommit to generate a commit message for my staged files using a locally running Ollama.

BUG: ✖ Failed to generate the commit message Error: Ollama provider error: Invalid URL

Opencommit Version: 3.2.2
Node Version: 20.14.0
NPM Version: 10.8.3
Local Machine OS Version: Windows 11 (23H2)
Ollama Model: mistral

SOLUTION (in a terminal):

1. ollama run mistral
2. git add
3. oco config set OCO_AI_PROVIDER='ollama' OCO_MODEL='mistral'
4. oco config set OCO_API_URL=http://localhost:11434/api/chat
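
To confirm the settings took effect, the global config can be inspected directly (assuming the default location; opencommit stores it in ~/.opencommit):

# Print the persisted opencommit configuration
cat ~/.opencommit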

KNIGHTCORE47 avatar Sep 22 '24 17:09 KNIGHTCORE47