
Ollama branch name generation doesn't work

Open keysmashes opened this issue 1 year ago • 22 comments

Version

Version 0.14.4 (20241213.093904)

Operating System

macOS

Distribution Method

dmg (Apple Silicon)

Describe the issue

Hitting "Generate branch name" with the Ollama backend configured (default settings – endpoint http://127.0.0.1:11434, model llama3) results in the error "Failed to generate branch name: The string did not match the expected pattern".

image

I do not really understand the network requests I'm seeing in devtools, but there's a request to plugin:http|fetch_send with a strange-looking response:

{
    "status": 403,
    "statusText": "Forbidden",
    "headers": [
        [
            "date",
            "Fri, 20 Dec 2024 11:31:26 GMT"
        ],
        [
            "content-length",
            "0"
        ]
    ],
    "url": "http://127.0.0.1:11434/api/chat",
    "rid": 3377205724
}

I've checked that Ollama is actually installed and running:

$ http POST http://127.0.0.1:11434/api/chat model=llama3
HTTP/1.1 200 OK
Content-Length: 137
Content-Type: application/json; charset=utf-8
Date: Fri, 20 Dec 2024 11:31:59 GMT

{
    "created_at": "2024-12-20T11:31:59.41741Z",
    "done": true,
    "done_reason": "load",
    "message": {
        "content": "",
        "role": "assistant"
    },
    "model": "llama3"
}

How to reproduce

Configure GitButler to use Ollama, and try to generate a branch name.

Expected behavior

Ollama should be able to generate a branch name, just as the OpenAI backend does.

Relevant log output

No response

keysmashes avatar Dec 20 '24 11:12 keysmashes

Hi! I believe you might be running into an issue related to our app permissions. Tauri has strong sandboxing around what the frontend can access.

Could you try 0.0.0.0 or localhost? I believe we have those whitelisted.

Caleb-T-Owens avatar Dec 20 '24 11:12 Caleb-T-Owens

localhost gives the same error; 0.0.0.0 produces a different error: "url not allowed on the configured scope: http://0.0.0.0:11434/api/chat"

keysmashes avatar Dec 20 '24 12:12 keysmashes

Sorry for the trouble! I'll spin up ollama and try it out this afternoon.

Caleb-T-Owens avatar Dec 20 '24 12:12 Caleb-T-Owens

I've had a go running it, and it seems to work in development mode 😬. I'm currently on holiday and debugging differences between development and production builds does not sound super enjoyable, so I'm going to put this onto my todolist for new year.

Sorry for any inconvenience.

Caleb-T-Owens avatar Dec 21 '24 21:12 Caleb-T-Owens

+1 Experiencing the same error

ivanglushko avatar Dec 27 '24 13:12 ivanglushko

+1 same error :)

ikitozen avatar Jan 01 '25 21:01 ikitozen

error + 1

cttaizai avatar Jan 02 '25 09:01 cttaizai

Auto-closed when I merged the likely fix, so re-opening until it has been verified.

mtsgrd avatar Jan 02 '25 14:01 mtsgrd

I'll have a go now :+1:

Caleb-T-Owens avatar Jan 02 '25 14:01 Caleb-T-Owens

By "now" I mean in 30 minutes' time, when a nightly build has finished chugging away

Caleb-T-Owens avatar Jan 02 '25 15:01 Caleb-T-Owens

Screenshot 2025-01-02 at 15 55 07

I'm getting these strange "string did not match the expected pattern" errors 🤔

Caleb-T-Owens avatar Jan 02 '25 15:01 Caleb-T-Owens

I made a mistake in the previous PR: https://github.com/gitbutlerapp/gitbutler/pull/5885

mtsgrd avatar Jan 02 '25 20:01 mtsgrd

Didn't work for me

Screenshot 2025-01-05 at 18 30 35

ivanglushko avatar Jan 05 '25 16:01 ivanglushko

Can you try pointing gitbutler at localhost or 127.0.0.1?

mtsgrd avatar Jan 05 '25 16:01 mtsgrd

Screenshot 2025-01-08 at 16 59 28

Can you try pointing gitbutler at localhost or 127.0.0.1?

ivanglushko avatar Jan 08 '25 14:01 ivanglushko

Argh. Let me properly debug this later today.

mtsgrd avatar Jan 08 '25 15:01 mtsgrd

Hello! Are there any updates on the issue?

I've been experimenting with the dev build to figure out what's going wrong. I came across this thread, which might be helpful. According to the comments, the bug seems to be related to explicitly specifying a port in the endpoint URL.

I managed to work around this bug by following these steps:

  1. Add 127.0.0.1 ollama.local to /etc/hosts.
  2. Configure an Nginx server like this:
server {
    listen 80;
    server_name ollama.local;

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
  3. Add http://ollama.local to crates/gitbutler-tauri/capabilities/main.json.
  4. Update the Ollama endpoint from http://127.0.0.1:11434 to http://ollama.local.

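Before pointing GitButler at the proxy, it's worth sanity-checking the /etc/hosts entry and the Nginx config from the shell (just a sketch; /api/tags is Ollama's model-listing endpoint and is only used here as a cheap health check):

# Should resolve ollama.local via /etc/hosts, hit Nginx on port 80, and be proxied to Ollama
curl -i http://ollama.local/api/tags
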
It works now—well, about half the time. Obviously, this is not a solution, but perhaps it could help you solve the problem.

However, I occasionally encounter errors like this:

Invalid response: { "type": "object", "properties": { "result": "Update go.mod with cloud provider version" }, "required": [ "result" ], "additionalProperties": false }

It seems to me that the Ollama response isn't always correct and doesn't fully comply with the OLLAMA_CHAT_MESSAGE_FORMAT_SCHEMA: instead of returning an object with a result field at the top level, the model sometimes echoes the schema itself with the answer embedded inside it. This behavior isn't too surprising, given that LLMs tend to hallucinate, especially smaller models like llama3.2 in my case.

Is there anything you can do to address this? Maybe the prompt could be adjusted to force the model to return plain text instead of JSON?

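One thing that might help, assuming the installed Ollama is recent enough (0.5+ added structured outputs): the expected shape could be enforced by passing a JSON schema in the request's format field rather than relying on the prompt alone. A rough, untested sketch against the plain API:

# Hypothetical request: constrain the reply to an object with a single "result" string
curl -s http://127.0.0.1:11434/api/chat -d '{
  "model": "llama3.2",
  "stream": false,
  "messages": [{"role": "user", "content": "Suggest a short git branch name for: update go.mod"}],
  "format": {
    "type": "object",
    "properties": { "result": { "type": "string" } },
    "required": ["result"]
  }
}'
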
update: I’ve also managed to reproduce the error: “The string did not match the expected pattern”. This happens because of the following call, as well as inconsistencies in the Ollama output.

erryox avatar Jan 16 '25 13:01 erryox

Same issue, and the workaround above did not help. I'm not running from source, though, so I'm not sure it was expected to work. Image

soundstep avatar Mar 03 '25 10:03 soundstep

+1 Experiencing the same error

satish-gsa avatar Mar 25 '25 11:03 satish-gsa

+1 Experiencing the same error. I can get it to work in dev mode, but as soon as I do a "nightly-like" build it gives the "string did not match the expected pattern" error.

Jels02 avatar Mar 27 '25 00:03 Jels02

I found out that GitButler (Version 0.14.16 (20250408.174627)) sends a null Origin header on macOS for Ollama requests:

Image

which causes Ollama to return a 403 response:

Image

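For anyone who wants to confirm this outside the app, the difference is easy to reproduce from the shell (a rough sketch; it assumes the webview really sends the literal value null, as the screenshot suggests):

# No Origin header (like the httpie check earlier in the thread): Ollama accepts the request
curl -i http://127.0.0.1:11434/api/chat -d '{"model": "llama3"}'

# An Origin value that is not in Ollama's allow-list gets the same empty-body 403 Forbidden
# that shows up in the devtools capture above
curl -i http://127.0.0.1:11434/api/chat -H 'Origin: null' -d '{"model": "llama3"}'
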
Unfortunately, Ollama cannot be configured to allow a null origin specifically; you have to allow all origins. You can do that by setting the OLLAMA_ORIGINS environment variable to *, like so:

OLLAMA_ORIGINS="*" ollama serve

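If Ollama runs as the macOS menu-bar app rather than via ollama serve in a terminal, the variable needs to be set for the GUI app instead (this follows Ollama's documented launchctl approach; restart the Ollama app afterwards):

# Make the variable visible to GUI-launched processes, then restart the Ollama app
launchctl setenv OLLAMA_ORIGINS "*"
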
Would it be possible to send a proper Origin header for Ollama requests? These origins are allowed by default:

OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*]

mochja avatar Apr 11 '25 14:04 mochja

Thanks so much for looking into this!

This sounds like a simple adjustment, but @estib-vega would know more about it, I'm sure.

Byron avatar Apr 12 '25 13:04 Byron