
Twinny tries ollama URL for oobabooga embeddings

Open allo- opened this issue 1 year ago • 2 comments

Describe the bug
I configured oobabooga as the embedding provider:

  • Host: 127.0.0.1
  • Port: 5000
  • Path: /v1/embeddings
  • Model name: all-mpnet-base-v2

When I now select the provider and click "Embed workspace documents", vscode still requests http://0.0.0.0:11434/api/embed (the ollama default).

To Reproduce
Try to use oobabooga as the embedding provider.

Expected behavior
The configured provider, URL, port, and path should be used.

API Provider: Oobabooga

Chat or Auto Complete? Embedding

Model Name: all-mpnet-base-v2

Desktop (please complete the following information):

  • OS: Linux

Additional context
Chat works as expected with the oobabooga (chat) provider.
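
For reference, the configured endpoint can be exercised directly to confirm the server side is healthy. A minimal sketch using the values reported above (oobabooga's OpenAI-compatible /v1/embeddings route), runnable with Node 18+ via npx tsx:

    // Send one embedding request straight to the configured oobabooga endpoint.
    // Host, port, path, and model name are taken from the report above.
    const res = await fetch("http://127.0.0.1:5000/v1/embeddings", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "all-mpnet-base-v2", input: "hello world" }),
    });
    console.log(res.status, JSON.stringify(await res.json()).slice(0, 200));

If this succeeds while twinny's "Embed workspace documents" still hits port 11434, the problem is in the extension's provider resolution, not the server.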

allo- · Oct 01 '24 14:10

Hey, thanks for the report. However, I'm sorry but I cannot replicate this.

rjmacarthy · Oct 03 '24 18:10

I can test next week more. Anything I should start with?

What I did: I configured the embedding provider and then opened the panel and pressed the button for embedding the workspace. It didn't do anything and only showed a notification that didn't go away. Then I saw an error in the vscode developer tools (electron console), patched a console.log for the URLs into the js file, and saw that it seems to use the ollama port. By listening there with netcat I was able to verify that it indeed tries port 11434 instead of the configured port.
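
The netcat check can also be done with a few lines of TypeScript, which additionally dumps the request body. This is a hypothetical stand-in for the check described above, not part of twinny:

    import http from "node:http";

    // Listen on the ollama default port and log whatever arrives there.
    // If the bug triggers, a POST to /api/embed shows up here instead of
    // reaching the configured oobabooga endpoint.
    http
      .createServer((req, res) => {
        console.log(req.method, req.url);
        req.pipe(process.stdout); // dump the request body
        res.statusCode = 500;
        res.end();
      })
      .listen(11434, () => console.log("listening on 11434"));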

I use ooba for chat and it works fine and I do not have a FIM model configured yet.

allo- · Oct 03 '24 18:10

Thanks for the detailed response. Please could you let me know what version you are using?

rjmacarthy · Oct 09 '24 14:10

twinny-3.17.20-linux-x64

allo- · Oct 09 '24 14:10

Had the same issue embedding with twinny <-> LM Studio. In the end I forked the repo, commented out lines 157-160 in src/extension/provider-manager.ts, and used a local extension build - now document ingestion and embedding requests run perfectly with LM Studio. I'm still not sure it works in chat, though... @rjmacarthy - please note that the commented-out code overwrites the user-defined path.
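
To illustrate the shape of the problem being described (purely illustrative, not the actual code at those lines), hard-coded defaults applied after reading the user's settings would produce exactly this symptom:

    // Illustrative only - NOT the real src/extension/provider-manager.ts.
    type Provider = { apiHost: string; apiPort: number; apiPath: string };

    function getEmbeddingProvider(configured: Provider): Provider {
      const provider = { ...configured };
      // If the workaround above is right, something equivalent to this
      // was clobbering the user-defined values:
      // provider.apiHost = "0.0.0.0";
      // provider.apiPort = 11434;
      // provider.apiPath = "/api/embed";
      return provider;
    }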

vkx86 · Oct 09 '24 15:10

The question is whether the default settings shouldn't be overridden anyway when I configure my own provider.

allo- · Oct 09 '24 15:10

Hey, sorry about this bug. I thought I'd removed that code in a previous version. I've just released v3.17.24, which should address it.

Many thanks,

rjmacarthy · Oct 10 '24 07:10

I still have the problem with v3.17.24.

allo- · Oct 10 '24 12:10

I think the problem may start somewhere else. When I select the embedding provider after clicking the cylinder icon, it switches back to blank after a few seconds. It neither tries to reach the ooba port nor the ollama port.

I think trying to embed without a selected provider then defaults to ollama.

Under what conditions is the selection of the embedding provider changed? Are there some checks that fail for the configured provider? I wonder what may be checked, because there is no http request that could fail between selecting the ooba provider and twinny switching back to the empty item in the select box.
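
If that hypothesis is right, a guard of this shape (again illustrative, not the actual twinny source) would fail loudly instead of silently falling back to the ollama defaults:

    // Illustrative guard: refuse to embed when no provider is selected.
    type Provider = { apiHost: string; apiPort: number; apiPath: string };

    function resolveEmbeddingProvider(selected: Provider | undefined): Provider {
      if (!selected) {
        throw new Error(
          "No embedding provider selected; not falling back to ollama defaults"
        );
      }
      return selected;
    }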

allo- · Oct 18 '24 12:10

Is the fix in 3.21.18? I can test tomorrow.

allo- · Jan 27 '25 18:01