
Loading Ollama models fails to select a default server

Open delfireinoso opened this issue 9 months ago • 2 comments

macOS 15.3, Transformer Lab updated, running on the same machine where the models are located.

When we load a local Ollama model in Foundation, no server is selected by default, resulting in an error when running the model.

The Ollama model works fine in the Ollama macOS app, and with OpenWebUI or Perplexica running in Docker on the same machine.

Manually selecting the Ollama server gives normal behaviour.
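For anyone hitting this, one way to rule out a connectivity problem before selecting the server manually is to confirm Ollama is actually reachable on its default local port. A minimal sketch (assuming Ollama's standard endpoint `http://localhost:11434` and its `/api/tags` model-listing route; not how Transformer Lab itself detects the server):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def model_names(tags_payload: dict) -> list[str]:
    """Extract model names from an /api/tags style response payload."""
    return [m["name"] for m in tags_payload.get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Query a local Ollama server for the models it has installed."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return model_names(json.load(resp))

if __name__ == "__main__":
    try:
        print("Models served by Ollama:", list_local_models())
    except OSError:
        print(f"No Ollama server reachable at {OLLAMA_URL}")
```

If this lists your models but Transformer Lab still errors until the server is picked manually, that points at the default-selection logic rather than the Ollama install.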

delfireinoso avatar Feb 26 '25 21:02 delfireinoso

Hi @delfireinoso, sorry for completely missing this issue. We will test this out and get back to you on how we can solve this one.

deep1401 avatar Apr 15 '25 18:04 deep1401

Oh hmmm I thought this issue was the same as this one: https://github.com/transformerlab/transformerlab-app/issues/329

But that is possibly not the case. I think fixing the other issue will also fix this one, though.

dadmobile avatar Apr 15 '25 23:04 dadmobile