Florian Dewes
I am not sure tbh, I just wanted to play around with it. The LocalAI docs explicitly mention llama.cpp embeddings (here: https://localai.io/features/embeddings/), and the code below creates embeddings with the phi...
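Since the original code snippet is cut off, here is a minimal sketch of what a request against LocalAI's OpenAI-compatible `/embeddings` endpoint can look like. The base URL, port, and model name are assumptions, not taken from the thread; adjust them to whatever your LocalAI instance and model config expose.

```python
import json
import urllib.request

# Assumed defaults -- LocalAI typically listens on port 8080,
# and the model name must match one configured on the server.
BASE_URL = "http://localhost:8080"
MODEL_NAME = "bert-embeddings"  # hypothetical; use your configured model

payload = {
    "model": MODEL_NAME,
    "input": "A sentence to embed",
}
req = urllib.request.Request(
    f"{BASE_URL}/embeddings",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# With a running LocalAI server, you would send the request like this:
# response = urllib.request.urlopen(req)
# vector = json.loads(response.read())["data"][0]["embedding"]
```

The response follows the OpenAI embeddings schema, so the vector sits under `data[0].embedding`.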
Ah nice, thank you :) I will try some of the linked models for embeddings. Both links were unknown to me until now. I have one more question concerning...
You can store them in a local directory and then mount it in the container like below (mnt/llm/models is the local directory). However, you'll need config files for that as...
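The mount from the original message is truncated, so here is a hedged sketch of what such a `docker-compose` volume mapping could look like; the service name, image tag, and container-side path (`/models` is a common convention in the LocalAI docs) are assumptions, and the host path mirrors the one mentioned above.

```yaml
# Sketch only -- adapt image, ports, and paths to your setup.
services:
  localai:
    image: localai/localai:latest
    ports:
      - "8080:8080"
    volumes:
      # host directory with model files and their YAML configs
      - ./mnt/llm/models:/models
```

Each model in the mounted directory then needs a matching YAML config file so LocalAI knows which backend to load it with.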
Hmm, I tried to reproduce your code above to get the "sentence-t5-large" embeddings. However, I am getting the following error: ``` InternalServerError: Error code: 500 - {'error': {'code': 500, 'message':...