[Bug]: aembedding parameter "api_base" is not working when using model with "ollama"
What happened?
litellm and ollama run in different Docker containers. When running the sample code below, the `api_base` parameter is ignored and the call fails with: `litellm.exceptions.APIConnectionError: Cannot connect to host localhost:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]`
```python
import asyncio
#from litellm import embedding
from litellm import aembedding

async def test_gen_embedding():
    return await aembedding(
        model="ollama/nomic-embed-text:v1.5",
        input=["test litellm ollama nomic-embed-text v1.5"],
        api_base="http://ollama:11434",
        aembedding=True
    )

if __name__ == "__main__":
    #resp = embedding(
    #    model="ollama/nomic-embed-text:v1.5",
    #    input=["test litellm ollama nomic-embed-text v1.5"],
    #    api_base="http://ollama:11434",
    #    aembedding=True
    #)
    resp = asyncio.run(test_gen_embedding())
    print(resp)
    print("-- done --")
```
It seems `ollama.ollama_aembeddings(...)` is called without passing the `api_base` parameter; see `litellm/main.py:2782`:

```python
if aembedding == True:
    response = ollama.ollama_aembeddings(
        model=model,
        prompt=ollama_input,
        encoding=encoding,
        logging_obj=logging,
        optional_params=optional_params,
        model_response=EmbeddingResponse(),
    )
```
Relevant log output
```
Traceback (most recent call last):
  File "/tmp/share/test-litellm-2/test_litellm_embed.py", line 24, in <module>
    resp = asyncio.run(test_gen_embedding())
  File "/usr/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
  File "/usr/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "/usr/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
  File "/tmp/share/test-litellm-2/test_litellm_embed.py", line 9, in test_gen_embedding
    return await aembedding(
  File "/tmp/share/test-litellm-2/lib/python3.11/site-packages/litellm/utils.py", line 3238, in wrapper_async
    raise e
  File "/tmp/share/test-litellm-2/lib/python3.11/site-packages/litellm/utils.py", line 3069, in wrapper_async
    result = await original_function(*args, **kwargs)
  File "/tmp/share/test-litellm-2/lib/python3.11/site-packages/litellm/main.py", line 2430, in aembedding
    raise exception_type(
  File "/tmp/share/test-litellm-2/lib/python3.11/site-packages/litellm/utils.py", line 8265, in exception_type
    raise e
  File "/tmp/share/test-litellm-2/lib/python3.11/site-packages/litellm/utils.py", line 8240, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: Cannot connect to host localhost:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]
```
Twitter / LinkedIn details
No response
This should be fixed now after #2675.
@krrishdholakia Seems like this issue is already fixed and ready to be closed.