Ashpreet
@balavenkatesh-ai how about: `base_url='https://.databricks.com/serving-endpoints/msbr-llama-2/invocations',` or `base_url='https://.databricks.com/serving-endpoints/msbr-llama-2/served-models/llama-2/invocations',`? We haven't really tested with a Databricks LLM, so this is a first for us. If this doesn't work, @ysolanky we should create a new...
@balavenkatesh-ai if the different variations of the base_url don't work, we'll just have to create a Databricks LLM class, which could take a week or two
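In case it helps to test the endpoint shape directly before we build a dedicated class, here's a minimal stdlib-only sketch of hitting a Databricks serving endpoint's invocation URL. The workspace host `my-workspace.databricks.com`, the `DATABRICKS_TOKEN` env var, and the payload fields are assumptions for illustration, not tested against your workspace:

```python
import json
import os
import urllib.request

# Hypothetical workspace host and endpoint name -- substitute your own.
WORKSPACE = "my-workspace.databricks.com"
ENDPOINT = "msbr-llama-2"


def invocation_url(workspace: str, endpoint: str) -> str:
    """Build the serving-endpoint invocation URL discussed above."""
    return f"https://{workspace}/serving-endpoints/{endpoint}/invocations"


def query_endpoint(prompt: str) -> dict:
    """POST a prompt to the endpoint (needs a valid bearer token)."""
    body = json.dumps({"prompt": prompt, "max_tokens": 128}).encode()
    req = urllib.request.Request(
        invocation_url(WORKSPACE, ENDPOINT),
        data=body,
        headers={
            "Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If a plain POST like this returns 200, the same URL should work as a `base_url` for an OpenAI-compatible client.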
@Pythonllmcoder is Ollama running? It's required to generate the embeddings. The error is because the code couldn't generate the embeddings. If `ollama pull llama3` or `ollama pull nomic-embed-text` doesn't...
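As a quick sanity check that Ollama is up and the embedding model is pulled, here's a small stdlib sketch against Ollama's local `/api/embeddings` endpoint. Port 11434 is Ollama's default; the model name matches the `ollama pull nomic-embed-text` command above:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"


def embeddings_request(model: str, text: str) -> urllib.request.Request:
    """Build a POST request for Ollama's embeddings endpoint."""
    body = json.dumps({"model": model, "prompt": text}).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/embeddings",
        data=body,
        headers={"Content-Type": "application/json"},
    )


def get_embedding(model: str = "nomic-embed-text", text: str = "hello") -> list:
    """Call Ollama; this fails if the server isn't running
    or the model hasn't been pulled -- the same failure mode
    that surfaces as the embedding error in the issue."""
    with urllib.request.urlopen(embeddings_request(model, text)) as resp:
        return json.load(resp)["embedding"]
```

If `get_embedding()` raises a connection error, Ollama isn't running; if it returns an Ollama error about an unknown model, the pull didn't complete.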
@Joycoooi @Pythonllmcoder is this fixed on your side?
@Joycoooi @Pythonllmcoder closing if okay?
please reopen if this persists
@Kyros5 ^
@Kyros5 is this fixed?
@Kyros5 for now, can you try `TavilyTools(api_key=)` just to test that it works this way?
@Kyros5 shall we close this?