
ollama models

Open martinjaco opened this issue 1 year ago • 3 comments

How can I use Ollama models instead of OpenAI models?

martinjaco avatar Jun 07 '24 22:06 martinjaco

As far as I know, most of the open-source models served by Ollama do not support tool usage. But if you find one that does, you can modify the OpenAI API configuration as described in this doc.

boyuZh avatar Jun 12 '24 01:06 boyuZh

The agent functionality relies heavily on LLM capabilities such as function calling. If you want to try Ollama, check the model's function-calling support first; most of these models are not good enough.
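A quick way to check is to send a tool-enabled request to Ollama's OpenAI-compatible endpoint and see whether the model actually emits a tool call. This is a minimal sketch, assuming the openai Python package (v1+) and a local Ollama server with llama3.1 pulled; the get_stock_price tool is a made-up example, not part of FinRobot:

from openai import OpenAI

# Point the OpenAI client at the local Ollama server's OpenAI-compatible API.
client = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="ollama")

# A toy tool definition, used only to probe whether the model emits tool calls.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_stock_price",
            "description": "Get the latest price for a ticker symbol",
            "parameters": {
                "type": "object",
                "properties": {"ticker": {"type": "string"}},
                "required": ["ticker"],
            },
        },
    }
]

resp = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "What is the price of AAPL?"}],
    tools=tools,
)

# Models with usable function calling should return tool_calls here
# instead of answering in plain text.
print(resp.choices[0].message.tool_calls)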

zhengr avatar Jun 26 '24 04:06 zhengr

There are a few models that support tool usage on Ollama now, such as Llama 3.1, Llama 3.2, Mixtral, and Command-R. The list can be found here.

You can replace the OpenAI config snippet with the following to use Llama models:

config_list = [
    {
        # Model name as it appears in `ollama list`, e.g. after `ollama pull llama3.1`
        "model": "llama3.1",
        # Ollama's OpenAI-compatible endpoint on the default local port
        "base_url": "http://127.0.0.1:11434/v1",
        # Ollama ignores the key, but the field must be non-empty
        "api_key": "ollama",
    }
]

llm_config = {"config_list": config_list, "timeout": 120, "temperature": 0}
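For context, FinRobot's agents are built on AutoGen, so the same llm_config can be passed straight to an agent. A rough sketch, assuming the pyautogen package is installed; the agent names and the message are just placeholders:

import autogen

# The assistant uses the Ollama-backed llm_config defined above.
assistant = autogen.AssistantAgent(
    name="finrobot_assistant",
    llm_config=llm_config,
)

# A user proxy that neither asks for human input nor executes code.
user_proxy = autogen.UserProxyAgent(
    name="user",
    human_input_mode="NEVER",
    code_execution_config=False,
)

user_proxy.initiate_chat(
    assistant,
    message="Summarize the latest quarterly results for AAPL.",
)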

I'll echo what @zhengr wrote, though. In my limited experience, the OpenAI models were much smoother. Specifically in the forecaster, the Llama and Mixtral models did not run successfully on a consistent basis.

amiles2233 avatar Oct 22 '24 12:10 amiles2233