phidata
Need enhancement: handle LLM function-calling responses embedded in markdown
Scenario: function calling
Steps:
- run cookbook under "cookbook\llms\ollama\tools\app.py" using "streamlit run app.py"
- select llama3 model
- ask a question about the stock price of some company, like APPLE or GOOGLE
Problem: Sometimes the LLM responds with raw JSON text, and sometimes it embeds the response in a markdown code block, like
{
....
}
or
{
.....
}
Neither case is handled correctly by the current code.
So, is it possible or necessary to enhance the cookbook code to handle such cases? Thanks.
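One way to tolerate both response shapes is to strip an optional markdown fence before parsing. A minimal sketch (not the cookbook's actual code; `parse_tool_call` and the payload shape are illustrative):

```python
import json
import re

def parse_tool_call(text: str) -> dict:
    """Parse a tool-call payload that may be raw JSON or JSON wrapped
    in a markdown code fence (``` or ```json)."""
    # If the payload is fenced, keep only what is inside the fence.
    fenced = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    if fenced:
        text = fenced.group(1)
    return json.loads(text)

# Both shapes described above parse to the same dict:
raw = '{"name": "get_stock_price", "arguments": {"symbol": "AAPL"}}'
fenced = "```json\n" + raw + "\n```"
assert parse_tool_call(raw) == parse_tool_call(fenced)
```

This only normalizes the wrapper; if the model emits prose around the JSON as well, a stricter extraction would be needed.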
yup working on this :)
I have the same problem. I think it is because one parameter should be chosen: tool_choice <Union[str, Dict[str, Any]]>. The documentation says that if a tool is defined, it defaults to "auto", which means the model can choose between answering with a plain message or using a tool. So, under this condition, the documentation suggests this parameter should be specified explicitly, but the expected pattern is described vaguely, which makes it hard to write correct code.
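For reference, in OpenAI-style chat APIs `tool_choice` accepts either a string mode or a dict that forces one specific function. A sketch of the two shapes (the function name `get_stock_price` is illustrative, not from the cookbook):

```python
# String mode: the model decides between a plain answer and a tool call.
# This is the default whenever tools are defined.
tool_choice_auto = "auto"

# Dict mode: force the model to call one named function.
tool_choice_forced = {
    "type": "function",
    "function": {"name": "get_stock_price"},
}
```

Forcing a specific function removes the ambiguity, but it also prevents the model from answering directly when no tool is needed.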
@ju1987yetchung @1WorldCapture
Please try our new ollama agent class:
phidata\cookbook\providers\ollama
"""Run `pip install yfinance` to install dependencies."""
from phi.agent import Agent
from phi.model.ollama import Ollama
from phi.tools.yfinance import YFinanceTools
agent = Agent(
    model=Ollama(id="llama3.2"),
    tools=[YFinanceTools(stock_price=True, analyst_recommendations=True, stock_fundamentals=True)],
    show_tool_calls=True,
    description="You are an investment analyst that researches stocks and helps users make informed decisions.",
    instructions=["Use tables to display data where possible."],
    markdown=True,
)
# agent.print_response("Share the NVDA stock price and analyst recommendations", stream=True)
agent.print_response("Summarize fundamentals for TSLA", stream=True)
Let us know how it works!