llama-stack
llama3.2 gmailtoolkit problem
System Info
llama = ChatOpenAI(api_key="ollama", model="llama3.2:latest", base_url="http://127.0.0.1:11434/v1")
I found a bug when calling the GmailToolkit module with the Ollama model llama3.2. In the model's tool_calls message, the 'to' argument should be a list, i.e. 'to': ['[email protected]'], but the model instead returned it as a JSON-encoded string: 'to': '["[email protected]"]'.
I don't know whether this is a model problem or a toolkit problem, so I'm only reporting what I observed.
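Since the report does not include the full script, here is a minimal sketch of the setup I believe triggers this. It assumes the langchain_openai and langchain_google_community packages (older LangChain versions expose GmailToolkit from langchain_community.agent_toolkits) plus working Gmail credentials; the prompt, tool wiring, and address are illustrative guesses, not the reporter's exact code.

```python
from langchain_openai import ChatOpenAI
from langchain_google_community import GmailToolkit  # assumed import path

# Same Ollama-backed model as in the report
llama = ChatOpenAI(
    api_key="ollama",
    model="llama3.2:latest",
    base_url="http://127.0.0.1:11434/v1",
)

# Assumes credentials.json / token.json are already set up for the Gmail API
toolkit = GmailToolkit()
llm_with_tools = llama.bind_tools(toolkit.get_tools())

# Hypothetical prompt that asks the model to call the send-message tool
response = llm_with_tools.invoke("Send an email to [email protected] saying hello")

for call in response.tool_calls:
    print(call["name"], call["args"])
    # Observed (per the report):  {'to': '["[email protected]"]', ...}  <- JSON string
    # Expected:                   {'to': ['[email protected]'], ...}    <- list of strings
```

If the model really is emitting a stringified list, that would point at how llama3.2 formats tool arguments rather than at GmailToolkit itself, which is the ambiguity described above.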
Information
- [ ] The official example scripts
- [ ] My own modified scripts
🐛 Describe the bug
As above.
Error logs
As above.
Expected behavior
As above.
llama = ChatOpenAI(api_key="ollama", model="llama3.2:latest", base_url="http://127.0.0.1:11434/v1")
This seems to be using LangChain. Could you share the detailed reproduction steps?
This does not seem like a llama-stack issue; please feel free to open a new one if you continue to see this.