Flowise
[BUG] Ollama answer in the chat cannot be parsed
Describe the bug
I am using the latest Ollama with tinyllama,
with a very simple chatflow in Flowise:
and this is the result in the test chat:
But when I run Ollama from the command line and ask the same question, everything works fine:
To Reproduce
Steps to reproduce the behavior:
Set up the flow and try to chat.
Expected behavior
At a minimum, the same recipe for scrambled eggs :) The same answer as the command-line one.
Screenshots
Applied above.
Flow
Applied above.
Setup
- Docker setup
- Flowise version: 1.6.0
- OS: Linux Leap 15.4
- Browser: Chrome
- Ollama version: 0.1.29
Click Additional Parameters and remove the system message, prompt, etc.; keep only the user input and response, nothing else.
Does that fix it?
Running from the command line is different from using a Conversational Agent. The Conversational Agent has a built-in prompt that asks the LLM to return data in JSON format, and it is tailored to ChatGPT. For other models, try using an LLMChain instead.
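To illustrate why the parsing fails: a minimal sketch comparing the plain prompt you type in the CLI with the kind of JSON-forcing wrapper a conversational agent injects. The wrapper text below is an illustrative assumption, not Flowise's actual agent prompt; the payload shape targets Ollama's `/api/generate` endpoint. A small model like tinyllama often ignores the JSON instruction, so the agent cannot parse its reply, while the plain prompt works.

```python
import json

# Plain question, exactly what you would type into the Ollama CLI.
PLAIN_PROMPT = "Give me a recipe for scrambled eggs."

# Hypothetical agent wrapper (an assumption, not copied from Flowise source):
# conversational agents typically demand a structured JSON reply they can parse.
AGENT_WRAPPER = (
    "Respond to the human as helpfully as possible. "
    'You must always respond with a JSON blob like {{"action": ..., "action_input": ...}}.\n'
    "Question: {question}"
)

def build_ollama_payload(prompt: str, model: str = "tinyllama") -> dict:
    """Build a request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

plain = build_ollama_payload(PLAIN_PROMPT)
wrapped = build_ollama_payload(AGENT_WRAPPER.format(question=PLAIN_PROMPT))

# The CLI sends the bare question; the agent sends the question buried in
# JSON-format instructions that small models frequently fail to follow.
print(json.dumps(plain, indent=2))
print(json.dumps(wrapped, indent=2))
```

An LLMChain sends something close to the plain payload, which is why switching to it resolves the parsing error.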
Thank you, it works now!