Predictors with Ollama return input prompt in answer
DSPy versions I checked: 2.4.0 and 2.4.12. Ollama version: 0.1.9.
I use a model served with Ollama (I tried both the "chat" and "text" model types):
import dspy

model = dspy.OllamaLocal(model='aya', model_type="chat")
dspy.configure(lm=model)  # route DSPy modules through the Ollama model
When I call the model directly, it returns just the generated text:
model("what is two plus two?")
> ['The answer is four.']
But when I use Predict, the prompt is prepended to the answer:
generator = dspy.Predict("question -> answer")
generator(question="What is two plus two?").answer
>'Question: What is two plus two?\nAnswer: Four.'
Is this a bug, or is there an option I can tweak to prevent this? I used Ollama models a month ago and this wasn't an issue.
Use dspy.configure(experimental=True) for chat LMs
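For context, a minimal sketch of how that suggestion would combine with the setup above (this assumes, as with other DSPy settings, that experimental can be passed alongside lm in a single configure call):

import dspy

model = dspy.OllamaLocal(model='aya', model_type="chat")
# enable the experimental chat-LM handling in addition to setting the LM
dspy.configure(lm=model, experimental=True)

generator = dspy.Predict("question -> answer")
print(generator(question="What is two plus two?").answer)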
That didn't help. By the way, I tried it with mistral and the output is wrong in a different way:
Prediction(
answer='Answer: Four.'
)
I also tried both the "text" and "chat" model types; the results are the same.
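As a stopgap until this is fixed, one workaround is to strip the echoed field label from the completion yourself. This is a hedged sketch, not a DSPy API; clean_answer is a hypothetical helper:

def clean_answer(prediction, label="Answer:"):
    """Strip an echoed 'Answer:' label (and anything before it) from the answer."""
    text = prediction.answer
    if label in text:
        # keep only what follows the last occurrence of the label
        text = text.split(label)[-1]
    return text.strip()

generator = dspy.Predict("question -> answer")
pred = generator(question="What is two plus two?")
print(clean_answer(pred))  # 'Four.'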
That looks like an output-parsing error. Which classes handle the parsing?
Probably resolved now if you migrate to 2.5: https://github.com/stanfordnlp/dspy/blob/main/examples/migration.ipynb
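For reference, a minimal sketch of the 2.5-style setup against an Ollama-served model; in 2.5, providers are unified behind dspy.LM (backed by LiteLLM). The model name and api_base below are assumptions for a default local Ollama install; see the migration notebook for the authoritative steps:

import dspy

# 'ollama_chat/aya' assumes the same aya model is pulled locally
# and that Ollama is listening on its default port.
lm = dspy.LM('ollama_chat/aya', api_base='http://localhost:11434', api_key='')
dspy.configure(lm=lm)

generator = dspy.Predict("question -> answer")
print(generator(question="What is two plus two?").answer)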