
Predictors with Ollama return input prompt in answer

lambdaofgod opened this issue 1 year ago · 2 comments

DSPy versions I checked: 2.4.0 and 2.4.12
Ollama version: 0.1.9

I use a model served with Ollama (I tried both "chat" and "text" model types):

import dspy

model = dspy.OllamaLocal(model='aya', model_type="chat")

When I call the model directly, it returns just the generated text:

model("what is two plus two?")
> ['The answer is four.']

But when I use it through Predict, the prompt is prepended to the answer:

generator = dspy.Predict("question -> answer")
generator(question="What is two plus two?").answer
>'Question: What is two plus two?\nAnswer: Four.'

Is this a bug, or is there an option I can tweak to prevent it? I used Ollama models a month ago and this wasn't an issue.
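As a stopgap while the underlying issue is open, the echoed "Question: ... Answer:" scaffold can be stripped in post-processing. This is just a sketch, not something from the thread; `clean_answer` is a hypothetical helper name:

```python
import re

def clean_answer(raw: str) -> str:
    """Strip an echoed 'Question: ... Answer:' scaffold from a completion.

    Keeps only the text after the last 'Answer:' label; if no label is
    present, returns the input unchanged (minus surrounding whitespace).
    """
    # Split on the 'Answer:' label (case-insensitive) and keep the tail.
    parts = re.split(r"Answer:\s*", raw, flags=re.IGNORECASE)
    return parts[-1].strip()

print(clean_answer("Question: What is two plus two?\nAnswer: Four."))  # Four.
print(clean_answer("Answer: Four."))  # Four.
print(clean_answer("Four."))  # Four.
```

This only papers over the symptom in both failure modes reported here (full prompt echoed, or just the "Answer:" label); it does not fix the parsing in DSPy itself.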

lambdaofgod · Jul 22 '24

Use dspy.configure(experimental=True) for chat LMs

okhat · Jul 23 '24

That didn't help. By the way, I tried it with mistral and the output is wrong in a different way:

Prediction(
    answer='Answer: Four.'
)

I also tried both "text" and "chat"; the results are the same.

This looks like an error in output parsing. Which classes handle that?

lambdaofgod · Jul 23 '24

Probably resolved now if you migrate to 2.5: https://github.com/stanfordnlp/dspy/blob/main/examples/migration.ipynb
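For reference, a minimal 2.5-style setup looks roughly like this. This is a sketch under assumptions, not verbatim from the migration notebook; the model name `aya` and the default Ollama port are carried over from the report above:

```python
import dspy

# In DSPy 2.5, local Ollama models go through the unified dspy.LM client;
# the "ollama_chat/" prefix routes the request via LiteLLM.
lm = dspy.LM("ollama_chat/aya", api_base="http://localhost:11434", api_key="")
dspy.configure(lm=lm)

generator = dspy.Predict("question -> answer")
# generator(question="What is two plus two?").answer should now contain
# only the answer text, without the echoed prompt.
```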

okhat · Sep 25 '24