Working with HFModels locally
I am trying to use the dspy.HFModel object to load models locally, but I keep getting:
ValueError: temperature has to be a strictly positive float, but is 0.0
I tried to set the temperature while configuring the settings, as follows:
dspy.settings.configure(lm=llama, temperature=0.7)
but without any success.
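For context, here is a minimal sketch of the kind of setup that triggers this (the model name and the final call are placeholders, not my exact script):

import dspy

# load a model locally through the HF backend (model name is a placeholder)
llama = dspy.HFModel(model="meta-llama/Llama-2-7b-hf")

# note: the keyword is `temperature`
dspy.settings.configure(lm=llama, temperature=0.7)

# the ValueError about temperature=0.0 surfaces when the model is actually called
qa = dspy.Predict("question -> answer")
print(qa(question="What is the capital of France?"))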
Thank you @EngSalem. Could you try TGI instead? HFClientTGI is the main way to use local models at the moment.
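For anyone finding this later, wiring up HFClientTGI looks roughly like this (assuming a TGI server is already running; the model name, URL, and port below are example values):

import dspy

# connect DSPy to a running text-generation-inference server
# (model name, url, and port are example values)
llama = dspy.HFClientTGI(model="meta-llama/Llama-2-13b-hf", port=8080, url="http://localhost")
dspy.settings.configure(lm=llama)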
Is there a way to instantiate TGI servers with a quantized version? The current llama-13b instance throws errors whenever I try to define a new signature module.
Honestly I'd love to be able to support that. We also have VLLM support.
I don't know how to work with quantized models but if we figure it out, it would be great for us and everyone :D
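One untested direction for the quantization question: TGI's launcher exposes a --quantize flag, so quantization can happen on the server side while DSPy connects through HFClientTGI as usual. The docker command, flag value, and model name below are assumptions to illustrate the idea, not a verified recipe:

# Untested sketch: launch TGI with a quantized model on the server side, e.g.
#
#   docker run --gpus all --shm-size 1g -p 8080:80 \
#       ghcr.io/huggingface/text-generation-inference:latest \
#       --model-id meta-llama/Llama-2-13b-hf --quantize bitsandbytes
#
# then point DSPy at the server exactly as with an unquantized model.
import dspy

llama_q = dspy.HFClientTGI(model="meta-llama/Llama-2-13b-hf", port=8080, url="http://localhost")
dspy.settings.configure(lm=llama_q)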
Thank you! I will see if I can find a workaround, and if I find anything I will update this thread.
Hi, I have the same problem. Did you manage to solve it?
@IcyFeather233 It's fixed in the DSPy version that's in main, but there hasn't been a release yet, so you could clone and install from the repo. It may be enough to disable sampling; this is the fix that has been merged for it:
if 'temperature' in kwargs and kwargs['temperature'] == 0.0:
    kwargs['do_sample'] = False
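Until a release is out, a manual workaround in the same spirit might be to disable sampling on the instance yourself (unverified sketch, assuming HFModel keeps its generation settings in a kwargs dict that gets forwarded to generation):

import dspy

llama = dspy.HFModel(model="meta-llama/Llama-2-7b-hf")  # placeholder model name

# mirror the merged fix by hand: with the default temperature of 0.0,
# turning sampling off avoids the strictly-positive-temperature check
llama.kwargs["do_sample"] = False

dspy.settings.configure(lm=llama)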
Thanks! I used pip install -Ue . to install from the repo and that solved it!