dspy
DSPy: The framework for programming—not prompting—language models
Specifically I need this to allow setting the torch dtype so I can load models that only just fit in my VRAM. Let me know if any extra tweaks are...
Local models are often sensitive to being prompted with their own prompt template. For example, for Mistral models that would be: ``` [INST] {prompt} [/INST] ``` What's the mechanism a...
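The template above can be sketched as a small wrapper. This is a minimal illustration of applying Mistral's instruction markers to a raw prompt; the helper name `apply_mistral_template` is hypothetical, not part of DSPy.

```python
def apply_mistral_template(prompt: str) -> str:
    """Wrap a raw prompt in Mistral's [INST] ... [/INST] instruction markers."""
    return f"[INST] {prompt} [/INST]"

# Example: the model sees its expected instruction format.
wrapped = apply_mistral_template("What is DSPy?")
```

A framework-level mechanism would apply such a wrapper transparently inside the language-model client, so user programs stay template-agnostic.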
Is it possible for validation metrics to return non-boolean values? This could give much more control over the optimization strategy, e.g. loss weighting for different metrics. Another idea is...
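As a sketch of the idea, a metric with DSPy's usual `(example, pred, trace)` signature could return a weighted float score instead of a bool. The sub-metrics and weights here are hypothetical, purely to show the shape; `SimpleNamespace` stands in for real `Example`/`Prediction` objects.

```python
from types import SimpleNamespace

def weighted_metric(example, pred, trace=None):
    # Hypothetical non-boolean metric: combine an exact-match check
    # with a soft brevity score using fixed weights.
    exact = float(example.answer == pred.answer)        # 1.0 or 0.0
    brevity = 1.0 if len(pred.answer) <= 100 else 0.5   # soft penalty
    return 0.8 * exact + 0.2 * brevity

ex = SimpleNamespace(answer="Paris")
good = SimpleNamespace(answer="Paris")
score = weighted_metric(ex, good)
```

An optimizer could then rank candidate programs by average score rather than pass/fail counts.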
# Context The "chat-completion-API" where a model receives a `system prompt` and a list of `messages` assigned to different `roles` is gaining traction. OpenAI's chat API is driven by this...
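For concreteness, the chat-completion payload described above is just a system prompt plus a list of role-tagged messages, in the OpenAI-style format:

```python
# A minimal chat-completion payload: a system prompt followed by
# user (and optionally assistant) messages, each tagged with a role.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize DSPy in one sentence."},
]
roles = [m["role"] for m in messages]
```

Supporting this shape means a DSPy adapter must map a single prompt string into such a message list rather than concatenating everything into one string.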
BootstrapFewShotWithOptuna currently converts demonstration examples to dicts, which breaks line 210 in template_v2.py as `augmented` cannot be accessed as an attribute. I tried converting to dotdict at the line instead,...
Dropping the prompt from the model output is necessary to correctly retrieve an output from the Prediction object, but this is only done in HFModel when a...
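The fix being discussed amounts to stripping an echoed prompt prefix from the generated text. A minimal sketch (the helper name `drop_prompt` is hypothetical, not HFModel's actual method):

```python
def drop_prompt(prompt: str, completion: str) -> str:
    """If the model echoes the prompt back (common for raw HF causal LMs),
    return only the newly generated continuation; otherwise return the
    completion unchanged."""
    if completion.startswith(prompt):
        return completion[len(prompt):].lstrip()
    return completion

out = drop_prompt("Q: 2+2?", "Q: 2+2? A: 4")
```

Applying this unconditionally, rather than only under one code path, would make Prediction outputs consistent across decoding settings.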
- updated DSPy init with vLLM support
- added vLLM to hf_client
- added docs for local model testing with HFModel, TGI, vLLM, and MLC
- added model initialization validation for HFClientTGI within latest...
Integrates [langfuse](https://langfuse.com/) to provide observability on DSPy program executions. The newly added docs contain more information on how to execute this.
Hey, while inspecting the compiled save JSON I expected to see the compiled prompts, but I can see only "demos", and the other fields are just empty. I am confused since the...
I tried to use Program of Thought in the intro notebook after the Chain of Thought cell, but I get the following error: ``` # Define the predictor. Notice we're...