
DSPy: The framework for programming—not prompting—language models

Results: 691 dspy issues, sorted by recently updated

- EnsembledProgram can now be loaded/saved
- ~~Inference of programs (forward function) is now concurrent~~
- Added checks for size = 0 or size > number of programs
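For reference, a minimal usage sketch of the surface this PR touches, assuming the existing `dspy.teleprompt.Ensemble` API; the `save` call is an assumption based on the PR summary and `dspy.Module`'s save/load methods, not a confirmed interface:

```python
import dspy
from dspy.teleprompt import Ensemble

# Compile several candidate programs into one EnsembledProgram, then
# persist it. candidate_programs is a placeholder list of dspy.Module
# instances; the save path and file format are assumptions.
teleprompter = Ensemble(reduce_fn=dspy.majority, size=3)
ensembled = teleprompter.compile(candidate_programs)
ensembled.save("ensembled_program.json")
```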

I don't believe this is currently possible, but I'm unsure given the new backend refactor. I'd like to be able to handcraft and "freeze" a prompt template and have DSPy fill in...
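For context, the closest thing DSPy offers today is a class-based signature whose instruction docstring is handcrafted; the request goes further, freezing the entire rendered template. A minimal sketch of the existing behavior (standard signature API; the field names are illustrative):

```python
import dspy

class FrozenQA(dspy.Signature):
    """Handcrafted instruction text that an optimizer should leave verbatim."""
    question = dspy.InputField(desc="the user's question")
    answer = dspy.OutputField(desc="a short factual answer")

# DSPy still controls field formatting and demo rendering here; the
# feature request is to freeze the whole prompt, not just this docstring.
qa = dspy.Predict(FrozenQA)
```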

Maintain backward compatibility with v3 client. Fix #787

```python
from pydantic import BaseModel, Field

class Input(BaseModel):
    speech: str = Field(description="speech text")

class Output(BaseModel):
    extracted_words: list[str] = Field(description="a list of extracted words from speech")
    label: int = Field(ge=0, le=2, description="a...
```
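A hedged sketch of how these models would typically be wired in, assuming the `dspy.TypedPredictor` API from v2.4 (the signature class and field names below are illustrative, not from the report):

```python
import dspy

class ExtractWords(dspy.Signature):
    """Extract words and a label from a speech transcript."""
    speech_input: Input = dspy.InputField()
    result: Output = dspy.OutputField()

# TypedPredictor parses and validates the LM output against the
# pydantic Output model defined above.
predictor = dspy.TypedPredictor(ExtractWords)
prediction = predictor(speech_input=Input(speech="..."))
```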

https://pypi.org/project/dspy-ai/2.4.5/ does not match or include the latest changes in the project. It also does not match the v2.4.3 release here: https://github.com/stanfordnlp/dspy/releases/tag/v2.4.3

Hopefully the new LM backend will allow us to make better use of models that are trained for "Chat". Below is a good example of how even good models like...

It seems like if you touch a file, committing requires you to fix all of the ruff issues in that file, regardless of whether your changes are related to those lines...

Crashes with a ValueError in predict.py ("Required 'max_tokens' or 'max_output_tokens' not specified in settings.") when using Bedrock with Llama 2 or Llama 3.
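A hedged workaround sketch, not a confirmed fix: pass the token limit explicitly when constructing the LM so predict.py finds it in the settings. The wrapper classes and the `max_tokens` kwarg below are assumptions drawn from the error message and the v2.4 AWS model wrappers:

```python
import dspy

# Assumed v2.4 surface: a Bedrock provider plus the AWSMeta wrapper for
# Llama models; passing max_tokens up front should avoid the ValueError
# in predict.py. The region and model id are placeholders.
bedrock = dspy.Bedrock(region_name="us-east-1")
lm = dspy.AWSMeta(aws_provider=bedrock, model="meta.llama3-8b-instruct-v1:0", max_tokens=1024)
dspy.settings.configure(lm=lm)
```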

Tested with:

```python
import dspy
from llama_cpp import Llama

# Load a local GGUF model via llama-cpp-python.
llm = Llama.from_pretrained(
    repo_id="TheBloke/OpenHermes-2.5-Mistral-7B-GGUF",
    filename="openhermes-2.5-mistral-7b.Q4_K_M.gguf",
    n_ctx=4096,
    n_gpu_layers=10,
    verbose=True,
)

# Wrap it in dspy's LlamaCpp client and set it as the default LM.
llamalm = dspy.LlamaCpp(model="llama", llama_model=llm)
dspy.settings.configure(lm=llamalm)

def summarize_document(document):
    summarize = dspy.ChainOfThought('document -> summary')
    response =...
```
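The snippet is cut off mid-assignment; a hedged completion based on standard `ChainOfThought` usage (the reporter's actual remaining lines are unknown):

```python
# A Prediction's fields mirror the signature, so 'document -> summary'
# yields response.summary.
def summarize_document(document):
    summarize = dspy.ChainOfThought('document -> summary')
    response = summarize(document=document)
    return response.summary
```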