
DSPy: The framework for programming—not prompting—language models

Results: 691 dspy issues, sorted by recently updated

# Description Fixes #1373, but a discussion might need to be held with the authors to clarify their expected dataflow. # Content Fixes a typo, the extraction of `metadatas`, and try...

# Description While following the tutorial [[02] Multi-Hop Question Answering](https://dspy-docs.vercel.app/docs/tutorials/simplified-baleen), adapted to my system (that is to say, an LM hosted and served by a local Ollama server and a...

Enable MIPRO prompt optimization while using only labeled demos. Here, bootstrapped demos are used solely in the meta-prompt to enhance the final prompt. Before fix: - max_bootstrapped_demos = 0...

# Description While following the tutorial [`[01] RAG: Retrieval-Augmented Generation`](https://dspy-docs.vercel.app/docs/tutorials/rag), adapted to my system (that is to say, an LM hosted and served by a local Ollama server and a...

Hi DSPy team, I'm trying to understand how MIPRO works and have a few questions. 1. Are the optimized instructions kept in `best_program.trial_logs`? In my experiments with GPT-3.5, the optimized instructions...

This fixes an issue where `backtrack_handler()` clears `dsp.settings.trace`, which is also used by optimizers such as `BootstrapFewShot`. Also applies some ruff linting to `tests/predict/test_retry.py`. Fixes #1356

I run the official code example in `intro.ipynb`:

```python
import dspy

lm = dspy.LM(model='openai/default', api_key=" ", api_base=" ", temperature=0.9, max_tokens=3000)
colbertv2_wiki17_abstracts = dspy.ColBERTv2(url='http://20.102.90.50:2017/wiki17_abstracts')
dspy.settings.configure(lm=lm, rm=colbertv2_wiki17_abstracts)

from dspy.datasets import HotPotQA
# Load...
```

Hey dspy team, I am trying to use the Langfuse tracer to trace my usage and, most importantly, capture cost. The place where I am struggling at the moment is getting...

When using ProgramOfThought and code is generated with named conditionals like this:

```python
Previous Code:
from datetime import datetime

data = {
    "ExpenseReport": "002",
    "ExpenseAmount": "USD30.0",
    "AuthorizationLevel": "94",
    "ExpenseDate": "2024-01-17",
    ...
```

Uses the "LITELLM_LOCAL_MODEL_COST_MAP" environment variable to turn off an API call made when LiteLLM is imported, and sets 'litellm.telemetry = False' as well. https://docs.litellm.ai/docs/completion/token_usage#9-register_model https://github.com/BerriAI/litellm/blob/main/litellm/__init__.py
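The two settings above can be sketched as a small startup snippet. This is a hedged illustration only: the variable name is taken from the linked litellm source, and the litellm lines are left commented out so the snippet stays self-contained.

```python
import os

# Assumption (per the linked litellm source): this env var must be set
# before litellm is imported, so that litellm reads its bundled
# model-cost map instead of fetching one over the network at import time.
os.environ["LITELLM_LOCAL_MODEL_COST_MAP"] = "True"

# Then, immediately after importing litellm:
# import litellm
# litellm.telemetry = False   # disable the telemetry call as well
```

The ordering is the point of the design: anything litellm does at import time can only be influenced by state that exists before the `import litellm` statement runs.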