llm-prompt-debugger
A developer-first UI for testing, tagging, and exporting LLM prompts, with built-in support for OpenAI, Claude, and Ollama.
About
LLM Prompt Debugger is a playground for evaluating and labeling LLM outputs.
Features:
- Prompt input + response viewing
- Model selection (OpenAI, Claude, Ollama)
- Tagging UI for prompt categorization
- JSON + Markdown export support
- Hotkey: `Cmd+Enter` or `Ctrl+Enter` to run
Getting Started
ℹ️ Requires Node.js 18+ and `pnpm`
If you don’t have pnpm installed:
npm install -g pnpm
Then clone and run the project locally:
git clone https://github.com/Cre4T3Tiv3/llm-prompt-debugger.git
cd llm-prompt-debugger
pnpm install
pnpm dev
Visit: http://localhost:3000
Lockfile Strategy
This project uses a pnpm-lock.yaml file to ensure deterministic installs across contributors and CI environments.
- Use `pnpm` to install dependencies and preserve the lockfile
- If you prefer `npm` or `yarn`, delete `pnpm-lock.yaml` before running `install`
- Officially supported: `pnpm` (fast, efficient, and CI-friendly)
Tagging System
Apply semantic and stylistic tags to each prompt-response pair.
Built-in tags:
- Semantic: `code`, `debug`, `refactor`, `summarization`, `technical`, `marketing`, `LLM`, `simulation`
- Tone: `tone:professional`, `tone:casual`, `tone:funny`, `tone:neutral`

Custom tags are supported via the input field.
Exporting
Export history to:
- JSON for programmatic analysis
- Markdown for docs or knowledge sharing
ℹ️ Markdown output is grouped by model and time-stamped
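As a rough sketch, an exported JSON record might look like the following. The exact schema is illustrative only; field names such as `model`, `provider`, `tags`, and `timestamp` are assumptions, not the tool's documented format:

```json
[
  {
    "prompt": "Summarize this changelog in two sentences.",
    "response": "…model output…",
    "model": "gpt-4o",
    "provider": "openai",
    "tags": ["summarization", "tone:professional"],
    "timestamp": "2025-01-15T10:32:00Z"
  }
]
```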
Model Support
| Provider  | Example Model     | Usage Notes               |
|-----------|-------------------|---------------------------|
| OpenAI    | `gpt-4`, `gpt-4o` | Requires `OPENAI_API_KEY` |
| Anthropic | `claude-3-opus`   | Requires `CLAUDE_API_KEY` |
| Ollama    | `llama3`          | Local model support       |
Set these API keys in `.env.local`.
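A minimal `.env.local` might look like this (placeholder values; the variable names come from the table above):

```shell
# .env.local — replace the placeholders with your own keys
OPENAI_API_KEY=sk-your-openai-key
CLAUDE_API_KEY=sk-ant-your-anthropic-key
# Ollama runs locally and needs no API key
```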
End-to-End Usage Guide
Looking to test prompts from start to finish?
See the full walkthrough for testing, tagging, exporting, and sharing prompts across supported LLM providers:
E2E-GUIDE.md
Deployment
To build and serve a production bundle:
pnpm build
pnpm start
Supports Vercel, Netlify, Docker, and self-hosting.
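For Docker-based self-hosting, a minimal Dockerfile sketch could look like the following. This file is not shipped with the repo and assumes the `build`/`start` scripts shown above plus the default port 3000:

```dockerfile
# Hypothetical Dockerfile — not part of the repository
FROM node:18-alpine
RUN npm install -g pnpm
WORKDIR /app
COPY . .
RUN pnpm install --frozen-lockfile
RUN pnpm build
EXPOSE 3000
CMD ["pnpm", "start"]
```

Build and run with `docker build -t llm-prompt-debugger .` followed by `docker run -p 3000:3000 llm-prompt-debugger`.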
Contributing
PRs are welcome! Open an issue or discussion to propose ideas.
See CONTRIBUTOR.md for setup and guidelines.
Maintainer
Built with ❤️ by @Cre4T3Tiv3 at ByteStack Labs
License
MIT – © 2025 @Cre4T3Tiv3
⚠️ Known Installation Warnings
This project includes some development dependencies with upstream deprecation warnings from transitive packages. These are non-breaking and safe to ignore.
For detailed context and updates:
KNOWN-WARNINGS.md