Add integrated LLM chat mode to Walker
Describe the feature
Add an optional built-in LLM chat mode inside Walker. Users could type natural-language questions (e.g., “What’s the weather?”, “Explain recursion”, “Summarize this text”) directly into the launcher and get answers from an LLM.
This feature would make Walker a lightweight, fast-access AI assistant — no need to open a separate browser or app.
The LLM backend should be configurable, supporting:
- Local models (like Ollama, llama.cpp, etc.)
- Remote APIs (e.g., OpenAI, Anthropic, etc.)
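A minimal sketch of what such a pluggable backend could look like in Go (Walker's implementation language). The `Backend` interface and `Ollama` type are illustrative assumptions, not existing Walker code; the request shape targets Ollama's `/api/generate` endpoint:

```go
// Hypothetical backend abstraction; none of these names exist in Walker today.
package llm

import (
	"bytes"
	"context"
	"encoding/json"
	"net/http"
)

// Backend abstracts over local and remote LLM providers so the
// launcher code never cares which one is configured.
type Backend interface {
	Complete(ctx context.Context, prompt string) (string, error)
}

// Ollama implements Backend against a local Ollama server.
type Ollama struct {
	Endpoint string // e.g. "http://localhost:11434"
	Model    string // e.g. "llama3"
}

func (o Ollama) Complete(ctx context.Context, prompt string) (string, error) {
	payload, err := json.Marshal(map[string]any{
		"model":  o.Model,
		"prompt": prompt,
		"stream": false, // one-shot response keeps the sketch simple
	})
	if err != nil {
		return "", err
	}
	req, err := http.NewRequestWithContext(ctx, http.MethodPost,
		o.Endpoint+"/api/generate", bytes.NewReader(payload))
	if err != nil {
		return "", err
	}
	req.Header.Set("Content-Type", "application/json")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var out struct {
		Response string `json:"response"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.Response, nil
}
```

A remote backend (OpenAI, Anthropic, or any OpenAI-compatible server) would implement the same interface, differing only in endpoint, request shape, and an Authorization header.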
Alternatives
- Continue using external chat apps (browser-based or terminal-based LLM clients).
- Use a plugin system, if available, to implement chat via an external extension.
Describe the behaviour
- Activation:
  - User opens Walker as usual.
  - Presses a hotkey or types a trigger (e.g., /ai or ?) to enter Chat Mode.
- Input:
  - User types a natural-language prompt, e.g.:
    - “/ai What is the fastest sorting algorithm?”
    - “Explain the difference between TCP and UDP.”
- Processing:
  - Walker sends the prompt to the configured LLM backend.
  - The backend could be local (Ollama, LM Studio) or API-based (OpenAI, etc.), depending on user settings; a rough sketch of this flow follows the list.
- Output:
  - The model’s response is displayed directly in Walker, either inline or in a small popup/chat pane.
  - User can press Shift+Enter (or similar) to continue the conversation contextually.
  - Optionally allow copying the response with one key.
- Settings:
  - Users can configure:
    - Model backend and endpoint
    - API key (if needed)
    - Response length / temperature
    - Whether chat history is saved or cleared on close
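To make the Processing and Settings bullets concrete, here is a minimal sketch of how trigger detection and user settings could fit together. The ChatConfig fields mirror the settings listed above; handleInput, the /ai trigger, and the Backend interface from the earlier sketch are all assumptions for illustration, not existing Walker code:

```go
// Hypothetical glue between launcher input and the chat backend.
package llm

import (
	"context"
	"strings"
)

// ChatConfig mirrors the settings listed above.
type ChatConfig struct {
	Trigger     string  // e.g. "/ai"
	Provider    string  // "ollama", "openai", ...
	Endpoint    string  // backend URL
	APIKey      string  // only needed for remote APIs
	MaxTokens   int     // response length cap
	Temperature float64 // sampling temperature
	SaveHistory bool    // false: clear the transcript on close
}

// handleInput routes launcher input: prompts prefixed with the
// trigger go to the LLM; everything else is normal search input.
// It returns the answer and whether chat mode handled the input.
func handleInput(ctx context.Context, cfg ChatConfig, b Backend, input string) (string, bool) {
	prompt, ok := strings.CutPrefix(input, cfg.Trigger+" ")
	if !ok {
		return "", false // fall through to regular launcher behaviour
	}
	answer, err := b.Complete(ctx, prompt)
	if err != nil {
		return "chat error: " + err.Error(), true
	}
	return answer, true
}
```

Conversation continuation (Shift+Enter) would then just append to a message list held while the window is open, discarded unless SaveHistory is set.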
Or maybe no? Maybe an app launcher shouldn't be an AI assistant?
Walker used to have that feature.