Missing grafana folder in Homebrew installation causing HolmesGPT to fail
Description:
I installed HolmesGPT locally using Homebrew (brew tap robusta-dev/homebrew-holmesgpt && brew install holmesgpt). However, when trying to use it with Ollama, I encountered the following error:
holmes ask "what pods are unhealthy in my cluster?" --model="ollama_chat/llama3.2" Failed to fetch holmes info robusta_client.py:23 ╭──────────────────────── Traceback (most recent call last) ────────────────────────╮ │ in ask:291 │ │ │ │ in create_console_toolcalling_llm:418 │ │ │ │ in create_console_tool_executor:301 │ │ │ │ in load_builtin_toolsets:81 │ │ │ │ in load_python_toolsets:58 │ │ │ │ in init:275 │ │ │ │ in _load_llm_instructions:375 │ │ │ │ in load_and_render_prompt:32 │ │ │ │ in load_prompt:22 │ ╰───────────────────────────────────────────────────────────────────────────────────╯ FileNotFoundError: [Errno 2] No such file or directory: '/opt/homebrew/Cellar/holmesgpt/0.10.4-alpha/libexec/_internal/holmes/plugins/toolsets/grafana/toolset_grafana_tempo.jinja2'
Upon checking, I found that the grafana folder is missing from the installed path:
/opt/homebrew/Cellar/holmesgpt/0.10.4-alpha/libexec/_internal/holmes/plugins/toolsets/
This causes the CLI tool to fail.
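For anyone who wants to reproduce the check, listing the installed toolsets directory (paths taken from the traceback above, default Apple Silicon Homebrew prefix) shows that grafana/ is not there:

# List the installed toolsets; the grafana/ subdirectory is absent
ls /opt/homebrew/Cellar/holmesgpt/0.10.4-alpha/libexec/_internal/holmes/plugins/toolsets/
# The template the traceback points at is therefore missing
ls /opt/homebrew/Cellar/holmesgpt/0.10.4-alpha/libexec/_internal/holmes/plugins/toolsets/grafana/toolset_grafana_tempo.jinja2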
Steps to Reproduce:
- Install HolmesGPT via Homebrew:
brew tap robusta-dev/homebrew-holmesgpt && brew install holmesgpt
- Try running a query using Ollama:
holmes ask "what pods are unhealthy in my cluster?" --model="ollama_chat/llama3.2"
- Observe the FileNotFoundError caused by the missing grafana folder.
Environment:
OS: macOS (Apple Silicon)
Installation method: Homebrew
HolmesGPT version: 0.10.4-alpha
Ollama model: ollama_chat/llama3.2
Thanks @sourabhbhandari, we are looking into it.
Same for me:
$ holmes version
0.10.4-alpha
@sourabhbhandari and @zrks, 0.10.5 is out and should solve this issue. Can you please confirm?
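Something along these lines should pull in the new release and re-run the failing query (assuming you installed from the robusta-dev/homebrew-holmesgpt tap as above):

# Upgrade the Homebrew formula to the latest release
brew update && brew upgrade holmesgpt
# Confirm the new version is installed
holmes version
# Re-run the query that previously failed
holmes ask "what pods are unhealthy in my cluster?" --model="ollama_chat/llama3.2"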