Natan Yellin
@mrchocha not exactly! The issue here is a bit trickier and it's about the actual output from tools exceeding context windows. It's probably not the best beginner issue to look...
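For context, a minimal sketch of the kind of guard the comment is describing — clipping oversized tool output before it is fed back to the model. The character budget and helper name here are hypothetical illustrations, not HolmesGPT's actual implementation (real limits are token-based and model-specific):

```python
MAX_TOOL_OUTPUT_CHARS = 20_000  # hypothetical budget; real limits are token-based

def truncate_tool_output(output: str, limit: int = MAX_TOOL_OUTPUT_CHARS) -> str:
    """Clip tool output that would overflow the model's context window,
    leaving a marker so the model knows content was dropped."""
    if len(output) <= limit:
        return output
    dropped = len(output) - limit
    return output[:limit] + f"\n...[truncated {dropped} characters]"
```

A real fix would count tokens rather than characters, but the shape of the problem is the same: tool results, unlike user prompts, can be arbitrarily large.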
Hi @sebv004, we're looking into this. What model are you using?
Hi @goyamegh, we're working on improving HolmesGPT's trace functionality across the board. As part of that, we're going to improve documentation of course - and we're also looking for people...
Hi all, we fixed a number of issues related to this and Ollama should work on current Holmes versions. If you encounter an issue on the latest version feel free...
@styladj1 anything we can fix in the docs to make it more clear to the next person who needs to do this?
I fixed a number of issues with `holmes version` in #893. Before testing please make sure you uninstall holmes and don't have any old `holmes` binaries from other sources on...
@Munken thanks for reporting, what is your python version?
Thanks for reporting, I'm looking into this! I suspect it's an issue with the prompt-caching feature we recently added.
Can you please check this branch and see if it fixes the problem? https://github.com/robusta-dev/holmesgpt/pull/999