Dominic
Exceeded the context length of the model in question: it's trying to get the message history for the last 5 messages, and the total of those exceeded the 128k context length.
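For context, the usual way to avoid this is to cap the history by token count before the request is sent, not by a fixed number of messages. A minimal sketch below, assuming a plain list of role/content chat messages; the names (`trim_history`, `RESPONSE_BUDGET`) are made up and tiktoken's `cl100k_base` is only a stand-in for the model's real tokenizer, so treat the counts as estimates, not Skyvern's actual logic:

```python
# Rough sketch (not Skyvern's actual code): keep only as many of the most
# recent messages as fit under the model's context window, so the combined
# history can never blow past 128k tokens. tiktoken's cl100k_base is used
# purely as a stand-in tokenizer; the real model counts tokens differently.
import tiktoken

MAX_CONTEXT_TOKENS = 128_000   # advertised context window
RESPONSE_BUDGET = 4_000        # room left for the model's reply

enc = tiktoken.get_encoding("cl100k_base")

def estimate_tokens(message: dict) -> int:
    """Rough token count for one chat message (role + content)."""
    return len(enc.encode(message.get("role", ""))) + len(enc.encode(message.get("content", "")))

def trim_history(messages: list[dict],
                 budget: int = MAX_CONTEXT_TOKENS - RESPONSE_BUDGET) -> list[dict]:
    """Walk backwards from the newest message, dropping older ones once the budget is hit."""
    kept, total = [], 0
    for msg in reversed(messages):
        cost = estimate_tokens(msg)
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # back to chronological order
```

A cap like this only limits how many whole messages go in; a single oversized message (e.g. a huge page dump) would still need to be truncated or summarized on its own.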
Not sure what Skyvern is doing, can you give an example of what the prompt is like?
> > > @rcmorano Qwen2-VL does not work with Ollama or llama.cpp yet afaik.
> > >
> > > Ollama supports inference of custom models through the ‘ollama create’...
Def drop a new link in here if you open a new PR! Can’t wait to see Ollama powering Skyvern.
> > Can I get information about this development? Is it still being worked on in any other branch? When I try, I get the following errors and I can't...