Jean-Jerome Levy
Hi, your curl request should be: `curl --location 'http://127.0.0.1:5000/process_form' --form 'query="What does the author discuss about NFL"'` Have you tried any of these other implementations: [https://github.com/jmorganca/ollama?tab=readme-ov-file#web--desktop](https://github.com/jmorganca/ollama?tab=readme-ov-file#web--desktop) ?
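For anyone who prefers testing the endpoint from Python instead of curl, here is a minimal stdlib-only sketch. The `/process_form` handler below is hypothetical (the real server's logic is not shown in this thread), and it uses URL-encoded form data rather than curl's multipart `--form` encoding, which a typical form parser accepts either way.

```python
import json
import threading
import urllib.parse
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class FormHandler(BaseHTTPRequestHandler):
    """Hypothetical stand-in for the real /process_form endpoint."""

    def do_POST(self):
        if self.path != "/process_form":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode()
        # parse the url-encoded form body and pull out the 'query' field
        query = urllib.parse.parse_qs(body).get("query", [""])[0]
        payload = json.dumps({"query": query}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep test output quiet

# bind to an ephemeral port so the sketch never collides with a real server
server = HTTPServer(("127.0.0.1", 0), FormHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# the Python equivalent of: curl --form 'query="..."' http://127.0.0.1:5000/process_form
data = urllib.parse.urlencode({"query": "What does the author discuss about NFL"}).encode()
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/process_form", data=data)
result = json.loads(resp.read())["query"]
print(result)
server.shutdown()
```

Running it echoes the submitted `query` field back, confirming the form field reaches the handler.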
With the @kingychiu hack, I got `Error executing tool. Missing exact 3 pipe (|) separated values.` I had to add `Action Input should be formatted as coworker|task|context` to the prompt.

```
allow_delegation=True,
llm=Ollama(model="codellama:34b")...
```
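To illustrate why that extra instruction helps: the error message suggests the delegation tool splits the Action Input on `|` and refuses anything that is not exactly three values. A minimal sketch of that parsing (function name and wording are illustrative, not the library's actual code):

```python
def parse_action_input(raw: str) -> dict:
    """Split an Action Input of the form 'coworker|task|context'.

    Mirrors the behavior implied by the error in the comment above:
    anything other than exactly three pipe-separated values is rejected.
    """
    parts = raw.split("|")
    if len(parts) != 3:
        raise ValueError(
            "Error executing tool. Missing exact 3 pipe (|) separated values."
        )
    coworker, task, context = (p.strip() for p in parts)
    return {"coworker": coworker, "task": task, "context": context}

# a well-formed input parses cleanly...
ok = parse_action_input("Researcher|Summarize the report|Focus on Q3 figures")
print(ok["coworker"])

# ...while a two-field input reproduces the reported error
try:
    parse_action_input("Researcher|Summarize the report")
except ValueError as e:
    print(e)
```

So the hack works by telling the model, in the prompt, to emit exactly that three-field shape.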
Same here 😢 (macOS): `ModuleNotFoundError: No module named 'torch'`
As far as I remember, these are points that start a variation of x% in the future [y0 = +x%]. x should be somewhere between 2 and 7...
Same error:

```
node:internal/process/promises:394
    triggerUncaughtException(err, true /* fromPromise */);
    ^

Error: String not found in file. Failed to apply edit.
    at uT (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1636:728)
    at iR6 (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1969:12134)
    at J (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1969:11112)
...
```
Not to mention that a single documentation page can easily feed an LLM via RAG, which is an advantage these days, especially when aiming to dive into new...
Hello, Thank you for reaching out and for highlighting this issue. I want to clarify that the Apple Silicon port of this project only supports the Gradio interface. The specific...