Amit Bhatnagar
I had to use the export command again within WSL to set the API base URL, key, etc. before it started working again.
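For anyone hitting the same thing, a minimal sketch of those exports (the variable names follow the common OpenAI-compatible convention and are assumptions here; check the .env your fabric setup generated for the exact names it reads):

```shell
# Hypothetical values -- substitute your own endpoint and key.
# Variable names assume the OpenAI-compatible convention; verify
# against the .env that fabric --setup actually wrote.
export OPENAI_API_BASE_URL="https://api.deepseek.com/v1"
export OPENAI_API_KEY="sk-your-key-here"
export DEFAULT_MODEL="deepseek-chat"
```

Note these exports only last for the current WSL session; add them to ~/.bashrc if you want them to persist.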
I'm using a Google API key during the fabric --setup command. I only provide DeepSeek API information while setting up the environment. I haven't tried with Ollama yet even though I'm running it....
Only found this while testing. I tried with Ollama on my machine and yes, I only got the API message or connection error but never received a response. https://knasmueller.net/running-fabric-locally-with-ollama
I was able to get it working with LM Studio, though the open-source models are not taking the commands properly. This is my .env file under the /home//.config/fabric/ folder. I...
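For reference, a sketch of what such a .env can look like when pointing fabric at a local LM Studio server (the values below are placeholders, not the commenter's actual file, which was cut off):

```shell
# Illustrative .env sketch -- placeholder values, not the real file.
# LM Studio's local server listens on port 1234 by default and exposes
# an OpenAI-compatible endpoint, so the key can be any non-empty string.
OPENAI_API_BASE_URL=http://localhost:1234/v1
OPENAI_API_KEY=lm-studio
DEFAULT_MODEL=local-model
```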
Same issue. I am using DeepSeek V2. OpenAI is way too expensive compared to these models, and I would prefer to use them.
I sort of fixed it by modifying the code within the PraisonAI Python file to point to my own custom AI base URL and model. It would be easier if we could change...
I'm not a coder but I am in a technical field, so I put my thinking cap on. I went inside the PraisonAI installation folder and changed the auto.py file. It has mentions of OpenAI...
Yes! It's all done on the fly when you send a prompt with the echo command piped into something like '| fabric --agents'.
Try what I did and modify the files directly. Look at my response until they fix it.