Amit Bhatnagar
Yes. The Fabric folder shows a test.yaml file that has all the PraisonAI agents needed for the request to work.
Let me try and get back to you.
I have tried both the .env file and the setting below within the UI, but I get the same error. Where exactly are you making the changes, especially for the Base URL?
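For reference, a minimal sketch of the kind of settings involved, assuming the tool reads the standard OpenAI-compatible variables (OPENAI_API_BASE, OPENAI_MODEL_NAME, OPENAI_API_KEY); the exact names may differ by version, and the same keys would go in a .env file:

    import os

    # Assumed variable names; values shown for a local Ollama setup.
    os.environ["OPENAI_API_BASE"] = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint (assumption)
    os.environ["OPENAI_MODEL_NAME"] = "llama3"                   # whichever model is pulled locally
    os.environ["OPENAI_API_KEY"] = "NA"                          # placeholder; local servers usually ignore it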
Unfortunately, same!
It worked. Instead of Ollama, I provided the base URL of DeepSeek. Again, when working with Docker, we should be able to provide the base URL of the API...
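A rough sketch of what "providing the base URL of DeepSeek" can look like with an OpenAI-compatible client; the endpoint and model name below are assumptions for illustration, not something stated in the thread:

    from openai import OpenAI

    # Assumed DeepSeek endpoint and model name; substitute your own key.
    client = OpenAI(base_url="https://api.deepseek.com", api_key="YOUR_DEEPSEEK_KEY")
    reply = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(reply.choices[0].message.content)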
Have you tried doing it over plain HTTP rather than SSL? Try removing the certificate for testing purposes. That being said, I had better luck with LM Studio than with Ollama...
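To test over plain HTTP against a local LM Studio server, a quick connectivity check along these lines may help; port 1234 is assumed to be LM Studio's default local server port:

    import requests

    # Plain-HTTP local endpoint, so no certificate is involved at all.
    resp = requests.get("http://localhost:1234/v1/models", timeout=5)
    print(resp.status_code, resp.json())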
I'd say go with WSL. Get a feel for Linux if you haven't already. It also gives you the flexibility to run other programs that just aren't possible on Windows due...
Try pip install knowledge-storm
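After installing, a quick sanity check that the package is importable (not part of the original reply, just a way to confirm the install worked):

    import importlib.metadata

    import knowledge_storm  # module installed by the knowledge-storm package

    print(importlib.metadata.version("knowledge-storm"))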
What if you use a dummy key?
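A "dummy key" here means a placeholder that satisfies the client's "key must be set" check when the local backend does not actually validate it; a minimal sketch, assuming the standard OPENAI_API_KEY variable:

    import os

    # Local OpenAI-compatible servers (Ollama, LM Studio) typically ignore the key,
    # but many clients refuse to start without one, so a placeholder is enough.
    os.environ["OPENAI_API_KEY"] = "sk-dummy"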