3 comments by laurentboutet

How can we configure the URL and port of Ollama? By default, it targets localhost on the default port 11434, but I need to point it at my own Ollama server. I can't find any way to do this.

Hello, I found it. The setting lives outside this project, in another package. I worked around it as described in https://github.com/langchain-ai/langchain/issues/15365. But it would be more convenient to be able to manage it directly. Thanks.
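
For reference, the workaround looks roughly like this (a minimal sketch; the exact import path depends on your langchain version, and the host below is only a placeholder, not my real server):

```python
from langchain_community.llms import Ollama

# Point the langchain Ollama client at a remote server instead of
# the default http://localhost:11434. The hostname is a placeholder.
llm = Ollama(
    base_url="http://my-ollama-host:11434",  # placeholder URL
    model="llama2",
)

print(llm.invoke("Hello"))
```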

To be clear, I mean having the possibility to configure the URL and port in ps-fuzz's configuration or via an environment variable.
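
Something like the following is what I have in mind (purely illustrative; `OLLAMA_BASE_URL` is a hypothetical variable name, not an existing ps-fuzz option):

```python
import os

from langchain_community.llms import Ollama

# Hypothetical: ps-fuzz could read the Ollama endpoint from the
# environment, falling back to the current localhost default.
base_url = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
llm = Ollama(base_url=base_url, model="llama2")
```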