ddrcrow

Results 10 comments of ddrcrow

I have the same issue, but with ollama client 0.1.29 there is no /api/chat any more; it replies with a 404 error. What I noticed from the ollama sample,...

@better629 I missed 'not' in my last comment; it should read: “BTW, we tried the solution in your last comment, it **did not** work for us, and the content_type was 'content_type...

@better629 chat works as well:

```
curl --verbose http://xxx.xxx.xxx.xxx:8091/api/chat -d '{
  "model": "llama2_7b_chat_q5km",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" }
  ]
  ...
```

@better629 I have been using /api, never /v1. The configuration below did not work for me, and it still needed the required api_key:

```
llm:
  api_type: 'ollama'
  base_url: 'http://127.0.0.1:11434/api'
  model: 'llama2'
```
...
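For reference, a hedged sketch of the same config with a placeholder api_key, assuming the client accepts an `api_key` field under `llm` (as the thread's mention of a fake api_key implies); the key value here is purely illustrative:

```yaml
llm:
  api_type: 'ollama'
  base_url: 'http://127.0.0.1:11434/api'
  api_key: 'sk-placeholder'   # dummy value; ollama itself ignores it, but the client may refuse an empty field
  model: 'llama2'
```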

After giving a fake api_key, it failed again. BTW, my ollama works well with open webui. My metagpt is 0.7.6. (venv_3117) root@43e8ff60acbf:~# python **llm_hello_world.py** 2024-03-21 15:09:30.543 | INFO | metagpt.const:get_metagpt_package_root:29 - Package...

Aha..., it seems this issue was closed :(

OK, let me try it over the weekend

@better629 I debugged it; OK, that was my fault, a case-sensitivity issue in the model name :(. It worked for the llm_hello_world samples. Thanks for your help. `(venv_3117) root@43e8ff60acbf:~/workspace/metagpt_test# python ./llm_hello_world_simple.py 2024-03-23 12:18:45.126...

@crazywoola you can close this issue; it works for me now. As dosubot said, I needed to set NO_PROXY in ~/.docker/config.json.
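For reference, a minimal sketch of what that setting can look like in ~/.docker/config.json; the proxy host below is a placeholder, not a value from the original thread:

```json
{
  "proxies": {
    "default": {
      "httpProxy": "http://proxy.example.com:3128",
      "httpsProxy": "http://proxy.example.com:3128",
      "noProxy": "localhost,127.0.0.1"
    }
  }
}
```

With this in place, Docker passes `NO_PROXY` into containers it starts, so requests to the excluded hosts bypass the proxy.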