MetaGPT
Issue with open-source models via the oobabooga web UI OpenAI extension
Thanks for this amazing project.
I'm using the OpenAI API extension from the oobabooga text-generation web UI: https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai
Config:

```yaml
OPENAI_API_BASE: "http://127.0.0.1:5001/v1"
OPENAI_API_KEY: "sk-1111111111111111111111111..."
OPENAI_API_MODEL: "gpt-3.5-turbo-16k"
MAX_TOKENS: 1500
RPM: 10
```
Here is the error when started with `python startup.py "create a simple snake game"`:

```
2023-08-29 12:56:45.452 | INFO | metagpt.config:__init__:44 - Config loading done.
2023-08-29 12:56:48.270 | INFO | metagpt.software_company:invest:39 - Investment: $3.0.
2023-08-29 12:56:48.270 | INFO | metagpt.roles.role:_act:167 - Alice(Product Manager): ready to WritePRD

Original Requirements
The boss wants to create a simple snake game.

Warning: gpt-3.5-turbo may update over time. Returning num tokens assuming gpt-3.5-turbo-0613.
2023-08-29 12:56:52.091 | INFO | metagpt.provider.openai_api:update_cost:81 - Total running cost: $0.003 | Max budget: $3.000 | Current cost: $0.003, prompt_tokens: 842, completion_tokens: 14

Original Requirements
The boss wants to create a simple snake game.

Warning: gpt-3.5-turbo may update over time. Returning num tokens assuming gpt-3.5-turbo-0613.
2023-08-29 12:56:55.309 | INFO | metagpt.provider.openai_api:update_cost:81 - Total running cost: $0.005 | Max budget: $3.000 | Current cost: $0.003, prompt_tokens: 842, completion_tokens: 14

Traceback (most recent call last):
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\tenacity-8.2.2-py3.11.egg\tenacity\_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\actions\action.py", line 62, in _aask_v1
    instruct_content = output_class(**parsed_data)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\pydantic-1.10.8-py3.11.egg\pydantic\main.py", line 341, in __init__
    raise validation_error
pydantic.error_wrappers.ValidationError: 9 validation errors for prd
Original Requirements
  field required (type=value_error.missing)
Product Goals
  field required (type=value_error.missing)
User Stories
  field required (type=value_error.missing)
Competitive Analysis
  field required (type=value_error.missing)
Competitive Quadrant Chart
  field required (type=value_error.missing)
Requirement Analysis
  field required (type=value_error.missing)
Requirement Pool
  field required (type=value_error.missing)
UI Design draft
  field required (type=value_error.missing)
Anything UNCLEAR
  field required (type=value_error.missing)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\user\Desktop\AI\metagpt\startup.py", line 42, in
```
Execute the following command to check whether OPENAI_API_BASE is correct:

```sh
openai -b "http://127.0.0.1:5001/v1" api chat_completions.create -m gpt-3.5-turbo -g user "Hello world"
```
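If the `openai` CLI isn't available, the same connectivity check can be run from Python (a minimal sketch; it uses the pre-1.0 `openai` client, consistent with the package versions visible in the traceback above):

```python
# Point the pre-1.0 openai client at the local oobabooga endpoint and request
# one chat completion; any well-formed reply means OPENAI_API_BASE is correct.
import openai

openai.api_base = "http://127.0.0.1:5001/v1"
openai.api_key = "sk-1111111111111111111111111"  # local endpoints usually ignore the value

resp = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}],
)
print(resp["choices"][0]["message"]["content"])
print(resp["model"])  # the extension reports the actually loaded model here
```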
It works well on its own, but when used with MetaGPT it doesn't behave as expected, and I'm not sure whether the problem is the model itself or the settings in the web UI, because I tried many models and most of them performed badly. Only Vicuna performed well for a while, and only in the first phase (the Requirements phase).
Jumping in here because I'm interested in using an open-source model from the oobabooga web UI. However, from your comments here, you are using the web UI to interface with gpt-3.5, which isn't an open-source model. Have you been able to get an open-source model to run successfully with MetaGPT?
It's a drop-in replacement for the original OpenAI API, so as far as I know the model name in the request doesn't change anything. As you can see, the response reports the actual model I'm using, which is "TheBloke_wizard....". I tried many solutions to make it work with MetaGPT, but it didn't work as expected even though the API itself is working correctly.
I'm also very curious about this and am trying it out with Ollama or LocalAI to test CodeLlama. Will share some feedback. CodeLlama looks very promising; looking forward to seeing how well it works.
+1 for getting this to work with a local OpenAI API endpoint. I have one running via FastChat and would love to use it with MetaGPT.
A FastChat server built by following the help document does not work with MetaGPT v0.7. After modifying config2.yaml for v0.7, it still does not work, and an error message appears.

config2.yaml:

```yaml
llm:
  api_type: "open_llm"  # or azure / ollama / openai etc.
  base_url: "http://0.0.0.0:8000/v1"
  api_key: "a"  # cannot be removed; if removed, an error is reported
  model: "vicuna"
```

Testing examples/write_novel.py produces an error.
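To rule out the endpoint itself, it helps to query the FastChat server directly, bypassing MetaGPT entirely (a minimal sketch assuming the openai>=1.0 Python client; the base_url, api_key, and model values mirror the config2.yaml above):

```python
# Query the FastChat OpenAI-compatible server directly, bypassing MetaGPT.
from openai import OpenAI

client = OpenAI(base_url="http://0.0.0.0:8000/v1", api_key="a")
resp = client.chat.completions.create(
    model="vicuna",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```

If this fails, the problem is in the FastChat setup rather than in MetaGPT's config.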
I solved it; it was my own mistake.