
Issue with Open Source models from oobabooga web UI OpenAi extension

Opened by xb3sox · 8 comments

Thanks for this amazing project.

I'm using OpenAI API extension from the oobabooga web UI https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai

Config:

```yaml
OPENAI_API_BASE: "http://127.0.0.1:5001/v1"
OPENAI_API_KEY: "sk-1111111111111111111111111..."
OPENAI_API_MODEL: "gpt-3.5-turbo-16k"
MAX_TOKENS: 1500
RPM: 10
```

Here is the error when started with `python startup.py "create a simple snake game"`:

```
2023-08-29 12:56:45.452 | INFO | metagpt.config:__init__:44 - Config loading done.
2023-08-29 12:56:48.270 | INFO | metagpt.software_company:invest:39 - Investment: $3.0.
2023-08-29 12:56:48.270 | INFO | metagpt.roles.role:_act:167 - Alice(Product Manager): ready to WritePRD
```

Original Requirements

The boss wants to create a simple snake game.

```
Warning: gpt-3.5-turbo may update over time. Returning num tokens assuming gpt-3.5-turbo-0613.
2023-08-29 12:56:52.091 | INFO | metagpt.provider.openai_api:update_cost:81 - Total running cost: $0.003 | Max budget: $3.000 | Current cost: $0.003, prompt_tokens: 842, completion_tokens: 14
```

Original Requirements

The boss wants to create a simple snake game.

```
Warning: gpt-3.5-turbo may update over time. Returning num tokens assuming gpt-3.5-turbo-0613.
2023-08-29 12:56:55.309 | INFO | metagpt.provider.openai_api:update_cost:81 - Total running cost: $0.005 | Max budget: $3.000 | Current cost: $0.003, prompt_tokens: 842, completion_tokens: 14
Traceback (most recent call last):
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\tenacity-8.2.2-py3.11.egg\tenacity\_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\actions\action.py", line 62, in _aask_v1
    instruct_content = output_class(**parsed_data)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\pydantic-1.10.8-py3.11.egg\pydantic\main.py", line 341, in __init__
    raise validation_error
pydantic.error_wrappers.ValidationError: 9 validation errors for prd
Original Requirements
  field required (type=value_error.missing)
Product Goals
  field required (type=value_error.missing)
User Stories
  field required (type=value_error.missing)
Competitive Analysis
  field required (type=value_error.missing)
Competitive Quadrant Chart
  field required (type=value_error.missing)
Requirement Analysis
  field required (type=value_error.missing)
Requirement Pool
  field required (type=value_error.missing)
UI Design draft
  field required (type=value_error.missing)
Anything UNCLEAR
  field required (type=value_error.missing)
```
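For context on this failure mode: MetaGPT asks the model for a reply laid out in named markdown sections and then feeds the parsed sections into a pydantic model, so a local model that answers in free prose produces "field required" for every section at once. Here is a minimal stdlib sketch of that mechanism; the nine section names come from the error above, but the heading-based parsing regex is purely illustrative, not MetaGPT's actual parser:

```python
import re

# The nine sections the PRD schema requires, taken from the
# ValidationError in the log above.
REQUIRED_SECTIONS = [
    "Original Requirements", "Product Goals", "User Stories",
    "Competitive Analysis", "Competitive Quadrant Chart",
    "Requirement Analysis", "Requirement Pool",
    "UI Design draft", "Anything UNCLEAR",
]

def parse_sections(reply: str) -> dict:
    """Split an LLM reply on '## <Section>' headings.

    Illustrative only; MetaGPT's real parsing is more involved.
    """
    sections = {}
    for match in re.finditer(r"^## (.+?)\n(.*?)(?=^## |\Z)", reply, re.S | re.M):
        sections[match.group(1).strip()] = match.group(2).strip()
    return sections

# A weaker local model often answers in free prose instead of the
# requested section format:
bad_reply = "Sure! Here is a snake game plan: move the snake with arrow keys."
parsed = parse_sections(bad_reply)
missing = [s for s in REQUIRED_SECTIONS if s not in parsed]
print(len(missing))  # -> 9: every section missing, hence 9 validation errors
```

This is why the error count matches the schema's field count exactly: nothing about the API transport failed; the reply simply didn't contain any of the expected headings.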

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "C:\Users\user\Desktop\AI\metagpt\startup.py", line 42, in <module>
    fire.Fire(main)
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\fire-0.4.0-py3.11.egg\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\fire-0.4.0-py3.11.egg\fire\core.py", line 466, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
                                ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\fire-0.4.0-py3.11.egg\fire\core.py", line 681, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\startup.py", line 38, in main
    asyncio.run(startup(idea, investment, n_round, code_review, run_tests))
  File "C:\Users\user\AppData\Local\Programs\Python\Python311\Lib\asyncio\runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Users\user\AppData\Local\Programs\Python\Python311\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\AppData\Local\Programs\Python\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\startup.py", line 24, in startup
    await company.run(n_round=n_round)
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\software_company.py", line 60, in run
    await self.environment.run()
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\environment.py", line 67, in run
    await asyncio.gather(*futures)
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\roles\role.py", line 240, in run
    rsp = await self._react()
          ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\roles\role.py", line 209, in _react
    return await self._act()
           ^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\roles\role.py", line 168, in _act
    response = await self._rc.todo.run(self._rc.important_memory)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\metagpt\actions\write_prd.py", line 145, in run
    prd = await self._aask_v1(prompt, "prd", OUTPUT_MAPPING)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\tenacity-8.2.2-py3.11.egg\tenacity\_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\tenacity-8.2.2-py3.11.egg\tenacity\_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\Desktop\AI\metagpt\.venv\Lib\site-packages\tenacity-8.2.2-py3.11.egg\tenacity\__init__.py", line 326, in iter
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x1ed71f46c50 state=finished raised ValidationError>]
```

xb3sox · Aug 29 '23 10:08

Execute the following command to check whether OPENAI_API_BASE is correct.

```
openai -b "http://127.0.0.1:5001/v1" api chat_completions.create -m gpt-3.5-turbo -g user "Hello world"
```

voidking · Aug 30 '23 11:08

> Execute the following command to check whether OPENAI_API_BASE is correct.
>
> ```
> openai -b "http://127.0.0.1:5001/v1" api chat_completions.create -m gpt-3.5-turbo -g user "Hello world"
> ```

It works fine on its own, but with MetaGPT it doesn't behave as expected, and I'm not sure whether the problem is the model itself or the settings in the web UI. I've tried many models, and most of them performed badly; only Vicuna performed well for a while, and only in the first phase (Requirements).


xb3sox · Aug 30 '23 18:08

Jumping in here because I'm interested in using an open-source model from the oobabooga webui. However, from your comments here you are using the webui to interface with gpt-3.5, which isn't an open-source model. Have you been able to get an open-source model to run successfully with MetaGPT?

norton-chris · Aug 30 '23 19:08

> Jumping in here because I'm interested in using an open-source model from the oobabooga webui. However, from your comments here you are using the webui to interface with gpt-3.5, which isn't an open-source model. Have you been able to get an open-source model to run successfully with MetaGPT?

It's a drop-in replacement for the original OpenAI API, so as far as I know the model name in the request doesn't change anything. As you can see, the response reports the actual model I'm using, which is "TheBloke_wizard....". I've tried many workarounds to make it work with MetaGPT, but it still doesn't behave as expected, even though the API itself works correctly.
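To spell out what "drop-in replacement" means here, a tiny illustrative sketch: the local endpoint accepts the standard OpenAI request shape but serves whichever model it has loaded, regardless of the `model` name the client sends. The server logic below is a stand-in for the oobabooga extension's behavior, not its actual code:

```python
# Truncated model name as it appeared in the response screenshot.
LOADED_MODEL = "TheBloke_wizard..."

def fake_local_server(request: dict) -> dict:
    """Mimics an OpenAI-compatible local endpoint (hypothetical logic):
    the requested model name is accepted but ignored."""
    return {
        "model": LOADED_MODEL,  # always the locally loaded model
        "choices": [{"message": {"role": "assistant", "content": "..."}}],
    }

# MetaGPT asks for gpt-3.5-turbo-16k, but the reply comes from the local model.
resp = fake_local_server({"model": "gpt-3.5-turbo-16k", "messages": []})
print(resp["model"])  # -> "TheBloke_wizard..."
```

So a gpt-3.5 model name in the config only satisfies the client-side plumbing; the quality of the output is entirely down to the local model behind the endpoint.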

xb3sox · Aug 31 '23 09:08

I'm also very curious about this. I'm going to try it out with Ollama or LocalAI to test CodeLlama, and I'll share some feedback. CodeLlama looks so promising; looking forward to seeing how well it works.

samuelmukoti · Sep 03 '23 00:09

+1 for getting this to work with a local OpenAI API endpoint. I have one running via FastChat and would love to use it with MetaGPT.

itlackey · Sep 21 '23 17:09

FastChat set up following the help document does not work with MetaGPT v0.7.

After modifying config2.yaml in v0.7, it still does not work, and an error message appears.

config2.yaml:

```yaml
llm:
  api_type: "open_llm"  # or azure / ollama / openai etc.
  base_url: "http://0.0.0.0:8000/v1"
  api_key: "a"  # cannot be removed here; if removed, an error is reported
  model: "vicuna"
```
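On why the dummy `api_key` line can't be dropped: OpenAI-style clients typically check that some key is present before sending any request, even when the local server never validates it. A tiny illustrative check of that pattern (hypothetical logic, not MetaGPT's actual config validator):

```python
# Same values as the config2.yaml snippet above, as a plain dict.
llm_config = {
    "api_type": "open_llm",
    "base_url": "http://0.0.0.0:8000/v1",
    "api_key": "a",  # any non-empty placeholder satisfies the client
    "model": "vicuna",
}

def validate(cfg: dict) -> list:
    """Return the required keys that are missing or empty
    (illustrative check only)."""
    required = ["api_type", "base_url", "api_key", "model"]
    return [k for k in required if not cfg.get(k)]

print(validate(llm_config))                     # -> []
print(validate({**llm_config, "api_key": ""}))  # -> ['api_key']
```

In other words, the error on a removed `api_key` happens client-side, before the request ever reaches FastChat.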

Testing examples/write_novel.py gives an error.


elephone · Feb 18 '24 05:02

> FastChat set up following the help document does not work with MetaGPT v0.7.
>
> After modifying config2.yaml in v0.7, it still does not work, and an error message appears.
>
> config2.yaml:
>
> ```yaml
> llm:
>   api_type: "open_llm"  # or azure / ollama / openai etc.
>   base_url: "http://0.0.0.0:8000/v1"
>   api_key: "a"  # cannot be removed here; if removed, an error is reported
>   model: "vicuna"
> ```
>
> Testing examples/write_novel.py gives an error.

I solved it; it was a problem on my end.

elephone · Feb 18 '24 15:02