
Field "model_id" has conflict with protected namespace "model_".

Open rbren opened this issue 1 year ago • 6 comments

Describe the bug

It's just an annoying warning that gets logged when starting the backend, but it would be nice to get rid of it, since it shows up in bug reports and the like. Seems like it might be an easy fix?
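The mechanism behind the warning is easy to reproduce. A minimal sketch, assuming pydantic v2 (the `Noisy`/`Quiet` model names are made up for illustration; in OpenDevin the conflicting field actually lives inside litellm's models, which is why the real fix landed upstream rather than here):

```python
import warnings

from pydantic import BaseModel, ConfigDict

# Any field whose name starts with "model_" collides with pydantic v2's
# protected "model_" namespace (model_dump, model_validate, ...),
# so defining the class emits a UserWarning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")

    class Noisy(BaseModel):
        model_id: str

print(any("protected namespace" in str(w.message) for w in caught))

# The fix the warning itself suggests: clear the protected namespaces
# on the model that declares the conflicting field.
class Quiet(BaseModel):
    model_config = ConfigDict(protected_namespaces=())

    model_id: str

print(Quiet(model_id="gpt-4-0125-preview").model_id)
```

Since the conflicting fields are declared in a dependency, only that dependency can apply this `ConfigDict` change; downstream projects can at best filter the warning.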

Setup and configuration

Current version:

❯ git log -n1
commit 7cc58b28a5da8e4dfd69cbb2144cd191cf905e07 (HEAD -> main, origin/main, origin/HEAD)
Author: Anas DORBANI <[email protected]>
Date:   Sat Apr 6 00:11:12 2024 +0000

    Fix pre-commit unset when using make build (#796)

My config.toml and environment vars (be sure to redact API keys):

LLM_MODEL="gpt-4-0125-preview"
WORKSPACE_DIR="./workspace"
LLM_EMBEDDING_MODEL="local"

My model and agent (you can see these settings in the UI):

  • Model: any
  • Agent: any

Commands I ran to install and run OpenDevin:

make build
uvicorn opendevin.server.listen:app --reload --port 3000 --host 0.0.0.0

Steps to Reproduce:

  1. Just start the backend

Logs, error messages, and screenshots:

❯ uvicorn opendevin.server.listen:app --reload --port 3000 --host 0.0.0.0
INFO:     Will watch for changes in these directories: ['/home/rbren/git/opendevin']
INFO:     Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
INFO:     Started reloader process [891143] using WatchFiles
/home/rbren/.local/share/virtualenvs/opendevin-0fQgowZe/lib/python3.12/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_id" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
INFO:     Started server process [891145]
INFO:     Waiting for application startup.
INFO:     Application startup complete.

Additional Context

rbren avatar Apr 06 '24 00:04 rbren

same here

Soundmovin46 avatar Apr 06 '24 01:04 Soundmovin46

I'm seeing quite a few errors here. No matter how I configure config.toml, I can't connect to my local ollama; it always fails with the error below (this is the tail of the log):

raise litellm.exceptions.BadRequestError(  # type: ignore
litellm.exceptions.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=null. Pass model as e.g. for 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..). Learn more: https://docs.litellm.ai/docs/providers
INFO:     connection closed

config.toml:

LLM_MODEL="ollama/qwen:7b"
LLM_API_KEY="ollama"
LLM_EMBEDDING_MODEL="local"
LLM_BASE_URL="http://loaclhost:<port_number>"
WORKSPACE_DIR="./workspace"

Complete related logs:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 240, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/middleware/errors.py", line 151, in __call__
    await self.app(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/middleware/cors.py", line 77, in __call__
    await self.app(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/routing.py", line 373, in handle
    await self.app(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/routing.py", line 96, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/starlette/routing.py", line 94, in app
    await func(session)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/fastapi/routing.py", line 348, in app
    await dependant.call(**values)
  File "/home/agent/OpenDevin/opendevin/server/listen.py", line 26, in websocket_endpoint
    await session.start_listening()
  File "/home/agent/OpenDevin/opendevin/server/session.py", line 89, in start_listening
    await self.create_controller(data)
  File "/home/agent/OpenDevin/opendevin/server/session.py", line 135, in create_controller
    llm = LLM(model=model, api_key=api_key, base_url=api_base)
  File "/home/agent/OpenDevin/opendevin/llm/llm.py", line 36, in __init__
    self._router = Router(
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/litellm/router.py", line 198, in __init__
    self.set_model_list(model_list)
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/litellm/router.py", line 2075, in set_model_list
    ) = litellm.get_llm_provider(
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/litellm/utils.py", line 5759, in get_llm_provider
    raise e
  File "/home/agent/miniconda3/envs/opendev/lib/python3.11/site-packages/litellm/utils.py", line 5746, in get_llm_provider
    raise litellm.exceptions.BadRequestError(  # type: ignore
litellm.exceptions.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=null. Pass model as e.g. for 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..). Learn more: https://docs.litellm.ai/docs/providers

zhonggegege avatar Apr 06 '24 01:04 zhonggegege

Same question here.

/Library/Caches/pypoetry/virtualenvs/opendevin-AIbeNwSH-py3.11/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_info" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.

adcwa avatar Apr 06 '24 03:04 adcwa

me too

dockercore avatar Apr 08 '24 00:04 dockercore

The same issue appears to me when starting the back-end.

Warning Message:

/Users/luca/Library/Caches/pypoetry/virtualenvs/opendevin-rpD-2hkp-py3.12/lib/python3.12/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_list" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/luca/Library/Caches/pypoetry/virtualenvs/opendevin-rpD-2hkp-py3.12/lib/python3.12/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_name" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/luca/Library/Caches/pypoetry/virtualenvs/opendevin-rpD-2hkp-py3.12/lib/python3.12/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_group_alias" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/luca/Library/Caches/pypoetry/virtualenvs/opendevin-rpD-2hkp-py3.12/lib/python3.12/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_info" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/luca/Library/Caches/pypoetry/virtualenvs/opendevin-rpD-2hkp-py3.12/lib/python3.12/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_id" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
INFO:     Started server process [35952]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)

My configuration: MacBook M1, Sonoma; Python 3.10.4; Docker version 26.0.0, build 2ae903e86c; Poetry version 1.8.2

opendevin version:

commit 707ab7b3f84fb5664ff63da0b52e7b0d2e4df545 (HEAD -> main, origin/main, origin/HEAD)
Author: Alex Bäuerle <[email protected]>
Date:   Tue Apr 9 17:42:16 2024 -0700

Using ollama as the LLM.

config.toml

LLM_API_KEY="ollama"
LLM_MODEL="ollama/<model_name>"
LLM_EMBEDDING_MODEL="local"
LLM_BASE_URL="http://localhost:<port_number>"
WORKSPACE_DIR="./workspace"

ls-cnr avatar Apr 10 '24 07:04 ls-cnr

Same issue. Mac 2018, Sonoma, Conda, Python 3.11.

/Users/robshox/miniconda3/envs/open-devin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_id" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(

robshox avatar Apr 10 '24 11:04 robshox

Same here. Please help:

[email protected] start vite --port 3001

VITE v5.2.8 ready in 417 ms

➜  Local:   http://localhost:3001/
➜  Network: use --host to expose
➜  press h + enter to show help

/Users/banhkho5a/Library/Caches/pypoetry/virtualenvs/opendevin-yIrhyWhc-py3.11/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_list" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/banhkho5a/Library/Caches/pypoetry/virtualenvs/opendevin-yIrhyWhc-py3.11/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_name" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/banhkho5a/Library/Caches/pypoetry/virtualenvs/opendevin-yIrhyWhc-py3.11/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_group_alias" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/banhkho5a/Library/Caches/pypoetry/virtualenvs/opendevin-yIrhyWhc-py3.11/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_info" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/Users/banhkho5a/Library/Caches/pypoetry/virtualenvs/opendevin-yIrhyWhc-py3.11/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_id" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
INFO:     Started server process [55024]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
ERROR:    [Errno 48] error while attempting to bind on address ('127.0.0.1', 3000): address already in use
INFO:     Waiting for application shutdown.
INFO:     Application shutdown complete.
3:56:05 AM [vite] .env changed, restarting server...
3:56:05 AM [vite] server restarted.
3:56:05 AM [vite] vite.config.ts changed, restarting server...
3:56:05 AM [vite] server restarted.
3:56:08 AM [vite] page reload index.html

banhkho5a avatar Apr 12 '24 10:04 banhkho5a

Just so everyone is aware: this is not a breaking issue; it's just an annoying warning.

rbren avatar Apr 12 '24 11:04 rbren


I had this issue too; I just rebuilt the app. Here are the commands I used:

git clone https://github.com/OpenDevin/OpenDevin.git
cd OpenDevin
make build
pip uninstall poetry cleo
pip cache purge
rm -rf ~/.cache/pypoetry
rm -rf ~/.poetry
curl -sSL https://install.python-poetry.org | python3 -
which poetry
export PATH="$PATH:/home/dev/.local/bin/poetry"
export OPENAI_API_KEY="your-api-key"
make build
npm fund

ArcTens avatar Apr 12 '24 17:04 ArcTens

Run `poetry update litellm`.

SmartManoj avatar Apr 13 '24 12:04 SmartManoj

ERROR: [Errno 48] error while attempting to bind on address ('127.0.0.1', 3000): address already in use
I've been running into this a lot. I thought it was some jank from a Poetry install I flubbed for a different app, but it turns out the port can stay occupied.

fuser -k 3001/tcp clears it up
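The "address already in use" error above just means a stale server process is still bound to the port. A minimal sketch of checking for that condition with the stdlib (plain sockets, not tied to uvicorn; `port_in_use` is a hypothetical helper for illustration):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as probe:
        # connect_ex returns 0 on a successful connect, i.e. a live listener.
        return probe.connect_ex((host, port)) == 0

# Simulate a stale server: hold a port open, then probe it.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # port 0 lets the OS pick a free port
listener.listen(1)
occupied_port = listener.getsockname()[1]

was_occupied = port_in_use(occupied_port)
listener.close()
print(was_occupied)
```

Running such a check (or `fuser -k <port>/tcp` as above) before launching uvicorn avoids the Errno 48/98 crash entirely.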

ArcTens avatar Apr 14 '24 07:04 ArcTens

Fixed now!

rbren avatar Apr 15 '24 08:04 rbren