
Pydantic version conflict with vLLM

thiner opened this issue 1 year ago • 1 comment

In short, vLLM depends on pydantic >= 2:

pydantic >= 2.0  # Required for OpenAI server.

On the other hand, fastchat/serve/openai_api_server.py depends on the v1.x API:

try:
    from pydantic.v1 import BaseSettings
except ImportError:
    from pydantic import BaseSettings

I installed the dependencies with pip install fschat[model_worker] vllm, and it causes an error at runtime:

 Traceback (most recent call last):
   File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
     return _run_code(code, main_globals, None,
   File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
     exec(code, run_globals)
   File "/usr/local/lib/python3.10/dist-packages/fastchat/serve/openai_api_server.py", line 24, in <module>
     from pydantic import BaseSettings
   File "/usr/local/lib/python3.10/dist-packages/pydantic/__init__.py", line 374, in __getattr__
     return _getattr_migration(attr_name)
   File "/usr/local/lib/python3.10/dist-packages/pydantic/_migration.py", line 296, in wrapper
     raise PydanticImportError(
 pydantic.errors.PydanticImportError: `BaseSettings` has been moved to the `pydantic-settings` package. See https://docs.pydantic.dev/2.6/migration/#basesettings-has-moved-to-pydantic-settings for more details.
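For anyone hitting this before an official fix lands: a possible workaround is to gate the import on the installed pydantic major version, since pydantic v2 still ships the v1 API under the bundled pydantic.v1 namespace (which is exactly what the newer FastChat try/except relies on). This is just a sketch, not the project's fix; the major_version helper is a hypothetical name introduced here for illustration.

```python
import importlib.metadata


def major_version(version: str) -> int:
    # Parse the major component from a version string like "2.6.1".
    return int(version.split(".")[0])


try:
    if major_version(importlib.metadata.version("pydantic")) >= 2:
        # pydantic v2: BaseSettings moved to the pydantic-settings package,
        # but the bundled v1 compatibility namespace still exposes it.
        from pydantic.v1 import BaseSettings
    else:
        # pydantic v1: BaseSettings lives in the top-level package.
        from pydantic import BaseSettings
except importlib.metadata.PackageNotFoundError:
    # pydantic is not installed at all; nothing to import.
    BaseSettings = None
```

Pinning pydantic to <2 is not an option here, because vLLM's requirement (pydantic >= 2.0) would then be unsatisfiable in the same environment.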

thiner commented Feb 21 '24

Any love for this one? It would sure be great to have it running with vLLM...

digitalscream commented Apr 01 '24