
bug: Protocol not available error

[Open] neodawn opened this issue 2 years ago · 2 comments

Describe the bug

On Windows Subsystem for Linux (WSL), I get the following error when running: openllm start opt

(.venv) ak@ak:/mnt/c/Tools/openllm$ openllm start opt
2023-06-27T16:53:32+0530 [WARNING] [cli] 'CUDA_VISIBLE_DEVICES' has no effect when only CPU is available.
2023-06-27T16:53:33+0530 [INFO] [cli] Prometheus metrics for HTTP BentoServer from "_service.py:svc" can be accessed at http://localhost:3000/metrics.
2023-06-27T16:53:34+0530 [ERROR] [cli] Exception in callback <bound method Arbiter.manage_watchers of <circus.arbiter.Arbiter object at 0x7fd0dc3b8040>>
Traceback (most recent call last):
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/tornado/ioloop.py", line 919, in _run
    val = self.callback()
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/circus/util.py", line 1038, in wrapper
    raise ConflictError("arbiter is already running %s command"
circus.exc.ConflictError: arbiter is already running arbiter_start_watchers command
2023-06-27T16:53:35+0530 [ERROR] [cli] Exception in callback <bound method Arbiter.manage_watchers of <circus.arbiter.Arbiter object at 0x7fd0dc3b8040>>
Traceback (most recent call last):
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/tornado/ioloop.py", line 919, in _run
    val = self.callback()
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/circus/util.py", line 1038, in wrapper
    raise ConflictError("arbiter is already running %s command"
circus.exc.ConflictError: arbiter is already running arbiter_start_watchers command
2023-06-27T16:53:35+0530 [INFO] [cli] Starting production HTTP BentoServer from "_service.py:svc" listening on http://0.0.0.0:3000 (Press CTRL+C to quit)
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/bentoml_cli/worker/http_api_server.py", line 189, in <module>
    main()  # pylint: disable=no-value-for-parameter
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/bentoml_cli/worker/http_api_server.py", line 183, in main
    sock = socket.socket(fileno=fd)
  File "/usr/lib/python3.10/socket.py", line 232, in __init__
    _socket.socket.__init__(self, family, type, proto, fileno)
OSError: [Errno 92] Protocol not available
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/bentoml_cli/worker/http_api_server.py", line 189, in <module>
    main()  # pylint: disable=no-value-for-parameter
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/bentoml_cli/worker/http_api_server.py", line 183, in main
    sock = socket.socket(fileno=fd)
  File "/usr/lib/python3.10/socket.py", line 232, in __init__
    _socket.socket.__init__(self, family, type, proto, fileno)
OSError: [Errno 92] Protocol not available
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/bentoml_cli/worker/runner.py", line 151, in <module>
    main()  # pylint: disable=no-value-for-parameter
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/mnt/c/Tools/openllm/.venv/lib/python3.10/site-packages/bentoml_cli/worker/runner.py", line 145, in main
    sock = socket.socket(fileno=fd)
  File "/usr/lib/python3.10/socket.py", line 232, in __init__
    _socket.socket.__init__(self, family, type, proto, fileno)

To reproduce

  1. pip install openllm
  2. openllm start opt
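For context, the failing frame in both tracebacks is `socket.socket(fileno=fd)` inside the bentoml_cli worker entrypoints: the worker rebuilds a socket object from a file descriptor inherited from the parent (circus) process. Below is a minimal sketch of that code path, runnable on a regular Linux kernel; on WSL1 the last call is where Errno 92 is raised, most likely because WSL1's kernel emulation does not implement the getsockopt(SO_PROTOCOL) query CPython uses to auto-detect the socket's protocol. (This is an illustrative reproduction, not BentoML's actual code.)

```python
import socket

# Create a listening socket, then rebuild a Python socket object from its
# raw file descriptor -- the same pattern the bentoml_cli worker uses when
# the parent process hands it an inherited fd.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen()
fd = listener.detach()  # give up ownership of the fd, as a parent process would

# With no explicit family/type/proto, CPython queries the fd itself via
# getsockopt(SO_DOMAIN / SO_TYPE / SO_PROTOCOL). On WSL1 the SO_PROTOCOL
# query is not implemented, so this is the line that raises
# "OSError: [Errno 92] Protocol not available".
rebuilt = socket.socket(fileno=fd)
print(rebuilt.getsockname())
rebuilt.close()
```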

Logs

No response

Environment

Environment variable

BENTOML_DEBUG=''
BENTOML_QUIET=''
BENTOML_BUNDLE_LOCAL_BUILD=''
BENTOML_DO_NOT_TRACK=''
BENTOML_CONFIG=''
BENTOML_CONFIG_OPTIONS=''
BENTOML_PORT=''
BENTOML_HOST=''
BENTOML_API_WORKERS=''

System information

bentoml: 1.0.22
python: 3.10.6
platform: Linux-4.4.0-19041-Microsoft-x86_64-with-glibc2.35
uid_gid: 1000:1000

pip_packages
absl-py==1.4.0
accelerate==0.20.3
aiohttp==3.8.4
aiosignal==1.3.1
anyio==3.7.0
appdirs==1.4.4
asgiref==3.7.2
astunparse==1.6.3
async-timeout==4.0.2
attrs==23.1.0
bentoml==1.0.22
build==0.10.0
cached-property==1.5.2
cachetools==5.3.1
cattrs==23.1.2
certifi==2023.5.7
charset-normalizer==3.1.0
chex==0.1.7
circus==0.18.0
click==8.1.3
click-option-group==0.5.6
cloudpickle==2.2.1
cmake==3.26.4
coloredlogs==15.0.1
contextlib2==21.6.0
datasets==2.13.1
deepmerge==1.1.0
Deprecated==1.2.14
dill==0.3.6
dm-tree==0.1.8
etils==1.3.0
exceptiongroup==1.1.1
filelock==3.12.2
filetype==1.2.0
flatbuffers==23.5.26
flax==0.6.11
frozenlist==1.3.3
fs==2.4.16
fsspec==2023.6.0
gast==0.4.0
google-auth==2.21.0
google-auth-oauthlib==1.0.0
google-pasta==0.2.0
grpcio==1.56.0
grpcio-health-checking==1.48.2
h11==0.14.0
h5py==3.9.0
httpcore==0.17.2
httpx==0.24.1
huggingface-hub==0.15.1
humanfriendly==10.0
idna==3.4
importlib-metadata==6.0.1
importlib-resources==5.12.0
inflection==0.5.1
jax==0.4.13
jaxlib==0.4.13
Jinja2==3.1.2
keras==2.12.0
libclang==16.0.0
lit==16.0.6
Markdown==3.4.3
markdown-it-py==3.0.0
MarkupSafe==2.1.3
mdurl==0.1.2
ml-dtypes==0.2.0
mpmath==1.3.0
msgpack==1.0.5
multidict==6.0.4
multiprocess==0.70.14
nest-asyncio==1.5.6
networkx==3.1
numpy==1.23.5
nvidia-cublas-cu11==11.10.3.66
nvidia-cuda-cupti-cu11==11.7.101
nvidia-cuda-nvrtc-cu11==11.7.99
nvidia-cuda-runtime-cu11==11.7.99
nvidia-cudnn-cu11==8.5.0.96
nvidia-cufft-cu11==10.9.0.58
nvidia-curand-cu11==10.2.10.91
nvidia-cusolver-cu11==11.4.0.1
nvidia-cusparse-cu11==11.7.4.91
nvidia-nccl-cu11==2.14.3
nvidia-nvtx-cu11==11.7.91
oauthlib==3.2.2
openllm==0.1.16
opentelemetry-api==1.17.0
opentelemetry-instrumentation==0.38b0
opentelemetry-instrumentation-aiohttp-client==0.38b0
opentelemetry-instrumentation-asgi==0.38b0
opentelemetry-instrumentation-grpc==0.38b0
opentelemetry-sdk==1.17.0
opentelemetry-semantic-conventions==0.38b0
opentelemetry-util-http==0.38b0
opt-einsum==3.3.0
optax==0.1.5
optimum==1.8.8
orbax-checkpoint==0.2.6
orjson==3.9.1
packaging==23.1
pandas==2.0.2
pathspec==0.11.1
Pillow==9.5.0
pip-requirements-parser==32.0.1
pip-tools==6.13.0
prometheus-client==0.17.0
protobuf==3.20.3
psutil==5.9.5
pyarrow==12.0.1
pyasn1==0.5.0
pyasn1-modules==0.3.0
pydantic==1.10.9
Pygments==2.15.1
pynvml==11.5.0
pyparsing==3.1.0
pyproject_hooks==1.0.0
python-dateutil==2.8.2
python-json-logger==2.0.7
python-multipart==0.0.6
pytz==2023.3
PyYAML==6.0
pyzmq==25.1.0
regex==2023.6.3
requests==2.31.0
requests-oauthlib==1.3.1
rich==13.4.2
rsa==4.9
safetensors==0.3.1
schema==0.7.5
scipy==1.11.0
sentencepiece==0.1.99
simple-di==0.1.5
six==1.16.0
sniffio==1.3.0
starlette==0.28.0
sympy==1.12
tabulate==0.9.0
tensorboard==2.12.3
tensorboard-data-server==0.7.1
tensorflow==2.12.0
tensorflow-estimator==2.12.0
tensorflow-io-gcs-filesystem==0.32.0
tensorstore==0.1.39
termcolor==2.3.0
tokenizers==0.13.3
tomli==2.0.1
toolz==0.12.0
torch==2.0.1
torchvision==0.15.2
tornado==6.3.2
tqdm==4.65.0
transformers==4.30.2
triton==2.0.0
typing_extensions==4.6.3
tzdata==2023.3
urllib3==1.26.16
uvicorn==0.22.0
watchfiles==0.19.0
wcwidth==0.2.6
Werkzeug==2.3.6
wrapt==1.14.1
xxhash==3.2.0
yarl==1.9.2
zipp==3.15.0
  • transformers version: 4.30.2
  • Platform: Linux-4.4.0-19041-Microsoft-x86_64-with-glibc2.35
  • Python version: 3.10.6
  • Huggingface_hub version: 0.15.1
  • Safetensors version: 0.3.1
  • PyTorch version (GPU?): 2.0.1+cu117 (False)
  • Tensorflow version (GPU?): 2.12.0 (False)
  • Flax version (CPU?/GPU?/TPU?): 0.6.11 (cpu)
  • Jax version: 0.4.13
  • JaxLib version: 0.4.13
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No

System information (Optional)

No response

neodawn commented on Jun 27 '23 11:06

Going to transfer this upstream to BentoML since it is a BentoML issue.

aarnphm commented on Sep 06 '23 19:09

Has there been any progress here? I'm experiencing the same issue right now on a brand-new installation, just running BentoML with the Iris example on WSL1.

azachar commented on Sep 14 '23 11:09
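Note that the kernel string in the environment above (4.4.0-19041-Microsoft) indicates WSL1, and both reporters are on WSL1. A commonly suggested workaround for this class of WSL1 socket limitations is converting the distribution to WSL2, which runs a real Linux kernel with these socket options implemented. Assuming a recent Windows build, from PowerShell or cmd:

```shell
# List installed distributions and which WSL version each uses
wsl.exe --list --verbose

# Convert a distribution (name taken from the list above) to WSL2
wsl.exe --set-version <DistroName> 2
```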