
[Doc]: update contributing guide for macOS Apple silicon

davidxia opened this issue 8 months ago · 2 comments

📚 The doc issue

The contributing guide at https://docs.vllm.ai/en/stable/contributing/overview.html doesn't work on macOS with Apple silicon.

The Apple-silicon build-from-source instructions at https://docs.vllm.ai/en/latest/getting_started/installation/cpu.html?device=apple#build-wheel-from-source work, but requirements/dev.txt isn't installable there.

$ pip install -r requirements/dev.txt
Collecting pre-commit==4.0.1 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/lint.txt (line 2))
  Downloading pre_commit-4.0.1-py2.py3-none-any.whl.metadata (1.3 kB)
Collecting absl-py==2.1.0 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 3))
  Downloading absl_py-2.1.0-py3-none-any.whl.metadata (2.3 kB)
Collecting accelerate==1.0.1 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 5))
  Downloading accelerate-1.0.1-py3-none-any.whl.metadata (19 kB)
Collecting aiohappyeyeballs==2.4.3 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 9))
  Downloading aiohappyeyeballs-2.4.3-py3-none-any.whl.metadata (6.1 kB)
Collecting aiohttp==3.10.11 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 11))
  Downloading aiohttp-3.10.11-cp312-cp312-macosx_11_0_arm64.whl.metadata (7.7 kB)
Collecting aiosignal==1.3.1 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 16))
  Using cached aiosignal-1.3.1-py3-none-any.whl.metadata (4.0 kB)
Requirement already satisfied: annotated-types==0.7.0 in ./.venv/lib/python3.12/site-packages (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 20)) (0.7.0)
Collecting anyio==4.6.2.post1 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 22))
  Using cached anyio-4.6.2.post1-py3-none-any.whl.metadata (4.7 kB)
Collecting argcomplete==3.5.1 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 26))
  Downloading argcomplete-3.5.1-py3-none-any.whl.metadata (16 kB)
Collecting arrow==1.3.0 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 28))
  Using cached arrow-1.3.0-py3-none-any.whl.metadata (7.5 kB)
Collecting attrs==24.2.0 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 30))
  Using cached attrs-24.2.0-py3-none-any.whl.metadata (11 kB)
Collecting audioread==3.0.1 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 38))
  Downloading audioread-3.0.1-py3-none-any.whl.metadata (8.4 kB)
Collecting awscli==1.35.23 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 40))
  Downloading awscli-1.35.23-py3-none-any.whl.metadata (11 kB)
Collecting backoff==2.2.1 (from -r /Users/dxia/src/github.com/vllm-project/vllm/requirements/test.txt (line 42))
  Downloading backoff-2.2.1-py3-none-any.whl.metadata (14 kB)
ERROR: Could not find a version that satisfies the requirement bitsandbytes==0.45.3 (from versions: 0.31.8, 0.32.0, 0.32.1, 0.32.2, 0.32.3, 0.33.0, 0.33.1, 0.34.0, 0.35.0, 0.35.1, 0.35.2, 0.35.3, 0.35.4, 0.36.0, 0.36.0.post1, 0.36.0.post2, 0.37.0, 0.37.1, 0.37.2, 0.38.0, 0.38.0.post1, 0.38.0.post2, 0.38.1, 0.39.0, 0.39.1, 0.40.0, 0.40.0.post1, 0.40.0.post2, 0.40.0.post3, 0.40.0.post4, 0.40.1, 0.40.1.post1, 0.40.2, 0.41.0, 0.41.1, 0.41.2, 0.41.2.post1, 0.41.2.post2, 0.41.3, 0.41.3.post1, 0.41.3.post2, 0.42.0)
ERROR: No matching distribution found for bitsandbytes==0.45.3
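One possible stopgap is to install requirements/dev.txt minus the pins that have no Apple-silicon wheels. A minimal sketch only; the filter_requirements helper and its denylist are hypothetical, with the two package names taken from the errors in this thread:

```python
# Hypothetical helper: drop requirement pins known to lack Apple-silicon wheels.
# The denylist is an assumption based on the install/pre-commit errors in this thread.
MAC_INCOMPATIBLE = {"bitsandbytes", "runai-model-streamer"}

def filter_requirements(lines, denylist=MAC_INCOMPATIBLE):
    kept = []
    for line in lines:
        stripped = line.strip()
        if not stripped or stripped.startswith(("#", "-r")):
            kept.append(line)  # keep comments, blank lines, and nested -r includes
            continue
        # Distribution name is everything before any environment marker or version pin.
        name = stripped.split(";")[0].split("==")[0].strip().lower()
        if name not in denylist:
            kept.append(line)
    return kept
```

Writing the filtered list to a temporary file and installing it with pip install -r gives a mostly working dev environment; CI still resolves the full set, so this only helps locally.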

After installing pre-commit manually with pip install pre-commit==4.0.1, running the hooks fails at pip-compile because runai-model-streamer doesn't publish a macOS wheel.

$ pre-commit run --all-files --show-diff-on-failure
[INFO] Initializing environment for https://github.com/google/yapf.
[INFO] Initializing environment for https://github.com/astral-sh/ruff-pre-commit.
[INFO] Initializing environment for https://github.com/codespell-project/codespell.
[INFO] Initializing environment for https://github.com/codespell-project/codespell:tomli.
[INFO] Initializing environment for https://github.com/PyCQA/isort.
[INFO] Initializing environment for https://github.com/pre-commit/mirrors-clang-format.
[INFO] Initializing environment for https://github.com/jackdewinter/pymarkdown.
[INFO] Initializing environment for https://github.com/rhysd/actionlint.
[INFO] Initializing environment for https://github.com/astral-sh/uv-pre-commit.
[INFO] Initializing environment for local:mypy==1.11.1,types-cachetools,types-setuptools,types-PyYAML,types-requests.
[INFO] Installing environment for https://github.com/google/yapf.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/astral-sh/ruff-pre-commit.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/codespell-project/codespell.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/PyCQA/isort.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/pre-commit/mirrors-clang-format.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/jackdewinter/pymarkdown.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/rhysd/actionlint.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/astral-sh/uv-pre-commit.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for local.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for local.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
yapf.....................................................................Passed
ruff.....................................................................Passed
codespell................................................................Passed
isort....................................................................Passed
clang-format.............................................................Passed
PyMarkdown...............................................................Passed
Lint GitHub Actions workflow files.......................................Passed
pip-compile..............................................................Failed
- hook id: pip-compile
- exit code: 1

  × No solution found when resolving dependencies:
  ╰─▶ Because runai-model-streamer==0.11.0 has no wheels with a
      matching platform tag (e.g., `macosx_15_0_arm64`) and you require
      runai-model-streamer==0.11.0, we can conclude that your requirements
      are unsatisfiable.

      hint: Wheels are available for `runai-model-streamer` (v0.11.0) on the
      following platform: `manylinux2014_x86_64`

Run mypy for local Python installation...................................Passed
Lint shell scripts.......................................................Passed
Lint PNG exports from excalidraw.........................................Passed
Check SPDX headers.......................................................Passed
Check for spaces in all filenames........................................Passed
Update Dockerfile dependency graph.......................................Passed
Suggestion...............................................................Passed
- hook id: suggestion
- duration: 0.01s

To bypass pre-commit hooks, add --no-verify to git commit.

Suggest a potential alternative/fix

No response

Before submitting a new issue...

  • [x] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.

davidxia · Apr 21 '25

I hit the same problem; the pip-compile hook always fails...

reidliu41 · Apr 22 '25

workaround for pre-commit

I tried to build runai-model-streamer from source on my Apple silicon MacBook with USE_BAZEL_VERSION=4.2.1 make build, but it was not straightforward (notes below). Skipping the hook instead with SKIP=pip-compile pre-commit run --all-files --show-diff-on-failure succeeds.

notes on trying to build runai-model-streamer from source on my Apple silicon MacBook

Docs here. First I ran brew install bazelisk. USE_BAZEL_VERSION is documented here and matches runai-model-streamer's Bazel version.

Got the error /private/tmp/runai-model-streamer/cpp/utils/logging/BUILD:3:22: Compiling utils/logging/logging.cc failed: (Exit 1): cc_wrapper.sh failed and gave up. (As the full log shows, the x86_64 target invokes /usr/bin/x86_64-linux-gnu-gcc, a Linux cross-compiler that doesn't exist on macOS.)

full error
$ USE_BAZEL_VERSION=4.2.1 make build
make -C py clean && \
        make build_x86_64 && \
        make build_aarch64
make -C runai_model_streamer clean
rm -rf build/ dist/ runai_model_streamer.egg-info/
make -C runai_model_streamer_s3 clean
rm -rf build/ dist/ runai_model_streamer_s3.egg-info/
make -C cpp clean && \
        make -C cpp build ARCH=x86_64 && \
        make -C py build ARCH=x86_64
bazel clean
INFO: Starting clean.
bazel build streamer:libstreamer.so \
                "--crosstool_top=@x86_64//:toolchain" && \
        bazel build s3:libstreamers3.so \
                --define USE_SYSTEM_LIBS= \
                --define BASE_PATH=x86_64 \
                "--crosstool_top=@x86_64//:toolchain"
INFO: Analyzed target //streamer:libstreamer.so (48 packages loaded, 155 targets configured).
INFO: Found 1 target...
ERROR: /private/tmp/runai-model-streamer/cpp/utils/logging/BUILD:3:22: Compiling utils/logging/logging.cc failed: (Exit 1): cc_wrapper.sh failed: error executing command external/x86_64/cc_wrapper.sh -U_FORTIFY_SOURCE -fstack-protector -Wall -Wthread-safety -Wself-assign -Wunused-but-set-parameter -Wno-free-nonheap-object -fcolor-diagnostics -fno-omit-frame-pointer ... (remaining 22 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
external/x86_64/cc_wrapper.sh: line 69: /usr/bin/x86_64-linux-gnu-gcc: No such file or directory
Target //streamer:libstreamer.so failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 0.285s, Critical Path: 0.08s
INFO: 23 processes: 23 internal.
FAILED: Build did NOT complete successfully
make[2]: *** [build] Error 1
make[1]: *** [build_x86_64] Error 2
make: *** [build] Error 2

notes on pip install -r requirements/dev.txt

bitsandbytes==0.45.3 only supports Linux and Windows. The last version that might support macOS is 0.42.0. Related issues:

  • https://github.com/bitsandbytes-foundation/bitsandbytes/issues/1020
  • https://github.com/bitsandbytes-foundation/bitsandbytes/issues/252
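If the requirements files themselves were to be fixed, PEP 508 environment markers are one option: keep the newer pin on Linux and fall back to the last macOS-capable release elsewhere. This is a sketch only, not vLLM's actual requirements layout:

```
# hypothetical platform-conditional pins, not the current requirements/test.txt
bitsandbytes==0.45.3; sys_platform == 'linux'
bitsandbytes==0.42.0; sys_platform == 'darwin'
runai-model-streamer==0.11.0; sys_platform == 'linux'
```

The pip-compile hook would likely still need a per-platform (or universal) resolution for this to lock cleanly, which is part of why that hook fails on macOS today.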

davidxia · Apr 30 '25

+1 same problem.


zhengkezhou1 · May 07 '25

Skipping the hook with SKIP=pip-compile pre-commit run --all-files --show-diff-on-failure succeeds.

Great work @davidxia!

I don't fully understand: skipping the hook succeeds, but then you "gave up"? Were you able to build runai-model-streamer with bazel 4.2.1? Could you open an issue or pull request?

bitsandbytes has been working on Apple support for more than 2 years now; I wouldn't hold my breath waiting for it 😆

reneleonhardt · Jun 12 '25

Great work @davidxia!

I don't fully understand, "Skipping the hook succeeds" but then you "gave up".

Thanks. I gave up trying to run all the pre-commit hooks. As long as one isn't changing any Python dependencies while developing on a Mac, though, it should be safe to skip the pip-compile hook.

Were you able to build runai-model-streamer with bazel 4.2.1?

No

Could you open an issue or pull request?

For the runai-model-streamer issue?

davidxia · Jun 12 '25

Were you able to build runai-model-streamer with bazel 4.2.1?

No

Thanks, I feared as much 😞

Could you open an issue or pull request?

For the runai-model-streamer issue?

Yes, maybe they want to fix their Apple builds 🤞

reneleonhardt · Jun 12 '25

An attempt to make life easier: https://github.com/vllm-project/vllm/pull/24177

panpan0000 · Sep 03 '25