
File Limit Request: vllm - 400 MiB

Open youkaichao opened this issue 1 year ago • 6 comments

Project URL

https://pypi.org/project/vllm/

Does this project already exist?

  • [X] Yes

New Limit

400

Update issue title

  • [X] I have updated the title.

Which indexes

PyPI

About the project

vLLM is a fast and easy-to-use library for LLM inference and serving.

We plan to ship nvidia-nccl-cu12==2.18.3 within the package.

Reasons for the request

We identified a bug in nccl>=2.19 that significantly increases GPU memory overhead, so we have to pin and ship an nccl version ourselves.

We cannot simply pip install nvidia-nccl-cu12==2.18.3 because we depend on torch, which has a binary dependency on nvidia-nccl-cu12==2.19.5. So we are in dependency hell, and we have to bundle an nccl library ourselves.
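To make the bundling concrete: one way to force a pinned NCCL (a minimal sketch, not vLLM's actual implementation; the path and layout below are hypothetical) is to load the bundled shared library with ctypes before torch initializes its own copy:

```python
# Sketch: prefer a pinned, bundled libnccl over the one torch would load.
# The bundled path is hypothetical, not vLLM's actual layout.
import ctypes
import os

_BUNDLED_NCCL = os.path.join(os.path.dirname(__file__), "lib", "libnccl.so.2.18.3")

# RTLD_GLOBAL makes the pinned symbols visible to later loads,
# so NCCL calls resolve against 2.18.3 instead of 2.19.x.
nccl = ctypes.CDLL(_BUNDLED_NCCL, mode=ctypes.RTLD_GLOBAL)

# Sanity check: ncclGetVersion reports the version actually loaded.
version = ctypes.c_int()
nccl.ncclGetVersion(ctypes.byref(version))
print(version.value)  # 2.18.3 encodes as 21803
```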

vllm is a popular library for LLM inference, used by many tech companies. Shipping nccl with vllm improves its throughput and the quality of LLM serving. The downside is that the package wheel becomes much larger, so we have come here for support, to ask for a larger file size limit.

Code of Conduct

  • [X] I agree to follow the PSF Code of Conduct

youkaichao avatar Mar 26 '24 05:03 youkaichao

bump up 👀

youkaichao avatar Mar 29 '24 22:03 youkaichao

bump up 👀

youkaichao avatar Apr 03 '24 18:04 youkaichao

+1, it would be great to have this!

mgoin avatar Apr 23 '24 19:04 mgoin

From README.md:

Large (more than 200 MiB) upload limits are generally granted for the following reasons:

  • project contains large compiled binaries to maintain platform/architecture/GPU support

Project maintainers are having to limit or cut architecture/GPU/format support in order to fit under 100 MiB: vllm-project/vllm#4290, vllm-project/vllm#4304

agt avatar Apr 23 '24 23:04 agt

Kindly cc @cmaureir for visibility. vLLM is the most popular open-source LLM serving engine in the world right now. A larger package limit would help us support more types of hardware and help democratize LLMs for the vast majority of developers.

zhuohan123 avatar Apr 24 '24 07:04 zhuohan123

Hi @cmaureir, I'm also a maintainer of vLLM. We do make our best effort to keep the binary size small, but it's increasingly difficult to meet the current limit since vLLM is rapidly growing with new features and optimizations that require new GPU kernels (binaries). Increasing the limit would be very helpful for the development of vLLM.

WoosukKwon avatar Apr 24 '24 07:04 WoosukKwon

Hello @youkaichao 👋 I have set the new upload limit for vllm to 400 MiB, mainly to unblock your release process, but I'm making a note that your project will very probably reach the overall project size limit soon because it bundles an additional package. This is neither encouraged nor recommended.

Additionally, I see you ship one wheel per Python version, which heavily inflates the total release size. I recommend you look into the Python Limited API in order to provide one wheel per platform: https://docs.python.org/3/c-api/stable.html

Have a nice rest of the week 🚀

cmaureir avatar May 08 '24 07:05 cmaureir

@cmaureir thanks for your support! We will try to see if we can build just one wheel for all python versions.

youkaichao avatar May 08 '24 18:05 youkaichao

@cmaureir is it possible to build one wheel for all supported Python versions when we have extensions? I find that the wheel name always contains the Python version, and I'm not sure how to build a Python-version-agnostic wheel.

youkaichao avatar May 08 '24 19:05 youkaichao

I did a quick investigation:

To use the Python Limited API in order to provide one wheel per platform:

  1. add flags to wheel building: python3 setup.py bdist_wheel --dist-dir=dist --py-limited-api=cp38
  2. add a macro during compilation: #define Py_LIMITED_API 0x03080000 (or set it via extension arguments, cf. https://stackoverflow.com/a/69073115/9191338 ); see the sketch below
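
For illustration, a minimal setup.py sketch combining both steps (package and module names are hypothetical, and this assumes a plain C-API extension, not vLLM's actual build):

```python
# setup.py -- minimal sketch of a Limited-API (abi3) extension build.
from setuptools import setup, Extension

ext = Extension(
    "mypkg._ext",                 # hypothetical extension module
    sources=["src/ext.c"],
    # Target the CPython 3.8 stable ABI; pairs with --py-limited-api=cp38
    define_macros=[("Py_LIMITED_API", "0x03080000")],
    py_limited_api=True,          # emit an .abi3.so suffix
)

setup(
    name="mypkg",                 # hypothetical package name
    version="0.1.0",
    ext_modules=[ext],
)
```

Built with python3 setup.py bdist_wheel --py-limited-api=cp38, this should produce a wheel tagged cp38-abi3 that installs on any later CPython.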

I tried; however, since we use pybind11, which does not support the Python Limited API (cf. https://github.com/pybind/pybind11/issues/1755 ), we still have to build one wheel for each Python version.

Sorry for the trouble :(

youkaichao avatar May 08 '24 20:05 youkaichao

Hi @cmaureir, I would like to inquire about the current total storage usage of the vLLM packages, and whether we can raise the 10 GB project limit. We have made quite some progress over the last few months, and we are finally releasing Python-version-agnostic wheels.
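(For reference, such wheels carry the abi3 tag in their filenames, e.g. a hypothetical vllm-0.5.0-cp38-abi3-manylinux1_x86_64.whl, so a single build installs on every CPython version from 3.8 up.)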

simon-mo avatar Jul 23 '24 07:07 simon-mo