
File Limit Request: llama-cpp-cffi - 200 MB

Open mtasic85 opened this issue 1 year ago • 1 comments

Project URL

https://github.com/tangledgroup/llama-cpp-cffi

Does this project already exist?

  • [X] Yes

New Limit

200 MB

Update issue title

  • [X] I have updated the title.

Which indexes

PyPI, TestPyPI

About the project

It has been active for about a month.

We compile llama.cpp with precompiled/linked CUDA support for multiple architectures (compute_61, compute_70, compute_75, compute_80, compute_86, compute_89, compute_90). The idea is that when you run `pip install llama-cpp-cffi`, it just works, without needing C/CUDA compilers to build the wheel.
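To illustrate why supporting all of the listed compute capabilities inflates the binaries, here is a hypothetical sketch (not from the project's actual build scripts) that generates the kind of nvcc `-gencode` flags a fat binary covering these architectures would use — each extra target embeds another copy of the device code:

```python
# Hypothetical sketch: generate nvcc -gencode flags for the CUDA compute
# capabilities named in this request (compute_61 .. compute_90).
# Each capability adds both PTX (compute_XX) and SASS (sm_XX) targets
# to the fat binary, which is what makes the wheel grow per architecture.
CAPS = ["61", "70", "75", "80", "86", "89", "90"]

def gencode_flags(caps):
    return " ".join(
        f"-gencode arch=compute_{c},code=[compute_{c},sm_{c}]" for c in caps
    )

print(gencode_flags(CAPS))
```

Adding a new architecture (reason 1 below) is then just appending one entry to the list, at the cost of another slice of embedded device code.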

We use upx to further shrink the shared/dynamic libraries, but they still end up just above 100 MB.
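For context, the upx step mentioned above would look roughly like this (upx and its flags are real; the library path is an assumed example, not the project's actual layout):

```shell
# Hypothetical post-build step: compress the CUDA-linked shared library
# with UPX before it is packaged into the wheel.
upx --best --lzma build/lib/llama_cpp_cffi/libllama.so
```

Even with maximum compression, CUDA fat binaries compress poorly enough that the result stays above the default 100 MB PyPI limit.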

We do not distribute example data.

Reasons for the request

If we need to increase the size limit again in the future, it will likely be for one of the following reasons:

  1. A new CUDA architecture is released, so we need to recompile and include its code.
  2. A new llama.cpp backend is added (we do not expect this to happen soon, since CUDA is the major backend).

Code of Conduct

  • [X] I agree to follow the PSF Code of Conduct

mtasic85 avatar Jul 22 '24 08:07 mtasic85

Hi, please take a look at our ticket. On PyPI we are two releases behind the GitHub releases. Thanks. We expect to cover multiple CUDA versions in the next few releases, so I would kindly request the 200 MB limit :smile:

mtasic85 avatar Jul 25 '24 13:07 mtasic85

Hello @mtasic85 :wave: I've set the upload limit for llama-cpp-cffi to 200 MB on PyPI, but not on TestPyPI, because the project doesn't exist there. Have a nice week :rocket:

cmaureir avatar Aug 12 '24 13:08 cmaureir