File Limit Request: llama-cpp-cffi - 200 MB
Project URL
https://github.com/tangledgroup/llama-cpp-cffi
Does this project already exist?
- [X] Yes
New Limit
200 MB
Update issue title
- [X] I have updated the title.
Which indexes
PyPI, TestPyPI
About the project
The project has been active for about a month.
We compile llama.cpp with CUDA support precompiled and linked for multiple architectures (compute_61, compute_70, compute_75, compute_80, compute_86, compute_89, compute_90). The idea is that `pip install llama-cpp-cffi` just works, with no C/CUDA compilers needed to build the wheel.
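For context, here is a minimal sketch of what such a multi-architecture build step looks like. It uses standard nvcc `-gencode` flags; the source and library file names are illustrative, not the project's actual build tooling:

```python
import subprocess

# Compute capabilities listed in the request above.
ARCHS = ["61", "70", "75", "80", "86", "89", "90"]

# One -gencode pair per architecture embeds machine code for each GPU
# generation into a single "fat" binary, so end users never need nvcc.
gencode = []
for a in ARCHS:
    gencode += ["-gencode", f"arch=compute_{a},code=sm_{a}"]

subprocess.run(
    ["nvcc", *gencode, "-shared", "-Xcompiler", "-fPIC",
     "-o", "libllama.so", "ggml-cuda.cu"],  # hypothetical file names
    check=True,
)
```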
We use upx to further shrink the shared/dynamic libraries, but they still end up just above 100 MB.
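As an illustration, the compression step amounts to something like the following (`--best` is a standard upx flag; the library name is hypothetical):

```python
import subprocess

# upx compresses the shared library in place; --best selects the highest
# compression level. Even so, the CUDA fat binary stays above 100 MB.
subprocess.run(["upx", "--best", "libllama.so"], check=True)
```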
We do not distribute example data.
Reasons for the request
If we need to increase the limit again in the future, it will likely be for one of the following reasons:
- A new CUDA architecture, which we would need to recompile for and include.
- A new llama.cpp backend (we do not expect this soon, since CUDA is the major backend).
Code of Conduct
- [X] I agree to follow the PSF Code of Conduct
Hi, please take a look at our ticket. On PyPI we are two releases behind our GitHub releases. We expect to cover multiple CUDA releases over the next few versions, so I would kindly request the 200 MB limit. Thanks! :smile:
Hello @mtasic85 :wave:
I've set the upload limit for llama-cpp-cffi to 200 MB on PyPI, but not on TestPyPI, because the project doesn't exist there.
Have a nice week :rocket: