Added installation docs.
Motivation
Added a section on installing the project using uv.
@zhaochenyang20 Can you please take a look at this?
In the meantime, I think that according to our current docs:
pip install --upgrade pip
pip install sgl-kernel --force-reinstall --no-deps
pip install "sglang[all]>=0.4.3.post2" --find-links https://flashinfer.ai/whl/cu124/torch2.5/flashinfer-python
is enough. Why should we use these:
pip install --upgrade pip
pip install setuptools
pip install sgl-kernel
pip install torch==2.5.1
pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.5/
pip install "sglang[all]"
pip install transformers==4.48.3
Looks too strange.
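One hedged way to sanity-check this (assuming the two-command install above was run in a fresh environment) is to confirm that torch, transformers, and setuptools were already pulled in transitively, in which case the extra explicit installs are redundant:
# If sglang[all] already resolves these, installing them by hand adds nothing.
pip show torch transformers setuptools
python -c "import torch, transformers; print(torch.__version__, transformers.__version__)"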
Also, we do not need to tell the user how to use uv to create a virtual environment. This alone is enough:
We recommend using uv to install the dependencies for faster installation:
pip install --upgrade pip
pip install uv
uv pip install sgl-kernel --force-reinstall --no-deps
uv pip install "sglang[all]>=0.4.3.post2" --find-links https://flashinfer.ai/whl/cu124/torch2.5/flashinfer-python
I think pip install transformers==4.48.3 is required; yesterday I installed transformers 4.49 and there were some problems with that version.
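For reference, a minimal way to pin and then verify the version (a sketch, assuming a working pip/Python environment):
# Pin the known-good version and confirm what actually ends up installed.
pip install "transformers==4.48.3"
python -c "import transformers; print(transformers.__version__)"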
@simveit Hey Simon, I made it rather concise. Could you use this version? And keep content like:
Note: SGLang currently uses torch 2.5, so you need to install the flashinfer version for torch 2.5. If you want to install flashinfer separately, please refer to the FlashInfer installation doc. Please note that the package currently used by FlashInfer is named flashinfer-python, not flashinfer.
If you experience an error like OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root, please try either of the following solutions:
Use export CUDA_HOME=/usr/local/cuda- to set the CUDA_HOME environment variable.
Follow the procedure described in the FlashInfer installation doc first, then install SGLang as described above.
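For readers hitting either issue, a quick environment check along these lines may help (a minimal sketch; /usr/local/cuda is only an illustrative default, adjust it to the actual CUDA install root):
# Show the installed torch build so the matching FlashInfer wheel index can be chosen.
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# Show whether CUDA_HOME is already set; if this prints nothing, export it before installing.
echo "$CUDA_HOME"
export CUDA_HOME=/usr/local/cuda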
I can put this warning back in. The reason for the multiple libraries and the deviation from the current doc is that I hit an error saying setuptools was not installed, as well as the CUDA_HOME error, when I did not install as in this PR. If we put the warning back in, we can leave the installation of flashinfer out and refer the reader to the FlashInfer docs.
@zhaochenyang20 I included your simpler way and it worked without the error I encountered before. The only thing I added was the transformers version we currently need.
I will combine this PR into a larger one:
https://github.com/sgl-project/sglang/pull/3601
and give you credit, @simveit.