[Bug] FlashInfer latest main wheel issue
Thank you @xslingcn @ur4t @yzh119 for the recent updates to the main branch. I've noticed some positive developments:

- Enhanced Python version compatibility, extending support back to Python 3.8+
- Successful integration of FA3 support with JIT kernels
- An updated packaging methodology
While these changes are individually promising, there are a few items we might want to address before proceeding with the v0.2 release:

- Cross-version compatibility: we've observed issues with wheels compiled against Python 3.11 when they are used in Python 3.10 environments (a quick way to check which wheel tags an interpreter accepts is sketched below). Could we explore ways to ensure seamless cross-version compatibility? https://github.com/flashinfer-ai/flashinfer/pull/662#issuecomment-2545885047
- Compilation setup: we've received feedback from downstream projects such as SGLang that some users don't have compilation tools like ninja available. Would it be worth making AOT the default to keep the experience user-friendly? https://github.com/sgl-project/sglang/pull/2490#issuecomment-2545868818
- Nightly builds: while the new packaging method shows promise, the nightly build system for the latest main branch needs attention so that continuous delivery remains smooth.

For reference, the last available nightly build is 0.1.6+6819a0f: https://github.com/flashinfer-ai/flashinfer-nightly/releases/tag/0.1.6%2B6819a0f. Would you like to discuss any of these points in more detail? I'm happy to collaborate on finding solutions that work well for everyone.
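As a side note on the first point, here is a minimal sketch (using the third-party `packaging` library, not FlashInfer code) that lists which wheel tags the running interpreter accepts; it illustrates why a cp311 wheel cannot be installed on Python 3.10 unless it is built against the stable ABI (abi3).

```python
# Minimal sketch: list the wheel tags this interpreter accepts, to show why a
# cp311-tagged wheel is rejected on Python 3.10 while a cp310-abi3 wheel is not.
# Requires the third-party `packaging` package (pip install packaging).
from packaging import tags

accepted = {str(t) for t in tags.sys_tags()}

# On CPython 3.10, "cp310-cp310-..." and "cp310-abi3-..." tags are accepted,
# but "cp311-cp311-..." (a wheel built against Python 3.11) is not.
for prefix in ("cp311-cp311", "cp310-cp310", "cp310-abi3"):
    hit = any(t.startswith(prefix) for t in accepted)
    print(f"{prefix}: {'accepted' if hit else 'rejected'}")
```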
> Regarding the compilation setup, we've received feedback from downstream projects like SGLang that some users may not have access to compilation tools like ninja. Would it be worth considering AOT as the default approach to make the experience more user-friendly?

This is fixed in #659, which requires Ninja when installing in JIT mode.

> We've observed some compatibility issues - specifically regarding wheels compiled with Python 3.11 running in Python 3.10 environments. Perhaps we could explore ways to ensure seamless cross-version compatibility?

If I understand correctly, cross-version compatibility relies on setting some C macros (ref: https://docs.python.org/3/c-api/stable.html#c.Py_LIMITED_API), which means it would also require that Torch has already adopted the limited API?
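For illustration, here is a minimal setuptools sketch of how an extension is typically built against the limited API to get an abi3 wheel. This is a hypothetical setup.py with made-up names, not FlashInfer's actual build, and as noted above it only helps if everything the extension calls (including anything pulled in through Torch's headers) stays within the limited API.

```python
# Hypothetical setup.py sketch (not FlashInfer's real build script): compile the
# extension against CPython's limited API so a single wheel, tagged cp310-abi3,
# can be installed on Python 3.10 and newer.
from setuptools import Extension, setup

ext = Extension(
    "mypkg._core",                                     # made-up module name
    sources=["mypkg/_core.cpp"],
    define_macros=[("Py_LIMITED_API", "0x030A0000")],  # restrict to the 3.10 stable ABI
    py_limited_api=True,                               # artifact is named *.abi3.so
)

setup(
    name="mypkg",
    version="0.0.1",
    ext_modules=[ext],
)

# The wheel itself also needs the abi3 tag, e.g. via setup.cfg:
#   [bdist_wheel]
#   py_limited_api = cp310
```

As far as I know, pybind11-based Torch extensions do not currently build under Py_LIMITED_API, so whether this route is viable here is exactly the open question above.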
> This is fixed in https://github.com/flashinfer-ai/flashinfer/pull/659, which requires Ninja when installing in JIT mode.

Yeah, I see. The current concern is that this has become more troublesome for users, who now need to ensure their environment has nvcc and ninja. I think this is not difficult to support; I can make it a dependency of SGLang. This discussion is about how we can keep things user-friendly going forward.

ref https://github.com/flashinfer-ai/flashinfer/pull/476
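On the nvcc/ninja point, here is a rough sketch of the kind of preflight check a downstream package could run before relying on the JIT path; the helper name is made up and this is not existing SGLang or FlashInfer code.

```python
# Hypothetical preflight check a downstream project (e.g. SGLang) could run before
# using FlashInfer's JIT path, so users see a clear message instead of a confusing
# compile failure. Not existing FlashInfer/SGLang code.
import shutil


def check_jit_prerequisites() -> None:
    missing = [tool for tool in ("nvcc", "ninja") if shutil.which(tool) is None]
    if missing:
        raise RuntimeError(
            "FlashInfer JIT mode needs the following tools on PATH: "
            + ", ".join(missing)
            + ". Install them (e.g. `pip install ninja` and a CUDA toolkit), "
            "or use an AOT-compiled wheel instead."
        )


if __name__ == "__main__":
    check_jit_prerequisites()
    print("nvcc and ninja found; JIT compilation should be possible.")
```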
For now, point 3 has been temporarily rolled back and can be enhanced at a later stage. Let's focus on addressing points 1 and 2 first. ref https://github.com/flashinfer-ai/flashinfer-nightly/releases/tag/0.2.0%2B2bc3214 https://github.com/sgl-project/sglang/actions/runs/12374021683
@zhyncs @yzh119 are there any remaining open issues here? Or can we close?