Jee Jee Li
Thank you for your contribution, will look at this ASAP
@alex-jw-brooks Please sync with the main branch to check whether that fixes the CI failure
This may be a similar issue to https://github.com/vllm-project/vllm/issues/10656, cc @mgoin
Could you try https://github.com/vllm-project/vllm/pull/17435? Please rebuild vLLM from source
Thank you for your contribution, I will look at this PR asap.
Ok, I'm changing it to WIP status now.
IMHO, this implementation is a bit hacky.
> @jeejeelee I think the way the code is structured and us not being able to enforce the `AWS` dependencies for different builds, neuron, tpu etc. This was the best...
`modelscope` is not a default dependency of the package, similar to bitsandbytes. It might be better to handle it the way BNB does: https://github.com/vllm-project/vllm/blob/main/vllm/model_executor/layers/quantization/bitsandbytes.py#L158