
If LoRAX is based on Punica kernels, will it be able to support LoRA adapters for Mistral NeMo 12B?

Open tensimixt opened this issue 1 year ago • 2 comments

Feature request

If LoRAX is based on Punica kernels, will it be able to support LoRA adapters for Mistral NeMo 12B, which has a vocab size > 130k? Currently vLLM, for example, doesn't support vocab_size > 128512 when enable_lora=True.
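For reference, a minimal sketch of how the limit shows up in vLLM (assuming a vLLM version from around the time of this issue; the model ID is the public Mistral NeMo instruct checkpoint):

```python
# Minimal sketch (assumed vLLM API from mid-2024): initializing the engine
# with LoRA enabled on a model whose vocab exceeds vLLM's LoRA ceiling.
from vllm import LLM

# Mistral NeMo 12B uses a ~131k-token vocabulary, above the 128512 maximum
# that vLLM's LoRA path accepted at the time, so this is expected to raise
# a ValueError during engine initialization.
llm = LLM(
    model="mistralai/Mistral-Nemo-Instruct-2407",
    enable_lora=True,
)
```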

If Hugging Face's TGI and LoRAX are also based on Punica kernels, I think they will have the same limitation. Or does this limitation not exist for TGI and LoRAX?

Thank you!

Motivation

Be able to run inference with Mistral NeMo + a LoRA adapter (in a multi-LoRA world).

Your contribution

I checked various deployment providers and found the limitation.

tensimixt avatar Jul 20 '24 12:07 tensimixt

Did you figure out whether Mistral NeMo 12B works with LoRA adapters on LoRAX? It still does not work with vLLM or Aphrodite, and I am looking for alternatives.
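If LoRAX does load the model, a quick way to test an adapter is a probe against a running LoRAX server. A minimal sketch, assuming LoRAX's documented /generate endpoint on the default port 8080 (the adapter ID is a placeholder):

```python
# Minimal sketch: query a running LoRAX server with a per-request LoRA
# adapter via the documented LoRAX REST API (default port 8080 assumed).
import requests

resp = requests.post(
    "http://localhost:8080/generate",
    json={
        "inputs": "What is LoRA?",
        "parameters": {
            # Placeholder adapter ID; substitute a real Mistral NeMo adapter.
            "adapter_id": "my-org/mistral-nemo-12b-lora",
            "max_new_tokens": 64,
        },
    },
)
print(resp.json())
```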

Nero10578 avatar Aug 20 '24 20:08 Nero10578

Did you find any alternatives to vLLM? I still struggle with this problem of serving Mistral NeMo with LoRA.

preduct0r avatar Oct 31 '24 21:10 preduct0r