MODEL_ID propagation fix
Hi,
This PR fixes the bug seen here:

    File "/opt/conda/lib/python3.10/site-packages/text_generation_server/models/flash_causal_lm.py", line 1160, in warmup
        f"tunableop_{MODEL_ID.replace('/', '-')}_tp{self.world_size}_rank{self.rank}.csv",

MODEL_ID is not properly propagated to this code path, so the warmup step fails when it builds the TunableOp CSV filename.
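For context, the failure pattern is a module-level value that is read in flash_causal_lm.py but never assigned on this code path before warmup runs. Below is a minimal sketch of that pattern and one way the value can be propagated; the function names (set_model_id, tunableop_filename) are hypothetical and for illustration only, not the actual patch.

```python
# Sketch only: a module-level MODEL_ID that must be set at startup
# before the warmup path reads it. Names are hypothetical.

MODEL_ID: str | None = None


def set_model_id(model_id: str) -> None:
    """Record the served model id once so downstream code can read it."""
    global MODEL_ID
    MODEL_ID = model_id


def tunableop_filename(world_size: int, rank: int) -> str:
    """Build the TunableOp CSV name the warmup code expects (sketch only)."""
    if MODEL_ID is None:
        raise RuntimeError("MODEL_ID was never propagated before warmup")
    return f"tunableop_{MODEL_ID.replace('/', '-')}_tp{world_size}_rank{rank}.csv"


set_model_id("org/model")
print(tunableop_filename(world_size=2, rank=0))
# tunableop_org-model_tp2_rank0.csv
```

Whatever shape the real fix takes, the key point is that the model id is assigned before warmup builds the filename.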
Please review this @OlivierDehaene @Narsil
Regards, Seungrok