abhishek thakur

Results: 116 comments by abhishek thakur

it should now be fixed. please let me know otherwise

this is a major oversight on my part. apologies. fixed. the fix is being deployed. please factory rebuild and try after ~25 mins.

it works out of the box. did you manually change the accelerate config?

i think there's been a misunderstanding.

> could you kindly let me know how exactly do I do that? This is not written anywhere in the documentation for autotrain-advanced, and...

could you run `accelerate config`, answer the questions, and then run the autotrain command to see if that fixes your issue?
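
for reference, a minimal sketch of that flow (the autotrain invocation is a placeholder; use whatever command you were already running):

```
# run accelerate's interactive configurator once; answers are saved under
# ~/.cache/huggingface/accelerate/ and picked up on subsequent launches
accelerate config

# then re-run your usual autotrain command unchanged, e.g.
autotrain llm --train --model <base-model> --data-path <dataset>
```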

```
TLDR:
can run X block size on one GPU
OOM when running X + 2k on one GPU
OOM when running X + 2k on two GPUs
conclusion: autotrain...
```

also adding, i'm able to finetune the mixtral 8x7b model on 8xA100 using autotrain, which would never be possible without using multiple gpus :)
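
for context, a rough sketch of what that looks like at the accelerate level (the script name is hypothetical; autotrain handles this wiring internally):

```
# shard one training job across all 8 GPUs on the machine
accelerate launch --multi_gpu --num_processes 8 train.py
```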

please be patient @jackshiwl. many times, an immediate response is not possible :) if your sentences are small and you are using a large max len, it means there will be...
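
presumably the truncated point concerns padding: with a large max length and short sentences, most of each sequence ends up as padding tokens. a quick way to check this yourself (the model name is just an example):

```
# count how much of a 512-token budget a short sentence leaves as padding
python -c "
from transformers import AutoTokenizer
tok = AutoTokenizer.from_pretrained('bert-base-uncased')
ids = tok('a short sentence', padding='max_length', max_length=512)['input_ids']
print(sum(i == tok.pad_token_id for i in ids), 'of 512 tokens are padding')
"
```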

fixed. the fix is being deployed. please factory rebuild and try after ~25 mins.

it's a non-issue. @Filarh please keep your opinion to yourself. if you have a lot of time, why don't you make a pr to fix this instead?