Daniel Han
@luke-lombardi Yes, this is related to https://github.com/unslothai/unsloth/issues/501 and https://github.com/unslothai/unsloth/issues/504
Oh no, that's usually not a good sign - it typically means the Triton installation broke
Oh interesting, someone did ask about this issue - working on a fix
OHH! @erwe324 Thanks for that!! Oh wait maybe I should actually `try except` this lol
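Something like this, as a rough sketch (not the actual Unsloth code; `HAS_BF16` is just a placeholder name):

```python
import torch

# Rough sketch: wrap the capability probe in a try/except so a broken
# Triton / CUDA install surfaces a readable warning instead of crashing
# at import time. HAS_BF16 is a placeholder, not Unsloth's real flag.
try:
    HAS_BF16 = torch.cuda.is_bf16_supported()
except Exception as e:
    HAS_BF16 = False
    print(f"Could not query bfloat16 support, falling back to float16: {e}")
```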
Yes, sadly I can reproduce :( PyTorch 2.3 broke bfloat16 detection on T4 GPUs
I'm working on a quick fix
For now:

```python
%%capture
# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip install --no-deps "torch
```
I'm currently updating people via Twitter: https://x.com/danielhanchen/status/1792982364894929083 In terms of the issue, it seems like PyTorch 2.3 is saying T4s can support bfloat16, but they actually cannot.
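Roughly what's going on (a minimal sketch, not the actual fix; `supports_bf16` is a placeholder): `torch.cuda.is_bf16_supported()` can return True on a T4 under PyTorch 2.3, but bfloat16 needs compute capability 8.0+ (Ampere), and the T4 is only 7.5 (Turing). Checking the compute capability directly sidesteps the bad report:

```python
import torch

# T4 is Turing (compute capability 7.5) and has no bfloat16 support;
# bf16 needs Ampere (8.0) or newer, e.g. A100 / L4 / H100.
major, minor = torch.cuda.get_device_capability()
supports_bf16 = major >= 8

dtype = torch.bfloat16 if supports_bf16 else torch.float16
print(f"Compute capability {major}.{minor} -> using {dtype} for training")
```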
@raj-chinagundi @DrewThomasson Fixed! Please change all install instructions at the top to:

```python
%%capture
# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip...
```
Hmm, it's possible the ONNX parameters are not optimal, hence the slowdown