ColossalAI
Error when trying to use Colossal-LLaMA-2
Discussed in https://github.com/hpcaitech/ColossalAI/discussions/5393
Originally posted by zhpacer February 20, 2024
CUDA version: Build cuda_11.8.r11.8/compiler.31833905_0
I am following the instructions at https://github.com/hpcaitech/ColossalAI/tree/main/applications/Colossal-LLaMA-2
Error (traceback truncated):
utils/flash_attention_patch.py", line 22, in