llm-foundry
Installation issue from habana_alpha branch
I removed the flash-attn dependencies from requirements.txt, and the installation then worked. Should flash-attn be installed for Gaudi at all?
That's right, the flash-attn dependencies should not be required for Gaudi. If you'd like to submit a PR to this branch editing the setup.py requirements, please go for it!
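For anyone hitting the same issue, here is a minimal sketch of the workaround described above: filter any flash-attn pin out of the requirements before installing. The function name and regex are illustrative, not part of llm-foundry; on a local clone you would apply this to the lines of requirements.txt and install from the filtered result.

```python
import re

def prune_flash_attn(requirements):
    """Return the requirement lines with flash-attn / flash_attn pins removed.

    Illustrative helper, not an llm-foundry API: flash-attn ships CUDA
    kernels and is not needed (or installable) on Gaudi hardware.
    """
    return [r for r in requirements if not re.search(r"flash[-_]attn", r, re.IGNORECASE)]

# Example requirement lines (made up for the demo):
reqs = ["torch>=2.0", "flash-attn==2.3.2", "transformers<4.40"]
print(prune_flash_attn(reqs))  # → ['torch>=2.0', 'transformers<4.40']
```

The same filtering could be done once in setup.py (e.g. behind a Gaudi extra) so the branch installs cleanly without manual edits.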