Regarding the issue of being unable to install the flash-attn package: we suggest following the official installation guide (https://github.com/Dao-AILab/flash-attention). Before installing, please check your local environment against the guide's requirements,...
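In case it helps, here is a minimal sketch of such a pre-install check. The compute-capability threshold in the comment is an assumption based on the flash-attention README and should be confirmed there:

```python
# Sketch of a pre-install check for flash-attn; the compute-capability
# threshold below is an assumption -- confirm it against the README.
import torch

print("PyTorch:", torch.__version__)
print("CUDA (PyTorch build):", torch.version.cuda)

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"GPU compute capability: {major}.{minor}")
    # FlashAttention-2 targets Ampere or newer GPUs (compute capability >= 8.0)
    if (major, minor) < (8, 0):
        print("Warning: this GPU may not be supported by FlashAttention-2.")
else:
    print("No CUDA device visible; flash-attn requires an NVIDIA GPU.")
```

If those checks pass, the command suggested in the flash-attention README is `pip install flash-attn --no-build-isolation`.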
We sincerely apologize for not following up on your request in a timely manner. Our evaluation process follows the workflow proposed by ChatDrug (https://github.com/chao1224/ChatDrug). We have now updated the relevant...
Thank you for your interest in our work! For a pip version of the requirements file, please refer to requirements.txt on the repository's main page. We have tested the availability...
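If it is useful, here is a small sketch for verifying that an already-installed environment matches the file; it assumes the pins use the simple `name==version` form:

```python
# Minimal sketch: compare installed package versions against pinned
# entries in requirements.txt (assumes simple "name==version" lines).
from importlib.metadata import version, PackageNotFoundError

with open("requirements.txt") as f:
    for line in f:
        line = line.split("#")[0].strip()  # drop comments and whitespace
        if "==" not in line:
            continue  # skip empty or unpinned entries
        name, _, pinned = line.partition("==")
        try:
            installed = version(name)
        except PackageNotFoundError:
            print(f"{name}: NOT INSTALLED (expected {pinned})")
            continue
        status = "ok" if installed == pinned else f"MISMATCH (expected {pinned})"
        print(f"{name}: {installed} {status}")
```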
Hi @DonaldDai, we have tested the YAML file under CUDA 12.4 and were able to successfully install the environment required for the experiments and run the related code. Therefore,...
Hi @DonaldDai, we have tested on a Linux server with the same CUDA version (12.4) as yours, and after configuring the environment with the YAML file we provided earlier,...
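For anyone reproducing this setup, a short smoke test inside the freshly created conda environment can confirm it resolved correctly. The `flash_attn` import below is an assumption about the dependency set; adjust it to the project's actual dependencies:

```python
# Quick smoke test after creating the environment from the provided YAML
# file; the flash_attn import is an assumption about the dependency set.
import torch

assert torch.cuda.is_available(), "CUDA device not visible to PyTorch"
print("PyTorch:", torch.__version__, "| CUDA:", torch.version.cuda)

try:
    import flash_attn
    print("flash-attn:", flash_attn.__version__)
except ImportError:
    print("flash-attn not installed in this environment")
```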