aduan

Results: 3 comments by aduan

How should _gene_signature be modified so that it can be extracted? Thanks.

@Hzzhang-nlp if CUDA is 12.x, you can install the PyTorch nightly build for CUDA 12.1 and build flash-attention from source:

```
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu121
git clone https://github.com/HazyResearch/flash-attention.git
python setup.py...
```
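The last command is cut off in the comment preview; below is a minimal sketch of the full source build, assuming the usual `cd` into the cloned repo and a `setup.py install` step (neither is shown in the original comment):

```
# PyTorch nightly built against CUDA 12.1 (assumed to match a CUDA 12.x toolkit)
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu121

# Build flash-attention from source; the cd and install step are assumptions,
# the comment above only shows the clone and a truncated setup.py call
git clone https://github.com/HazyResearch/flash-attention.git
cd flash-attention
python setup.py install
```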

> https://github.com/eisenxp/macos-golink-wrapper may help. It may be the best solution to this problem.