
[BUG]: No module named 'dropout_layer_norm'

Open · apachemycat opened this issue 1 year ago · 2 comments

Is there an existing issue for this bug?

  • [x] #5795

🐛 Describe the bug

ModuleNotFoundError: No module named 'dropout_layer_norm'
[2024-05-17 03:23:11,932] torch.distributed.elastic.multiprocessing.api: [ERROR] failed (exitcode: 1) local_rank: 0 (pid: 615) of binary: /usr/bin/python

dropout_layer_norm has been deprecated by flash_attn, so is there any other choice?

Environment

No response

apachemycat avatar May 17 '24 03:05 apachemycat

Hi @apachemycat , would you mind sharing the version of flash-attn in your environment? I am using flash-attn==2.5.7 and everything looks fine. Alternatively, you can replace dropout_layer_norm with torch.nn.functional.layer_norm and dropout (see the sketch below), although kernel acceleration may not be supported that way for now.
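A minimal sketch of the unfused fallback suggested above. The module name `DropoutLayerNorm`, the `hidden_size` parameter, and the optional residual-add step are illustrative assumptions, not the exact semantics of the removed flash-attn kernel:

```python
import torch
import torch.nn.functional as F
from typing import Optional


class DropoutLayerNorm(torch.nn.Module):
    """Plain-PyTorch stand-in (hypothetical name) for the fused
    dropout + residual-add + layer-norm kernel that the removed
    `dropout_layer_norm` module provided. Slower, but dependency-free."""

    def __init__(self, hidden_size: int, p: float = 0.1, eps: float = 1e-5):
        super().__init__()
        self.p = p
        self.norm = torch.nn.LayerNorm(hidden_size, eps=eps)

    def forward(
        self, x: torch.Tensor, residual: Optional[torch.Tensor] = None
    ) -> torch.Tensor:
        # dropout -> (optional) residual add -> layer norm, unfused
        out = F.dropout(x, p=self.p, training=self.training)
        if residual is not None:
            out = out + residual
        return self.norm(out)


# Usage example (shapes are assumptions):
layer = DropoutLayerNorm(hidden_size=768, p=0.1)
y = layer(torch.randn(2, 16, 768))
```

Since this runs three separate ops instead of one fused CUDA kernel, expect some extra memory traffic and latency relative to the flash-attn version.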

duanjunwen avatar Jun 11 '24 09:06 duanjunwen

watching...

zhurunhua avatar Jul 18 '24 03:07 zhurunhua