flybird11111
> hi all, take a look at this please. This bug is quite annoying for me.
>
> #6168

ok
Hello, please delete `~/.cache/colossalai` and then reinstall ColossalAI with `pip install -e .`.
You can set the `lr` attribute when creating a `HybridAdam` object, and you can pass different `lr` values to different parameter groups, just as with Adam.
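A minimal sketch of the parameter-group format, assuming `HybridAdam` follows the same per-group `lr` convention as `torch.optim.Adam` (the group names and values below are illustrative, not from the source):

```python
# Hypothetical illustration of per-group learning rates. With ColossalAI
# installed you would pass real parameters, e.g.:
#   from colossalai.nn.optimizer import HybridAdam
#   optimizer = HybridAdam(param_groups, lr=1e-3)

# Each group carries its own parameters and an optional per-group lr;
# a group without an explicit lr falls back to the optimizer default.
param_groups = [
    {"params": ["encoder.weight"], "lr": 1e-4},  # lower lr for this group
    {"params": ["head.weight"]},                 # uses the default lr
]

def resolve_lrs(groups, default_lr):
    """Mimic how an Adam-style optimizer assigns an effective lr per group."""
    return [g.get("lr", default_lr) for g in groups]

print(resolve_lrs(param_groups, default_lr=1e-3))  # [0.0001, 0.001]
```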
Hello, may I ask which branch you are using? Please use the support-npu branch.
Hello, ColossalAI's TP and PP strategies do not support Flux. You may try the Gemini strategy, but please note that Gemini does not support LoRA training. No, ColossalAI...
In OpenSora, there are customized policies that need to be implemented by the user. The models that are already supported are listed in the README at https://github.com/hpcaitech/ColossalAI.
The ZeRO optimizer usually updates parameters in float32; not using float32 may lead to unstable training.
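A minimal numpy sketch (not ColossalAI's actual implementation) of why a float32 master copy matters: an update smaller than fp16's resolution near 1.0 is lost entirely if applied in half precision, while the fp32 copy accumulates it correctly.

```python
import numpy as np

# Illustration only: repeatedly apply a tiny update in fp16 vs. fp32.
lr, grad, steps = 1e-3, 1e-3, 100   # per-step update lr*grad = 1e-6

w16 = np.float16(1.0)     # weight kept in half precision
master = np.float32(1.0)  # float32 "master" copy of the same weight

for _ in range(steps):
    # fp16 update: 1e-6 is far below fp16 spacing near 1.0 (~5e-4),
    # so the subtraction rounds back to 1.0 every step.
    w16 = np.float16(w16 - np.float16(lr * grad))
    # fp32 update: the same tiny step accumulates correctly.
    master = master - np.float32(lr * grad)

print(float(w16), float(master))  # fp16 weight never moved; fp32 did
```

After 100 steps the fp16 weight is still exactly 1.0, while the master copy has moved to roughly 0.9999, which is why mixed-precision training keeps fp32 master weights for the optimizer step.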
Hello, you can try an earlier version.