Rining Wu
@yhanwen https://github.com/thinkjs/thinkjs/issues/997 Could you help me take a look at this issue? It may be very similar to the problem you ran into. I am using think-session-mysql.
Hi, same here.
- python 3.8.17
- torch 1.12.1
- joblib 1.3.2

Importing joblib before torch can resolve it.
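For reference, a minimal sketch of the workaround (assuming nothing else in the process has imported torch earlier):

```python
# Workaround sketch: import joblib before torch so their OpenMP runtimes
# are initialized in an order that avoids the failure described above.
import joblib  # must be imported first
import torch

print("joblib", joblib.__version__, "| torch", torch.__version__)
```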
@tomMoral Thank you for this reference, it is useful. My `llvm-openmp` version is 16.0.6, and setting `KMP_AFFINITY=disabled` also makes it run fine. https://github.com/pytorch/pytorch/issues/99625#issuecomment-1609894707
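In case it helps, a sketch of applying the override from Python (I am assuming that setting it before torch is imported is early enough; exporting it in the shell works just as well):

```python
# Sketch: disable OpenMP thread affinity before torch (and its bundled
# llvm-openmp) is loaded. Equivalent to `export KMP_AFFINITY=disabled`.
import os
os.environ["KMP_AFFINITY"] = "disabled"

import torch  # imported after the override so the OpenMP runtime sees it
print(torch.__version__)
```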
@Fazziekey Hi, I have tried many ways to get ColossalAI to run as a Lightning plugin, but I could not succeed and finally got the error: ImportError: Please install colossalai from source...
@Fazziekey Following the last step, I re-did Method 6 and successfully brought in ColossalAI and the HybridAdam. But I got a weird error, as below: ```Bash GPU available: True (cuda),...
https://github.com/hpcaitech/ColossalAI/issues/2114#issuecomment-1347702320 Setting `placement_policy="cuda"` seems to work. `ColossalAIStrategy(enable_distributed_storage=False, placement_policy="cuda")`
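For completeness, a minimal sketch of how this strategy is passed to the Trainer (assuming a pytorch_lightning version that still ships the ColossalAI strategy; `MyLitModule` is a placeholder):

```python
# Sketch: use the ColossalAI strategy with GPU placement, as suggested in
# the linked issue. Only placement_policy / enable_distributed_storage come
# from the comment above; everything else is illustrative.
import pytorch_lightning as pl
from pytorch_lightning.strategies import ColossalAIStrategy

trainer = pl.Trainer(
    accelerator="gpu",
    devices=1,
    strategy=ColossalAIStrategy(
        enable_distributed_storage=False,
        placement_policy="cuda",
    ),
)
# trainer.fit(MyLitModule(), train_dataloaders=...)  # MyLitModule is hypothetical
```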
@flynnamy In my experience, there are four things you need to check (a quick sanity-check sketch follows):
1. Install the correct version of PyTorch;
2. Use `nvcc -V` to ensure your CUDA and cudatoolkit versions are correct...
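A minimal sketch covering points 1 and 2 (only standard torch attributes are used; comparing against `nvcc -V` is still done by hand):

```python
# Sanity check: confirm the CUDA version PyTorch was built against and that
# a GPU is visible. Compare torch.version.cuda with the toolkit version
# reported by `nvcc -V` on the command line.
import torch

print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("CUDA version torch was built with:", torch.version.cuda)
```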