spikingjelly
Bug: "CuPy is not installed!", even though I have installed cupy-cuda12x with spikingjelly 0.0.0.0.14.
Read before creating a new issue
- Users who want to use SpikingJelly should first be familiar with the usage of PyTorch.
- If you do not know much about PyTorch, we recommend learning the basic tutorials of PyTorch first.
- Do not ask for help with basic concepts of PyTorch/Machine Learning that are not related to SpikingJelly. For such questions, please refer to Google or the PyTorch Forums.
For faster response
You can @ the corresponding developers for your issue. Here is the division:
| Features | Developers |
|---|---|
| Neurons and Surrogate Functions | fangwei123456, Yanqi-Chen |
| CUDA Acceleration | fangwei123456, Yanqi-Chen |
| Reinforcement Learning | lucifer2859 |
| ANN to SNN Conversion | DingJianhao, Lyu6PosHao |
| Biological Learning (e.g., STDP) | AllenYolk |
| Others | Grasshlw, lucifer2859, AllenYolk, Lyu6PosHao, DingJianhao, Yanqi-Chen, fangwei123456 |
We are glad to add new developers who volunteer to help solve issues to the table above.
Issue type
- [x] Bug Report
- [ ] Feature Request
- [ ] Help wanted
- [ ] Other
SpikingJelly version
0.0.0.0.14
Description
I have already tried rebuilding the whole environment with torch 1.8.0 and CUDA 11.1, but when I try to build a MultiStepLIFNode with the cupy backend, it still reports that I have not installed cupy.
Minimal code to reproduce the error/bug
```python
import spikingjelly
# ...
```

```
AssertionError: CuPy is not installed! You can install it from "https://github.com/cupy/cupy".
```
Here is a part of my environment package list. This is so weird, because I have built MultiStepLIFNode before and it ran well with the cupy backend.
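When the assertion fires even though cupy-cuda12x is installed, one common cause is that the package was installed into a different environment than the one running the script. A minimal diagnostic using only the standard library (the `probe_import` helper is illustrative, not a SpikingJelly API) prints which interpreter is active and surfaces the real import error instead of the generic "not installed" message:

```python
import importlib
import sys

def probe_import(name):
    """Try to import `name`; return (ok, detail) without raising."""
    try:
        module = importlib.import_module(name)
        return True, getattr(module, "__version__", "unknown version")
    except BaseException as e:
        # cupy can raise non-ImportError exceptions on broken CUDA setups,
        # so catch BaseException and report the real error type and message.
        return False, f"{type(e).__name__}: {e}"

if __name__ == "__main__":
    print(sys.executable)  # confirm which Python environment is actually active
    for pkg in ("torch", "cupy", "spikingjelly"):
        ok, detail = probe_import(pkg)
        print(f"{pkg}: {'OK' if ok else 'FAILED'} ({detail})")
```

If `sys.executable` points at a different environment than the one where `pip install cupy-cuda12x` ran, that mismatch is the bug.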
Hi, please install SpikingJelly from GitHub (the master version) and check whether it solves this problem.
Thanks, Dr. Fang. Finally, after installing the master version from GitHub, it works with the activation_based.neuron LIFNode module, and it is so weird that when I reinstalled the old version, everything also works with MultiStepLIFNode.
Issue type
- [ ] Help wanted
Description
I have already tried rebuilding the whole environment with torch 1.13.0 and CUDA 11.6, but when I try to build a MultiStepLIFNode with the cupy backend, it still reports that I have not installed cupy.
Conda list
Code
Error
GPU: NVIDIA GeForce RTX 4090
- `import cupy` works fine.
- Changing to the latest spikingjelly does not work.
I have tried everything I can, but it does not work. @fangwei123456
@QiWang233 Hi, you can set logging level as DEBUG (https://docs.python.org/3/library/logging.html#logging.DEBUG) and check the outputs.
Meanwhile, you can also try installing the latest version from GitHub. Note that `MultiStepLIFNode` is removed in the latest version; you can use `LIFNode` with `step_mode = 'm'` instead.
Thanks for your reply! After setting the logging level to DEBUG and installing the latest version, I get the messages below. It can be seen that the cause is the lack of "lava", so I wonder how I can install "lava" correctly.
Hi, lava is not easy to install... I suggest changing the source code to avoid this problem (just ignore lava). Or you can install the latest version from GitHub (we have fixed this problem).
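The "just ignore lava" workaround above can be sketched as a guarded import. This is an illustrative pattern, not SpikingJelly's actual source; the import path `lava.lib.dl.slayer` and the `to_lava` helper are assumptions for the sketch:

```python
# Illustrative guarded-import pattern: make the optional dependency soft,
# so the rest of the package still imports cleanly when lava is missing.
try:
    import lava.lib.dl.slayer as slayer  # optional dependency; path assumed
except BaseException:
    slayer = None

def to_lava(module):
    """Hypothetical helper: fail clearly only when lava is actually needed."""
    if slayer is None:
        raise RuntimeError("The optional 'lava' package is required for this feature.")
    # ... conversion using slayer would go here ...
```

With this shape, users who never call lava-dependent features are unaffected by a missing or broken lava installation.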
Thanks for your patience! This is very important to me. But after changing to the latest version, 0.0.0.0.15, the problem is still the same.
```python
import torch
import logging

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s - %(filename)s[line:%(lineno)d] - %(levelname)s: %(message)s')

try:
    import cupy
except BaseException as e:
    cupy = None
assert cupy is not None
print(cupy)

import spikingjelly
print(spikingjelly)
from spikingjelly.activation_based import neuron

lif = neuron.LIFNode(step_mode='m', backend='cupy').to('cuda:3')
x = torch.randn(10, 3, 224, 224).to('cuda:3')
out = lif(x)
print(out.shape)
```
I do not know why...
Set the device as cuda:0 and try again?
Sorry, it does not work... it is so weird.
Thank you! After changing to a device with a 3090 (4090 -> 3090), it is solved. Hope my experience can help others! But I still do not know why the 4090 has this problem; maybe it is related to the "High Version".
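One plausible explanation, stated as an assumption rather than something confirmed in this thread: the RTX 4090 (Ada Lovelace) has compute capability 8.9, which is only supported from CUDA 11.8 onward, so a torch 1.13 + CUDA 11.6 toolchain cannot compile CuPy kernels for it, while the RTX 3090 (capability 8.6) works. A small sketch of that compatibility check; `supports_capability` is a hypothetical helper encoding only this one assumption:

```python
# Hypothetical check: does a CUDA toolkit version support a GPU's compute
# capability? The encoded assumption is that sm_89 (e.g. RTX 4090)
# requires CUDA >= 11.8, while older capabilities work on this toolkit.
def supports_capability(cuda_version, capability):
    """Both arguments are (major, minor) tuples."""
    if capability >= (8, 9):           # RTX 4090 and other Ada GPUs
        return cuda_version >= (11, 8)
    return True                        # older capabilities: assume supported

# The setup from this thread: CUDA 11.6 on a 4090 vs. a 3090.
print(supports_capability((11, 6), (8, 9)))  # 4090 on CUDA 11.6 -> False
print(supports_capability((11, 6), (8, 6)))  # 3090 on CUDA 11.6 -> True
```

With torch installed, `torch.cuda.get_device_capability(0)` and `torch.version.cuda` provide the two values to feed into such a check.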