cudnn.torch
Is it possible to convert a GPU pre-trained model to CPU without cudnn?
Due to limited computing resources, I only have a CPU-only machine. However, some pre-trained models are GPU-based.
It seems that a Torch installation does not include cudnn when the machine has no GPU, so there are no cudnn classes in my Torch install. Moreover, the README.md does not describe a cudnn-to-nn conversion.
Is there a way to convert a GPU pre-trained model to a CPU pre-trained model without using cudnn.torch?
cudnn.convert(model, nn)
works fine, at least when you have a GPU.
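To make that concrete, here is a minimal sketch of the workflow on a machine that *does* have CUDA and cudnn installed; the filenames are hypothetical, and the key extra step is `:float()`, since `cudnn.convert` changes the module types but the parameters can remain CudaTensors:

```lua
-- Sketch: run this on a GPU machine with cudnn available.
require 'cudnn'
require 'nn'

local model = torch.load('model_gpu.t7')   -- hypothetical GPU checkpoint
cudnn.convert(model, nn)                   -- cudnn.* modules -> nn.* equivalents
model:float()                              -- CudaTensor params -> FloatTensor
torch.save('model_cpu.t7', model)          -- loadable with plain nn on a CPU-only box
```

The resulting `model_cpu.t7` should then load on a machine without cudnn, as long as it only contains modules that `cudnn.convert` actually translated.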
We have observed that `cudnn.convert` does not handle all modules: for example, `cudnn.ClippedReLU` is not translated into `nn`, despite the claimed API compatibility.
`cudnn.TemporalConvolution` is also not converted, since it is implemented as a wrapper around `cudnn.SpatialConvolution`.
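For modules that `cudnn.convert` misses, one workaround is to swap them manually before saving. A sketch for `cudnn.ClippedReLU`, assuming `nn.HardTanh(0, ceiling)` as a CPU stand-in (both clamp activations to `[0, ceiling]`) and assuming the module stores its threshold in a `ceiling` field:

```lua
require 'nn'

-- Hypothetical helper: walk the network and replace each cudnn.ClippedReLU
-- with nn.HardTanh(0, ceiling), which computes the same clamp on CPU.
local function replaceClippedReLU(model)
   return model:replace(function(m)
      if torch.type(m) == 'cudnn.ClippedReLU' then
         return nn.HardTanh(0, m.ceiling)  -- field name is an assumption
      end
      return m
   end)
end
```

A similar manual swap could map `cudnn.TemporalConvolution` onto `nn.TemporalConvolution`, though the weight layout would need to be checked carefully.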
Hi, did you find a way?
Thanks, Mina
@notimesea Does the `cudnn.convert()` method work if we convert a GPU-based model to a CPU-based model on a machine with GPU support, and then load the converted model on a CPU-only machine?
@soumith, please help. I'm having the same error: Torch crashes when I call `graph.dot(final_model.fg, 'model', 'model')`, and I don't know why.