binary-face-alignment
conversion from bnn.SpatialConvolution and bnn.binary to nn or cudnn
I was trying to convert this model to a CPU-supported backend (nn). I was able to convert some of the layers using cudnn.convert, but I am stuck at the layers from bnn.torch. Has anyone converted bnn layers to nn or cudnn? Any help would be appreciated. Thank you.
I don't think converting the layers in bnn.torch to CPU layers is going to work without some effort. If you look at the implementation of the layers in bnn.torch, they are written in NVIDIA CUDA. You are going to have to write the corresponding CPU code yourself and use a function like this to swap it out:
def replace_module(module, check_fn, create_fn):
    if not hasattr(module, 'modules'):
        return
    if module.modules is None:
        return
    for i in range(len(module.modules)):
        m = module.modules[i]
        if check_fn(m):
            module.modules[i] = create_fn(m)
        replace_module(m, check_fn, create_fn)
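A self-contained sketch of how one might call this helper. The Container, BnnConv, and CpuConv classes below are stand-ins invented for illustration, not real bnn.torch or nn types:

```python
class Container:
    """Minimal stand-in for an nn container that exposes a .modules list."""
    def __init__(self, modules):
        self.modules = modules

class BnnConv:
    """Stand-in for a GPU-only bnn.SpatialConvolution layer."""
    pass

class CpuConv:
    """Stand-in for a hand-written CPU equivalent."""
    def __init__(self, src):
        self.src = src  # keep the original layer so its weights can be migrated

def replace_module(module, check_fn, create_fn):
    if not hasattr(module, 'modules'):
        return
    if module.modules is None:
        return
    for i in range(len(module.modules)):
        m = module.modules[i]
        if check_fn(m):
            module.modules[i] = create_fn(m)
        replace_module(m, check_fn, create_fn)

# Replace every BnnConv, including one nested inside a sub-container.
model = Container([BnnConv(), Container([BnnConv()])])
replace_module(model,
               lambda m: isinstance(m, BnnConv),
               lambda m: CpuConv(m))
```

After the call, both bnn layers (top-level and nested) have been swapped for CpuConv instances.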
Thank you for the reply, btomtom5. I understand that, but since the base code for bnn.torch is written in NVIDIA CUDA, can I at least convert it to cudnn? Then I could use cudnn.torch to convert it to the equivalent nn.
Hmmm. I don't think that works, because the Lua convert function relies on the destination module (nn in this case) having the corresponding classes SpatialConvolution and Binary. nn does have a SpatialConvolution defined, but it needs to decode the weights before it can do the convolution: bnn.SpatialConvolution encodes 32 binary weights (0s and 1s) as a single 32-bit integer.
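To make the encoding concrete, here is a sketch in Python (not the actual bnn.torch code) of packing 32 binary weights into one 32-bit word and decoding them back:

```python
def encode_bits(bits):
    """Pack a list of 32 binary weights (0/1) into a single 32-bit integer."""
    assert len(bits) == 32
    word = 0
    for i, b in enumerate(bits):
        word |= (b & 1) << i  # weight i goes into bit i
    return word

def decode_bits(word):
    """Unpack a 32-bit integer back into a list of 32 binary weights."""
    return [(word >> i) & 1 for i in range(32)]

bits = [1, 0] * 16          # 32 alternating binary weights
word = encode_bits(bits)    # a single int32-sized value
assert decode_bits(word) == bits
```

This is why a naive cudnn.convert cannot work: nn.SpatialConvolution expects a float weight tensor, not these packed integers.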
Why do you want to run it in CPU-only mode? I suggest renting an EC2 instance with GPU support.
Yeah. So the only alternative would be writing a CPU equivalent of bnn.torch?
That's correct, but you only need to write the forward pass of it.
- decode the weights from int32 to binary
- either perform a regular spatial convolution with the decoded binary weights, or perform the convolution with bitwise operations, which is much more performant
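The two steps above can be sketched like this. The bit-to-weight mapping (bit 0 becomes -1, bit 1 becomes +1) is an assumption common in binary networks; check how bnn.torch actually defines it. A 1-D convolution stands in for the full spatial case:

```python
def decode_weights(word, n=32):
    # Decode an int32 word into n signed binary weights.
    # Assumed mapping: bit 0 -> -1, bit 1 -> +1 (verify against bnn.torch).
    return [1 if (word >> i) & 1 else -1 for i in range(n)]

def conv1d_valid(x, w):
    # Plain 'valid' 1-D convolution (cross-correlation) with the decoded weights.
    k = len(w)
    return [sum(x[i + j] * w[j] for j in range(k))
            for i in range(len(x) - k + 1)]

word = 0b1011                    # low 4 bits hold a toy encoded kernel
w = decode_weights(word, n=4)    # -> [1, 1, -1, 1]
y = conv1d_valid([1, 2, 3, 4, 5], w)  # -> [4, 6]
```

The bitwise alternative would skip the decode entirely and compute each output with XNOR and popcount on the packed words, which is where the speed advantage comes from.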
After you do this, search through the loaded model and replace the bnn layers with your newly written module using the code I shared above. You also have to make sure that the weights are migrated over.