Tied weights and TransposedCUDAMatrix
Hi,
I'm trying to modify the autoencoder's model_layer1.pbtxt so that the weights from the hidden layer to the output layer are tied to be the transpose of those from the input layer to the hidden layer. When I run this I get the error:
AttributeError: 'TransposedCUDAMatrix' object has no attribute 'shape'
Any ideas how to fix this? I tried making TransposedCUDAMatrix derive from CUDAMatrix instead of object, which fixed that error, but it then died when trying to use the .T operator.
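For context, the computation the tied configuration is meant to implement looks like this in plain NumPy (illustrative names, not deepnet's; note that W.T is a view sharing storage with W, which is the role the lazy TransposedCUDAMatrix plays on the GPU):

```python
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid = 8, 4

# One weight matrix; the decoder reuses its transpose (a view, no copy).
W = 0.1 * rng.standard_normal((n_vis, n_hid))
b_hid = np.zeros(n_hid)   # hidden bias
b_vis = np.zeros(n_vis)   # output bias: not tied, each layer keeps its own

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.random(n_vis)
h = sigmoid(x @ W + b_hid)        # encode: visible -> hidden
x_hat = sigmoid(h @ W.T + b_vis)  # decode: hidden -> visible via W.T

# The transpose is a view on the same storage, not a separate parameter.
assert np.shares_memory(W, W.T)
assert x_hat.shape == x.shape
```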
Thanks, Ron
Daniel,
- Could you please tell me where you added this modification? I am also interested in this. We could add a special config parameter like "tied_transpose_to:" or something similar.
I've managed it. I'll come back a bit later with a patch proposal and with results on the "ae" example. I'll post it under issue #48.
- It makes no sense to tie a transposed bias/weight (e.g. in the autoencoder case) to some other layer during the learning process itself. I did manage to implement it in the code for feed_forward_net, but it is not really useful at all.
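To make the weight side concrete: with the decoder fixed to W.T, the gradient for W is just the sum of the encoder and decoder contributions, while each layer still keeps its own bias. A rough NumPy sketch of the backward pass (squared-error loss, sigmoid units; this illustrates the math, it is not deepnet's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grads(x, W, b_h, b_v):
    """Squared-error autoencoder with tied weights (decoder = W.T).

    Because W appears in both the encoder and the decoder, its gradient
    has TWO terms; the biases b_h and b_v stay independent parameters.
    """
    h = sigmoid(x @ W + b_h)          # encode
    x_hat = sigmoid(h @ W.T + b_v)    # decode through the transpose
    loss = 0.5 * np.sum((x_hat - x) ** 2)

    d_out = (x_hat - x) * x_hat * (1 - x_hat)   # delta at the output
    d_hid = (d_out @ W) * h * (1 - h)           # delta at the hidden layer

    dW = np.outer(x, d_hid) + np.outer(d_out, h)  # encoder term + decoder term
    db_h = d_hid                                  # hidden bias gradient
    db_v = d_out                                  # output bias gradient
    return loss, dW, db_h, db_v
```

A finite-difference check confirms that summing the two terms into a single dW is the correct gradient for the tied parameter.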