
Save from checkpoint won't work if evaluate is done in another script

thepurpleowl opened this issue 4 years ago • 0 comments

If I run the evaluation as a separate script, loading from the checkpoint doesn't set the weights of all layers. Only the `BahdanauAttention` `memory_layer` gets restored.

After `checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))`, inspecting `decoder.variables` gives only the following:

[<tf.Variable 'BahdanauAttention/memory_layer/kernel:0' shape=(8, 8) dtype=float32, numpy=
array([[ 0.15743445,  0.03994462, -0.2883031 ,  0.39809996,  0.3215048 ,
        -0.46074408,  0.07891196, -0.02564841],
       [-0.35952342, -0.27737555, -0.50748783,  0.57785255, -0.08057628,
         0.29438728,  0.46183833,  0.08846892],
       [-0.5546453 , -0.5423229 ,  0.02733934, -0.22643322, -0.14250489,
         0.04927453, -0.46095103,  0.52018607],
       [-0.42026058, -0.5158085 , -0.5829936 , -0.40332744, -0.24286492,
        -0.33631212,  0.49148753,  0.41896522],
       [-0.58512676, -0.45264915, -0.55898035, -0.49855614, -0.022393  ,
         0.2978394 , -0.08824155,  0.4237653 ],
       [ 0.3875127 ,  0.5466225 ,  0.460645  ,  0.46150202, -0.5859269 ,
        -0.60290945,  0.37809852, -0.1127392 ],
       [ 0.24443758,  0.08152801,  0.36199048, -0.21402001, -0.23801355,
         0.09853357, -0.00076995, -0.45100412],
       [ 0.05195995, -0.34752947,  0.42515457,  0.3826397 , -0.03966437,
         0.5022366 ,  0.47566518,  0.03214471]], dtype=float32)>]

A plain `save()` or `save_model()` doesn't help either. Is this something to do with `tile_batch`? How do I instantiate the other layers so that the decoder can pick up its weights from the checkpoint?
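For context, `tf.train.Checkpoint.restore` defers restoration for variables that don't exist yet: Keras layers create their variables lazily on the first call, so a freshly constructed decoder in a separate evaluation script has almost nothing for the checkpoint to match against. A minimal sketch of the general pattern (using a plain `tf.keras.layers.Dense` stand-in, not the actual `tfa.seq2seq` decoder) — run one dummy forward pass to build the variables, after which the deferred restore fills them in:

```python
import os
import tempfile

import numpy as np
import tensorflow as tf


def build_layer():
    # Stand-in for the decoder; variables are NOT created until first call.
    return tf.keras.layers.Dense(4)


# --- "training" script: create, build, and checkpoint a layer ---
layer = build_layer()
layer(tf.zeros([1, 8]))                      # forward pass creates kernel/bias
ckpt = tf.train.Checkpoint(layer=layer)
path = ckpt.save(os.path.join(tempfile.mkdtemp(), "ckpt"))

# --- "evaluation" script: restore into a fresh, unbuilt layer ---
restored = build_layer()
status = tf.train.Checkpoint(layer=restored).restore(path)
assert restored.variables == []              # nothing built yet, nothing restored

restored(tf.zeros([1, 8]))                   # dummy call builds the variables;
                                             # the deferred restore then assigns them
status.assert_consumed()                     # verifies every checkpointed value matched
np.testing.assert_allclose(layer.get_weights()[0], restored.get_weights()[0])
```

Calling `status.assert_consumed()` after the dummy forward pass is a useful check here: it raises if any checkpointed value never found a matching variable, which is exactly the symptom in this issue.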

thepurpleowl · May 06 '21 05:05