
Trainable model size doubles

Open akshitj1 opened this issue 6 years ago • 3 comments

Hi @helloeve , I was trying to use this model with the module wrapper, but it fails with the error "graph def should be less than 2GB". Inspecting further, I noticed that the output of convert_use is 1.9 GB, while the original model is ~812 MB. Do you happen to know why this might be happening? We are not adding any new tensors or operations in the graph copy.
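For context, the "less than 2GB" error reflects protobuf's hard limit on a single serialized message, which a graph def must fit inside. A minimal, hedged sketch of a pre-flight check (file name and sizes below are placeholders, not from this repo):

```python
# Sketch: check whether a serialized graph file could even be deserialized,
# given protobuf's ~2 GiB limit on a single message. Paths are hypothetical.
import os
import tempfile

PROTO_MESSAGE_LIMIT = 2 * 1024 ** 3  # 2 GiB: max size of one protobuf message

def fits_in_graph_def(path):
    """True if the serialized graph file is small enough to deserialize."""
    return os.path.getsize(path) < PROTO_MESSAGE_LIMIT

# Demo with a dummy file standing in for the exported graph def.
demo = os.path.join(tempfile.mkdtemp(), "graph.pb")
with open(demo, "wb") as f:
    f.write(b"\0" * 1024)  # 1 KB stand-in; the real convert_use output was 1.9 GB
ok = fits_in_graph_def(demo)
```

A 1.9 GB export sits just under this limit, which is why adding a module wrapper on top tips it over.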

akshitj1 avatar Jun 01 '18 10:06 akshitj1

Hi @akshitj1 , thank you for pointing this out. I hadn't realized previously that the model size doubled. My initial guess is that the original non-trainable variables get exported as well. I will need to double-check and see whether I can prune the model back to its original size.

helloeve avatar Jun 01 '18 13:06 helloeve

Hi @helloeve , I saved both g1 and g2 using tf.train.Saver().save(). g1's .meta file is 8.3 MB, while g2's is 994 MB. The .data files are the same size for both. Since the .meta file is just the graph def, can it grow because of trainable variables?
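The diagnostic above can be sketched as a simple file-size comparison: the .meta file holds the serialized graph definition, and the .data file holds the variable values. If .meta balloons while .data stays put, the extra bytes live in the graph def itself (e.g. weights duplicated as constant nodes). File names and sizes below are placeholders standing in for a real tf.train.Saver() checkpoint:

```python
# Hedged sketch of the .meta-vs-.data size check described above.
# Dummy files simulate a checkpoint; real paths would come from Saver.save().
import os
import tempfile

def compare_checkpoint_parts(meta_path, data_path):
    """Return (graph-def size, variable-data size) in bytes."""
    return os.path.getsize(meta_path), os.path.getsize(data_path)

d = tempfile.mkdtemp()
meta = os.path.join(d, "model.meta")
data = os.path.join(d, "model.data-00000-of-00001")
with open(meta, "wb") as f:
    f.write(b"\0" * 8_300_000)   # ~8.3 MB graph def, as reported for g1
with open(data, "wb") as f:
    f.write(b"\0" * 16_000_000)  # variable values; identical for g1 and g2

meta_size, data_size = compare_checkpoint_parts(meta, data)
```

For g2, the same check would show meta_size near 994 MB against an unchanged data_size, pointing at the graph def rather than the checkpointed variables.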

akshitj1 avatar Jun 04 '18 09:06 akshitj1

@akshitj1 Yes, that might be the issue. I just noticed that they have released a trainable version of USE, so maybe we should try that and see if it saves us some space?

helloeve avatar Jun 21 '18 13:06 helloeve