universal-sentence-encoder-fine-tune
Trainable model size doubles
Hi @helloeve, I was trying to use this model with the module wrapper, but it throws an error saying the "graph def should be less than 2GB". While inspecting the issue further, I noticed that the output of convert_use is 1.9 GB, while the original model is ~812 MB. Do you happen to know why this might be happening? We are not adding any new tensors or operations in the graph copy.
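For anyone debugging this: the 2 GB limit is a hard cap on any serialized protobuf message, including a GraphDef, and a graph's weights can end up inside the GraphDef itself when they are stored as constants. A minimal sketch (assuming TensorFlow is installed; uses the `tf.compat.v1` API to match the graph-mode code in this thread) that shows the GraphDef growing with the embedded weights:

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

g = tf.Graph()
with g.as_default():
    # ~4 MB of float32 weights baked in as a constant
    weights = np.random.rand(1000, 1000).astype(np.float32)
    tf.constant(weights, name="frozen_weights")

# The constant's values are serialized into the GraphDef itself,
# so the graph def grows with the size of the embedded weights.
graph_def_size = g.as_graph_def().ByteSize()
print(graph_def_size)  # on the order of 4 MB, not a few hundred bytes
```

If a conversion step copies such constants (or keeps both the original constants and new variable initializers), the serialized graph can roughly double, which would explain hitting the 2 GB limit.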
Hi @akshitj1, thank you for pointing this out. I hadn't realized that this doubled the model size. My initial guess is that the original non-trainable variables get exported as well. I will need to double-check and see whether I can prune the model back to its original size.
Hi @helloeve, I saved both g1 and g2 using tf.train.Saver().save(). g1's .meta file is 8.3 MB, while g2's is 994 MB. The .data files are the same size for both. The .meta file is just the graph def; can it grow due to trainable vars?
@akshitj1 Yes, that might be the issue. I just noticed that they have released a trainable version of USE, so maybe we should try that and see if it saves us some space?