universal-sentence-encoder-fine-tune
TensorFlow versions:
```
tensorflow            1.15.2
tensorflow-estimator  1.15.1
tensorflow-hub        0.9.0
tensorflow-text       1.15.1
```
I'm running the code as is and I'm consistently running into the following error:
```
/tmp/ipykernel_1454/3669827002.py in main()...
```
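For anyone reproducing this setup, a quick environment check is sketched below; the versions are simply the ones reported above, not a confirmed requirement of the repo.
```
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401  (registers ops needed by some USE variants)

# Print the installed versions to compare against the combination
# reported in this issue (TF 1.15.2, hub 0.9.0, text 1.15.1).
print("tensorflow:", tf.__version__)
print("tensorflow-hub:", hub.__version__)
```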
I've been trying to run the code on Google Colab and on my local computer, both using TensorFlow 1.15. When trying to graph the sentences before fine-tuning, the following error appears:...
How to add a new dataset for fine-tuning?
I want to fine-tune but I don't have labels.
Which versions of tensorflow and tensorflow_hub does the code require? Thanks!
**I saved the trained model as follows (similar to how you do in convert_use.py)** `tf.saved_model.simple_save(sess, save_path, inputs={'input': in_tensor}, outputs={'output': ou_tensor}, legacy_init_op=tf.tables_initializer())` **But when I load the resulting model in Java, I get the...
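Before debugging the Java side, it may help to confirm the export itself; below is a minimal sketch (the export directory name and the assumption that `in_tensor` is a string placeholder of sentences are mine). Note that `tf.saved_model.loader.load` runs the `legacy_init_op` after restoring variables, and a Java client has to trigger that table initialization as well.
```
import tensorflow as tf  # TF 1.x

export_dir = "exported_use_model"  # hypothetical path passed to simple_save

with tf.Graph().as_default(), tf.Session() as sess:
    # Restores variables and runs the legacy_init_op (the tables_initializer
    # supplied at export time).
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)

    sig = meta_graph.signature_def["serving_default"]
    in_name = sig.inputs["input"].name
    out_name = sig.outputs["output"].name

    # Assumes the input is a string placeholder holding raw sentences.
    vectors = sess.run(out_name, feed_dict={in_name: ["a test sentence"]})
    print(vectors.shape)
```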
Will this fine-tuning method also work for the transformer-based encoder available at https://tfhub.dev/google/universal-sentence-encoder/2?
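As a starting point, the usual TF1 pattern for loading a hub module in trainable mode is sketched below; whether the module at that URL actually exposes a trainable graph variant, and whether it fits in memory during fine-tuning, would need to be verified.
```
import tensorflow as tf  # TF 1.x
import tensorflow_hub as hub

# Load the module with trainable=True so its variables join the
# trainable collection (assuming the module supports it).
embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/2",
                   trainable=True)

sentences = tf.placeholder(tf.string, shape=[None])
embeddings = embed(sentences)  # USE outputs 512-dimensional vectors

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    print(sess.run(embeddings, feed_dict={sentences: ["a test sentence"]}).shape)
```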
How can we use the sentence encoder in a siamese architecture to pass two sentences as input?
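One common approach (a sketch under my own assumptions, not code from this repo) is to instantiate the hub module once and apply it to both inputs, so the two towers share weights; a similarity score or pair loss then goes on top.
```
import tensorflow as tf  # TF 1.x
import tensorflow_hub as hub

# A single hub.Module instance holds one set of weights; applying it to
# two inputs yields two towers with tied (siamese) weights.
embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/2",
                   trainable=True)

left = tf.placeholder(tf.string, shape=[None])
right = tf.placeholder(tf.string, shape=[None])

left_vec = tf.nn.l2_normalize(embed(left), axis=1)
right_vec = tf.nn.l2_normalize(embed(right), axis=1)

# Cosine similarity of the two embeddings; a contrastive or
# cross-entropy loss over labeled pairs can be defined from this.
similarity = tf.reduce_sum(left_vec * right_vec, axis=1)

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    print(sess.run(similarity, feed_dict={left: ["How old are you?"],
                                          right: ["What is your age?"]}))
```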
Hi @helloeve, I was trying to use this model with a module wrapper, but it gives the error "graph def should be less than 2GB". While further inspecting the issue...
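For context, the 2 GB limit applies to any serialized GraphDef, so it is hit whenever large weights end up baked into the graph as constants. A general TF1 workaround (not specific to this repo's wrapper, and using hypothetical names) is to feed such weights through a placeholder at initialization time instead:
```
import numpy as np
import tensorflow as tf  # TF 1.x

# Hypothetical large weight matrix loaded outside the graph.
weights = np.load("use_embedding_table.npy")

weights_ph = tf.placeholder(tf.float32, shape=weights.shape)
weights_var = tf.Variable(weights_ph)  # no large constant stored in the GraphDef

with tf.Session() as sess:
    # The values are supplied via feed_dict, so the GraphDef stays small.
    sess.run(weights_var.initializer, feed_dict={weights_ph: weights})
```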
Thanks for the contribution. As I start fine-tuning, my RAM utilization goes up to 10 GB; what might be the cause?