universal-sentence-encoder-fine-tune
Will this fine-tuning method also work for the transformer-based encoder available at https://tfhub.dev/google/universal-sentence-encoder/2 ?
@omerarshad I think the link you posted is again the DAN-based encoder. In version 2 I believe they made the variables trainable, so you may no longer need this workaround.
Oops, I was talking about "https://tfhub.dev/google/universal-sentence-encoder-large/2", i.e. the transformer-based one.
@omerarshad The transformer-based one is also trainable, as mentioned in the documentation. But if it turns out not to be trainable, you should be able to use this workaround.
Great. So what is the best way to fine-tune the trainable transformer-based USE?
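For reference, a minimal sketch of what fine-tuning a trainable TF Hub module looked like in the TF1 `hub.Module` API the thread refers to: passing `trainable=True` adds the module's weights to the trainable variables, so they are updated along with a task head. The two-class head, placeholder names, and learning rate below are illustrative assumptions, not from the thread, and running this downloads the large (~800 MB) module from tfhub.dev.

```python
import tensorflow.compat.v1 as tf
import tensorflow_hub as hub

tf.disable_eager_execution()

# trainable=True exposes the module's weights as trainable variables,
# so the optimizer fine-tunes the encoder together with the head.
embed = hub.Module(
    "https://tfhub.dev/google/universal-sentence-encoder-large/2",
    trainable=True)

sentences = tf.placeholder(tf.string, shape=[None])
embeddings = embed(sentences)              # shape [batch, 512]
logits = tf.layers.dense(embeddings, 2)    # hypothetical 2-class head
labels = tf.placeholder(tf.int64, shape=[None])
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits))
# A small learning rate is typical when fine-tuning a pretrained encoder.
train_op = tf.train.AdamOptimizer(1e-5).minimize(loss)

with tf.Session() as sess:
    # USE modules use lookup tables, so tables_initializer is required.
    sess.run([tf.global_variables_initializer(),
              tf.tables_initializer()])
    _, loss_val = sess.run(
        [train_op, loss],
        feed_dict={sentences: ["a test sentence"], labels: [0]})
```

Freezing the encoder instead is just a matter of leaving `trainable` at its default of `False`, in which case only the dense head is updated.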