universal-sentence-encoder-fine-tune

will this also work for "https://tfhub.dev/google/universal-sentence-encoder/2"

Open omerarshad opened this issue 6 years ago • 4 comments

Will this fine-tuning method also work for the transformer-based encoder available at https://tfhub.dev/google/universal-sentence-encoder/2 ?

omerarshad avatar Jun 21 '18 08:06 omerarshad

@omerarshad I think the link you posted is again the DAN-based encoder. In version 2 I believe they have made the variables trainable, so you should no longer need this workaround.

helloeve avatar Jun 21 '18 13:06 helloeve

Oops, I was talking about "https://tfhub.dev/google/universal-sentence-encoder-large/2", i.e. the transformer-based one.

omerarshad avatar Jun 21 '18 13:06 omerarshad

@omerarshad The transformer-based one is also trainable, as mentioned in the documentation. But if it turns out not to be trainable, you should be able to use this workaround.

helloeve avatar Jun 21 '18 13:06 helloeve

Great! So what is the best way to fine-tune the trainable transformer-based USE?

omerarshad avatar Jun 21 '18 13:06 omerarshad