
Combining TEXT.build_vocab with BERT embeddings

Open muhalfian opened this issue 10 months ago • 0 comments

❓ Questions and Help

Description

Hi, we can use GloVe embeddings when building the vocab, with something like:

import torch

MIN_FREQ = 2

TEXT.build_vocab(train_data,
                 min_freq=MIN_FREQ,
                 vectors="glove.6B.300d",
                 unk_init=torch.Tensor.normal_)

However, I want to use BERT embeddings, because I need a more sophisticated model to compare the performance of multiple embeddings. How can I use BERT in build_vocab?

muhalfian avatar Jan 27 '25 02:01 muhalfian