phrase-bert-topic-model
Model encoding error
Hello, I followed the instructions, but I encountered an error at this step:
phrase_list = ['play an active role', 'participate actively', 'active lifestyle']
phrase_embs = model.encode(phrase_list)
The error seems to be related to passing features to BERT.
The entire error message is printed below:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "sentence-transformers/sentence_transformers/SentenceTransformer.py", line 187, in encode
out_features = self.forward(features)
File "anaconda3/envs/phrasebert/lib/python3.8/site-packages/torch/nn/modules/container.py", line 141, in forward
input = module(input)
File "anaconda3/envs/phrasebert/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
return forward_call(*input, **kwargs)
File "sentence-transformers/sentence_transformers/models/BERT.py", line 33, in forward
output_states = self.bert(**features)
File "anaconda3/envs/phrasebert/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
return forward_call(*input, **kwargs)
File "anaconda3/envs/phrasebert/lib/python3.8/site-packages/transformers/models/bert/modeling_bert.py", line 950, in forward
batch_size, seq_length = input_shape
ValueError: not enough values to unpack (expected 2, got 1)
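The failing line in modeling_bert.py is ordinary tuple unpacking: BERT expects input_ids shaped (batch_size, seq_length), so a 1-D shape cannot be unpacked into two names. A minimal stand-alone reproduction of just that unpacking step (plain Python, no torch needed):

```python
# modeling_bert.py effectively runs: batch_size, seq_length = input_shape
# A 2-D input works; a 1-D input raises the ValueError from the traceback.
shape_2d = (1, 3)   # (batch_size, seq_length)
batch_size, seq_length = shape_2d  # fine

shape_1d = (3,)     # only one dimension, as when features are mis-shaped
try:
    batch_size, seq_length = shape_1d
except ValueError as e:
    print(e)  # not enough values to unpack (expected 2, got 1)
```

This is why the error points at a library version mismatch rather than the model itself: the features dict produced by one version is shaped differently than the other version expects.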
Hey, can you check the version of your sentence-transformers?
import sentence_transformers
print(sentence_transformers.__version__)
Please use v0.3.3; you may refer to the set-up section.
I have the correct version, which is 0.3.3. Is it possible that I set the model path wrong? I unzipped the model, as written in the directions, and set the model path as follows:
model_path = 'phrase-bert-model/pooled_context_para_triples_p=0.8'
Try installing a lower version of the transformers package.
pip install transformers==3.0.2 sentence_transformers==0.3.3
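To confirm the environment matches the pins above, a small check like the following can help. This is an illustrative sketch, not part of the project; the mismatched helper and EXPECTED dict are hypothetical, with the version numbers taken from this thread:

```python
# Pinned versions from this thread
EXPECTED = {'transformers': '3.0.2', 'sentence_transformers': '0.3.3'}

def mismatched(installed):
    """Return the names of packages whose installed version differs from the pin."""
    return [name for name, want in EXPECTED.items() if installed.get(name) != want]

# Example: a transformers 4.x install instead of the pinned 3.0.2
print(mismatched({'transformers': '4.1.0', 'sentence_transformers': '0.3.3'}))
# -> ['transformers']
```

In practice you would build the installed dict from each package's __version__ attribute, as shown elsewhere in this thread.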
Hi Michelle, I think Ankur is right. I don't think this is an error with your model path.
Instead, sentence-transformers v0.3.3 is built on a specific version of Hugging Face's transformers (v3.0.2). You may check your transformers version similarly:
import transformers
print(transformers.__version__)
Installing a lower version of transformers fixed the problem. Thank you!