
AttributeError: module 'omegaconf._utils' has no attribute 'is_primitive_type'. Did you mean: 'is_primitive_dict'?

ZON-ZONG-MIN opened this issue 2 years ago · 2 comments

I want to try using BioGPT, and I followed the instructions to install the required tools (Moses, fastBPE, sacremoses, sklearn). Then I ran the following code:

import torch
from fairseq.models.transformer_lm import TransformerLanguageModel
m = TransformerLanguageModel.from_pretrained(
        "checkpoints/Pre-trained-BioGPT", 
        "checkpoint.pt", 
        "data",
        tokenizer='moses', 
        bpe='fastbpe', 
        bpe_codes="data/bpecodes",
        min_len=100,
        max_len_b=1024)
m.cuda()
src_tokens = m.encode("COVID-19 is")
generate = m.generate([src_tokens], beam=5)[0]
output = m.decode(generate[0]["tokens"])
print(output)

but I get the following error:

2023-03-29 14:46:39 | INFO | fairseq.file_utils | loading archive file checkpoints/Pre-trained-BioGPT
2023-03-29 14:46:39 | INFO | fairseq.file_utils | loading archive file data
Traceback (most recent call last):
  File "/home/rufus/Desktop/biogpt/BioGPT/fairseq/pre-trained_model.py", line 3, in <module>
    m = TransformerLanguageModel.from_pretrained(
  File "/home/rufus/Desktop/biogpt/BioGPT/fairseq/fairseq/models/fairseq_model.py", line 267, in from_pretrained
    x = hub_utils.from_pretrained(
  File "/home/rufus/Desktop/biogpt/BioGPT/fairseq/fairseq/hub_utils.py", line 73, in from_pretrained
    models, args, task = checkpoint_utils.load_model_ensemble_and_task(
  File "/home/rufus/Desktop/biogpt/BioGPT/fairseq/fairseq/checkpoint_utils.py", line 421, in load_model_ensemble_and_task
    state = load_checkpoint_to_cpu(filename, arg_overrides)
  File "/home/rufus/Desktop/biogpt/BioGPT/fairseq/fairseq/checkpoint_utils.py", line 328, in load_checkpoint_to_cpu
    old_primitive = _utils.is_primitive_type
AttributeError: module 'omegaconf._utils' has no attribute 'is_primitive_type'. Did you mean: 'is_primitive_dict'?

Where should my pre-trained BioGPT model code be placed?

ZON-ZONG-MIN avatar Mar 29 '23 07:03 ZON-ZONG-MIN

I am also getting an error with omegaconf. Did you find a solution?

afy77777 avatar Jan 28 '24 12:01 afy77777

For anyone facing this issue: in fairseq's checkpoint_utils.py (the file shown at the bottom of the traceback), try replacing is_primitive_type with is_primitive_type_annotation.
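If you'd rather not edit fairseq's source, the same fix can be applied as a runtime alias before importing fairseq, e.g. import omegaconf._utils as _u; _u.is_primitive_type = _u.is_primitive_type_annotation (assuming your installed omegaconf still exposes the renamed function). A self-contained sketch of that aliasing pattern, using a toy namespace in place of omegaconf._utils so it runs anywhere:

```python
import types

# Toy stand-in (hypothetical) for omegaconf._utils after the rename:
# only the new name, is_primitive_type_annotation, exists.
utils = types.SimpleNamespace(
    is_primitive_type_annotation=lambda t: t in (int, float, bool, str)
)

# The shim: if the old name is missing, alias it to the new one so
# code written against the old API (here, fairseq's checkpoint loader)
# keeps working without patching its source files.
if not hasattr(utils, "is_primitive_type"):
    utils.is_primitive_type = utils.is_primitive_type_annotation

print(utils.is_primitive_type(int))   # True
print(utils.is_primitive_type(list))  # False
```

The downside of a runtime alias is that it must run before fairseq is imported; editing checkpoint_utils.py as suggested above is the more permanent fix.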

leminhnguyen avatar May 09 '24 02:05 leminhnguyen