
omegaconf.errors.ConfigAttributeError: Key 'autoregressive' not in 'HubertCtcConfig'

Open dohe0342 opened this issue 2 years ago • 2 comments

🐛 Bug

Hi, when I tried to load a HuBERT model, I got this error:

Traceback (most recent call last):  
  File "<stdin>", line 1, in <module>  
  File "/home/work/workspace/fairseq_origin/fairseq/checkpoint_utils.py", line 484, in load_model_ensemble_and_task    
    model = task.build_model(cfg.model, from_checkpoint=True)  
  File "/home/work/workspace/fairseq_origin/fairseq/tasks/fairseq_task.py", line 354, in build_model    
    model = models.build_model(cfg, self, from_checkpoint)  
  File "/home/work/workspace/fairseq_origin/fairseq/models/__init__.py", line 106, in build_model    
    return model.build_model(cfg, task)  
  File "/home/work/workspace/fairseq_origin/fairseq/models/hubert/hubert_asr.py", line 163, in build_model    
    w2v_encoder = HubertEncoder(cfg, task)  
  File "/home/work/workspace/fairseq_origin/fairseq/models/hubert/hubert_asr.py", line 384, in __init__    
    if task.target_dictionary is not None and not cfg.autoregressive:  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 305, in __getattr__    
    self._format_and_raise(key=key, value=None, cause=e)  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/base.py", line 95, in _format_and_raise    
    format_and_raise(  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/_utils.py", line 629, in format_and_raise    
    _raise(ex, cause)  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/_utils.py", line 610, in _raise    
    raise ex  # set end OC_CAUSE=1 for full backtrace  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 303, in __getattr__    
    return self._get_impl(key=key, default_value=DEFAULT_VALUE_MARKER)  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 361, in _get_impl    
    node = self._get_node(key=key)  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 383, in _get_node    
    self._validate_get(key)  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 135, in _validate_get    
    self._format_and_raise(  
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/base.py", line 95, in _format_and_raise    
    format_and_raise(
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/_utils.py", line 694, in format_and_raise
    _raise(ex, cause)
  File "/home/work/.local/lib/python3.8/site-packages/omegaconf/_utils.py", line 610, in _raise
    raise ex  # set end OC_CAUSE=1 for full backtrace
omegaconf.errors.ConfigAttributeError: Key 'autoregressive' not in 'HubertCtcConfig'
        full_key: autoregressive
        reference_type=Optional[HubertCtcConfig]
        object_type=HubertCtcConfig

It seems that 'HubertCtcConfig' does not have a key named 'autoregressive', but 'AudioFinetuningConfig' does. So I added an 'autoregressive' key in 'fairseq/models/hubert/hubert_asr.py' in my PR.
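For illustration, a fix along those lines would add the field to the config dataclass in `fairseq/models/hubert/hubert_asr.py`. The sketch below is a minimal stand-in, not the actual patch from the PR; the field name follows the traceback, and the `False` default is an assumption based on CTC models not being autoregressive (matching `AudioFinetuningConfig`).

```python
# Hypothetical sketch: a config dataclass gaining an 'autoregressive'
# field so that HubertEncoder's `cfg.autoregressive` access succeeds.
# Real fairseq configs subclass FairseqDataclass and carry many more
# fields; both are elided here.
from dataclasses import dataclass, field


@dataclass
class HubertAsrConfig:
    # ... existing fields elided ...
    autoregressive: bool = field(
        default=False,
        metadata={"help": "CTC fine-tuned models are not autoregressive"},
    )


cfg = HubertAsrConfig()
print(cfg.autoregressive)  # attribute now resolves instead of raising
```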

To Reproduce

import fairseq.checkpoint_utils

model_path = "/path/to/checkpoint/dir"  # directory containing the .pt file
models, cfg, task = fairseq.checkpoint_utils.load_model_ensemble_and_task(
    [f'{model_path}/hubert_xtralarge_ll60k_finetune_ls960.pt']
)

Expected behavior

Load model from checkpoint.

Environment

  • fairseq Version (e.g., 1.0 or main): 0.12.2 (main)
  • PyTorch Version (e.g., 1.0): 1.13.0+cu117
  • OS (e.g., Linux): Linux
  • How you installed fairseq (pip, source): pip
  • Build command you used (if compiling from source): pip install --editable ./
  • Python version: 3.8.13
  • CUDA/cuDNN version: CUDA 11.7
  • GPU models and configuration: A100

dohe0342 avatar Dec 22 '22 04:12 dohe0342

@dohe0342 Hello, I have run into the same problem and would like to know how you solved it. Did you just add the autoregressive option to HubertAsrConfig? Did your fine-tuning get good results? :) Thank you very much.

ZhikangNiu avatar Jan 13 '23 03:01 ZhikangNiu

Hello. I just ran into the same problem. Any update on this? I guess the fix is to add the autoregressive flag?

serchsm avatar Oct 31 '23 14:10 serchsm