
transformers problem

Open lucasjinreal opened this issue 2 years ago • 8 comments

transformers/models/wav2vec2/modeling_wav2vec2.py", line 387, in forward
    hidden_states = hidden_states.transpose(1, 2)
AttributeError: 'tuple' object has no attribute 'transpose'

Is it possible to make the code compatible with the latest transformers?

lucasjinreal avatar Jun 23 '22 07:06 lucasjinreal

Requirement already satisfied: six in /usr/lib/python3/dist-packages (from sacremoses->transformers==4.6.1) (1.16.0)
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [50 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.linux-x86_64-3.10
      creating build/lib.linux-x86_64-3.10/tokenizers
      copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers
      creating build/lib.linux-x86_64-3.10/tokenizers/models
      copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/models
      creating build/lib.linux-x86_64-3.10/tokenizers/decoders
      copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/decoders
      creating build/lib.linux-x86_64-3.10/tokenizers/normalizers
      copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/normalizers
      creating build/lib.linux-x86_64-3.10/tokenizers/pre_tokenizers
      copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/pre_tokenizers
      creating build/lib.linux-x86_64-3.10/tokenizers/processors
      copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/processors
      creating build/lib.linux-x86_64-3.10/tokenizers/trainers
      copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/trainers
      creating build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      creating build/lib.linux-x86_64-3.10/tokenizers/tools
      copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-3.10/tokenizers/tools
      copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/tools
      copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers
      copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/models
      copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/decoders
      copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/normalizers
      copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/pre_tokenizers
      copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/processors
      copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/trainers
      copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-3.10/tokenizers/tools
      running build_ext
      error: can't find Rust compiler
      
      If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
      
      To update pip, run:
      
          pip install --upgrade pip
      
      and then retry package installation.
      
      If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
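
The build fails because pip found no prebuilt tokenizers wheel for this Python (3.10 here) and fell back to compiling the Rust extension from source. Following pip's own hint above, two ways out (the `transformers==4.6.1` pin is the one appearing in the log; adjust to whatever the repo requires):

```shell
# Option 1: upgrade pip so it can use a prebuilt wheel if one exists,
# then retry the pinned install from the log above:
pip install --upgrade pip
pip install "transformers==4.6.1"

# Option 2: install a Rust toolchain via rustup (the URL pip suggests)
# so tokenizers can be built from source, then retry:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"
pip install "transformers==4.6.1"
```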

lucasjinreal avatar Jun 23 '22 07:06 lucasjinreal

I also encountered this problem; is there a solution? Line 114 of wav2vec.py changes `hidden_states` from a tensor to a tuple, and a tuple has no `transpose` method. Could that be the problem here?
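
A minimal repro of that diagnosis (toy tensors, not FaceFormer's actual shapes): a tensor transposes fine, but once the projection starts returning a tuple, calling `.transpose` on it raises exactly the `AttributeError` in the traceback above.

```python
import torch

# A plain tensor supports transpose, as in older transformers:
t = torch.randn(2, 3, 4)
assert t.transpose(1, 2).shape == (2, 4, 3)

# Newer transformers return a (hidden_states, extract_features) tuple;
# calling transpose on it reproduces the reported error:
tup = (t, torch.randn(2, 3, 5))
try:
    tup.transpose(1, 2)
except AttributeError as e:
    print(e)  # 'tuple' object has no attribute 'transpose'
```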

JSHZT avatar Sep 09 '22 02:09 JSHZT

Have you solved this problem?

Baroquestc avatar Nov 10 '22 07:11 Baroquestc

I have the same problem, any solution?

eventhorizon02 avatar Nov 25 '22 13:11 eventhorizon02

I have the same problem, any solution?

Sorry, I haven't solved this yet.

JSHZT avatar Nov 25 '22 13:11 JSHZT

I think you can try keeping the transformers version consistent with the author's.

JSHZT avatar Nov 25 '22 13:11 JSHZT

I had the same problem when I installed the wrong version of the transformers package. With the version recommended by the author, it works fine.

Alpe6825 avatar Jan 17 '23 12:01 Alpe6825

Change `hidden_states = self.feature_projection(hidden_states)` to `hidden_states = self.feature_projection(hidden_states)[0]`.

This works on transformers==4.26.1.
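
This works because newer transformers return a `(hidden_states, extract_features)` tuple from the feature projection, while older versions returned a bare tensor, so indexing `[0]` recovers the tensor. A version-agnostic sketch of the call site (the helper and the stand-in projections below are illustrative, not FaceFormer's actual code):

```python
import torch

def project_hidden_states(feature_projection, hidden_states):
    """Unwrap the projection output whether it is a bare tensor
    (older transformers) or a (hidden_states, extract_features)
    tuple (newer transformers)."""
    out = feature_projection(hidden_states)
    return out[0] if isinstance(out, tuple) else out

# Stand-ins emulating both API versions:
h = torch.randn(1, 10, 512)
old_api = lambda x: x                   # returns a tensor
new_api = lambda x: (x, torch.relu(x))  # returns a tuple

assert project_hidden_states(old_api, h) is h
assert project_hidden_states(new_api, h) is h
```

Unlike hard-coding `[0]`, this keeps the code running on both old and new transformers versions.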

FinallyKiKi avatar Apr 15 '23 08:04 FinallyKiKi