
Crashed on my first try

Open · ifuchs opened this issue 2 years ago · 1 comment

I installed it and got:

```
insanely-fast-whisper --model openai/whisper-base.en /Users/i/Desktop/Steve_Prince.wav
Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1353, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/opt/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/opt/anaconda3/lib/python3.9/site-packages/transformers/pipelines/__init__.py", line 28, in <module>
    from ..image_processing_utils import BaseImageProcessor
  File "/opt/anaconda3/lib/python3.9/site-packages/transformers/image_processing_utils.py", line 28, in <module>
    from .image_transforms import center_crop, normalize, rescale
  File "/opt/anaconda3/lib/python3.9/site-packages/transformers/image_transforms.py", line 47, in <module>
    import tensorflow as tf
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/__init__.py", line 37, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/python/__init__.py", line 37, in <module>
    from tensorflow.python.eager import context
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/python/eager/context.py", line 29, in <module>
    from tensorflow.core.framework import function_pb2
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/core/framework/function_pb2.py", line 16, in <module>
    from tensorflow.core.framework import attr_value_pb2 as tensorflow_dot_core_dot_framework_dot_attr__value__pb2
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/core/framework/attr_value_pb2.py", line 16, in <module>
    from tensorflow.core.framework import tensor_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__pb2
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/core/framework/tensor_pb2.py", line 16, in <module>
    from tensorflow.core.framework import resource_handle_pb2 as tensorflow_dot_core_dot_framework_dot_resource__handle__pb2
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/core/framework/resource_handle_pb2.py", line 16, in <module>
    from tensorflow.core.framework import tensor_shape_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__shape__pb2
  File "/opt/anaconda3/lib/python3.9/site-packages/tensorflow/core/framework/tensor_shape_pb2.py", line 36, in <module>
    _descriptor.FieldDescriptor(
  File "/opt/anaconda3/lib/python3.9/site-packages/google/protobuf/descriptor.py", line 553, in __new__
    _message.Message._CheckCalledFromGeneratedFile()
TypeError: Descriptors cannot be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/anaconda3/bin/insanely-fast-whisper", line 5, in <module>
    from insanely_fast_whisper.cli import main
  File "/opt/anaconda3/lib/python3.9/site-packages/insanely_fast_whisper/cli.py", line 4, in <module>
    from transformers import pipeline
  File "<frozen importlib._bootstrap>", line 1055, in _handle_fromlist
  File "/opt/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1343, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/opt/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1355, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.pipelines because of the following error (look up to see its traceback):
Descriptors cannot be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
```
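The two workarounds suggested in the protobuf error message can be applied directly from the shell. A minimal sketch (the `3.20.3` pin is an assumption, any 3.20.x or earlier release should satisfy the error message's condition; the final command just re-runs the invocation from the report):

```shell
# Workaround 1: downgrade protobuf to a pre-4.x release
pip install "protobuf<=3.20.3"

# Verify which version is now installed
python -c "import google.protobuf; print(google.protobuf.__version__)"

# Workaround 2 (alternative): force the pure-Python protobuf parser.
# This avoids the descriptor check but is much slower.
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python

# Then retry the original command
insanely-fast-whisper --model openai/whisper-base.en /Users/i/Desktop/Steve_Prince.wav
```

Workaround 1 is usually preferable since the environment-variable route trades the crash for slower parsing on every run.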

ifuchs · Nov 28 '23 22:11

You might have a dependency problem. You could try my front end to insanely-fast-whisper, `transcribe-anything --device insane`, and see if that solves your problem. I handle dependencies differently to avoid issues like this, and you don't need conda to install it either: just run `pip install transcribe-anything`, and it will grab the GPU version of torch if the program detects that you have `nvidia-smi` installed.
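The suggestion above boils down to two commands. A sketch, assuming the audio path from the original report (the positional-argument form is an assumption; check `transcribe-anything --help` for the exact usage):

```shell
# Install the front end; it manages its own dependencies,
# so no conda environment is needed
pip install transcribe-anything

# Run with the insanely-fast-whisper backend; --device insane is
# the flag named in the comment above
transcribe-anything --device insane /Users/i/Desktop/Steve_Prince.wav
```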

zackees · Jan 13 '24 20:01