Manuel030

12 comments of Manuel030

Has someone found a fix for the original problem?

I tried both. This results in cryptic error messages:

```
--- stderr
fatal: not a git repository (or any parent up to mount point /)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM...
```

Thanks @Rocketknight1. NumPy isn't available when executing in graph mode, and I would expect the generate pass to be compatible with TensorFlow's graph execution mode.

Also, the docstring states that it should be a `tf.Tensor`.

Sure:

```
File "/home/manuel/Projects/whisper-finetune/issue.py", line 14, in
    outputs = model.generate(
File "/home/manuel/Projects/whisper-finetune/venv/lib/python3.10/site-packages/transformers/models/whisper/modeling_tf_whisper.py", line 1646, in generate
    prompt_ids = prompt_ids.tolist()
File "/home/manuel/Projects/whisper-finetune/venv/lib/python3.10/site-packages/tensorflow/python/framework/ops.py", line 440, in __getattr__
    raise AttributeError(
AttributeError: EagerTensor object...
```
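For context, a minimal sketch of why that call fails (the token ids below are placeholders, not the real Whisper prompt ids):

```python
import tensorflow as tf

# Placeholder prompt token ids, standing in for the real prompt_ids.
prompt_ids = tf.constant([50258, 50259, 50359])

# `.tolist()` is a NumPy array method; a tf.Tensor does not have it,
# hence the AttributeError when `generate` assumes a NumPy input.
assert not hasattr(prompt_ids, "tolist")

# In eager mode the workaround is a round-trip through NumPy:
assert prompt_ids.numpy().tolist() == [50258, 50259, 50359]

# In graph mode (e.g. while tracing for a TFLite export) even
# `.numpy()` is unavailable, because symbolic tensors carry no
# concrete values at trace time.
```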

Unfortunately, patching the high-level `generate` as suggested by @Rocketknight1 does not work. My use case is an export to the TFLite format.
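For reference, a sketch of the export path involved, using a toy `tf.function` rather than the actual Whisper `generate` (the function body and shapes are placeholder assumptions). Everything inside the converted function is traced in graph mode, which is why NumPy-only calls such as `.tolist()` break the conversion:

```python
import tensorflow as tf

# Toy stand-in for the model's generate pass.
@tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
def toy_generate(x):
    return tf.matmul(x, tf.ones([8, 4]))

# Convert the traced concrete function to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [toy_generate.get_concrete_function()]
)
tflite_bytes = converter.convert()
print(f"converted model: {len(tflite_bytes)} bytes")
```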

Hi @sawantkumar, sure, you can replicate the performance difference with the official TensorFlow Lite Python example here: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/examples/python The example runs inference on a MobileNet model for an example image....

I've compiled via:

```
Bazel Build Command: "bazel" "--output_base=/target/x86_64-unknown-linux-gnu/release/build/tflitec-d675effdd7095bee/out/tensorflow_v2.9.1_output_base" "build" "-c" "opt" "--config=linux" "//tensorflow/lite/c/tmp:tensorflowlite_c" "--copt=-O3"
```

It is a use case for me, since I distribute software for mobile and desktop with a dependency on `libtensorflowlite_c` and have to distribute the shared library too. Also, the...