
Failed to import transformers.generation.utils because of the following error

Open · Jeffwan opened this issue 7 months ago · 1 comment

🐛 Describe the bug

./benchmark.sh all
+ CONFIG_FILE=config/base.sh
+ [[ -f config/base.sh ]]
+ echo '[INFO] Loading configuration from config/base.sh'
[INFO] Loading configuration from config/base.sh
+ source config/base.sh
++ export MODEL_NAME=deepseek-ai/deepseek-llm-7b-chat
++ MODEL_NAME=deepseek-ai/deepseek-llm-7b-chat
++ export TOKENIZER=deepseek-ai/deepseek-llm-7b-chat
++ TOKENIZER=deepseek-ai/deepseek-llm-7b-chat
++ export DATASET_DIR=./output/dataset/
++ DATASET_DIR=./output/dataset/
++ export PROMPT_TYPE=synthetic_multiturn
++ PROMPT_TYPE=synthetic_multiturn
++ export DATASET_FILE=./output/dataset//synthetic_multiturn.jsonl
++ DATASET_FILE=./output/dataset//synthetic_multiturn.jsonl
++ export WORKLOAD_TYPE=synthetic
++ WORKLOAD_TYPE=synthetic
++ export INTERVAL_MS=1000
++ INTERVAL_MS=1000
++ export DURATION_MS=300000
++ DURATION_MS=300000
++ export WORKLOAD_DIR=./output/workload/synthetic
++ WORKLOAD_DIR=./output/workload/synthetic
++ export WORKLOAD_FILE=./output/workload/synthetic/workload.jsonl
++ WORKLOAD_FILE=./output/workload/synthetic/workload.jsonl
++ export CLIENT_OUTPUT=./output/client_output
++ CLIENT_OUTPUT=./output/client_output
++ export ENDPOINT=http://localhost:8888
++ ENDPOINT=http://localhost:8888
++ export API_KEY=
++ API_KEY=
++ export TARGET_MODEL=llama-3-8b-instruct
++ TARGET_MODEL=llama-3-8b-instruct
++ export STREAMING_ENABLED=true
++ STREAMING_ENABLED=true
++ export CLIENT_POOL_SIZE=16
++ CLIENT_POOL_SIZE=16
++ export OUTPUT_TOKEN_LIMIT=128
++ OUTPUT_TOKEN_LIMIT=128
++ export TRACE_OUTPUT=./output/trace_analysis
++ TRACE_OUTPUT=./output/trace_analysis
++ export GOODPUT_TARGET=tpot:0.5
++ GOODPUT_TARGET=tpot:0.5
+ mkdir -p ./output/dataset/ ./output/workload/synthetic ./output/client_output ./output/trace_analysis
+ echo '========== Starting Benchmark =========='
========== Starting Benchmark ==========
+ COMMAND=all
+ case "$COMMAND" in
+ generate_dataset
+ echo '[INFO] Generating synthetic dataset synthetic_multiturn...'
[INFO] Generating synthetic dataset synthetic_multiturn...
+ case "$PROMPT_TYPE" in
+ source config/dataset/synthetic_multiturn.sh
++ export PROMPT_LENGTH=100
++ PROMPT_LENGTH=100
++ export PROMPT_STD=10
++ PROMPT_STD=10
++ export NUM_TURNS=10
++ NUM_TURNS=10
++ export NUM_TURNS_STD=1
++ NUM_TURNS_STD=1
++ export NUM_SESSIONS=10
++ NUM_SESSIONS=10
++ export NUM_SESSIONS_STD=1
++ NUM_SESSIONS_STD=1
+ python generator/dataset-generator/multiturn_prefix_sharing_dataset.py --prompt-length-mean 100 --prompt-length-std 10 --num-turns-mean 10 --num-turns-std 1 --num-sessions-mean 10 --num-sessions-std 1 --output ./output/dataset//synthetic_multiturn.jsonl

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.2.5 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/ubuntu/aibrix/benchmarks/generator/dataset-generator/multiturn_prefix_sharing_dataset.py", line 4, in <module>
    from transformers import AutoTokenizer
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/__init__.py", line 26, in <module>
    from . import dependency_versions_check
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
    from .utils.versions import require_version, require_version_core
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/__init__.py", line 25, in <module>
    from .chat_template_utils import DocstringParsingException, TypeHintParsingException, get_json_schema
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/chat_template_utils.py", line 40, in <module>
    from torch import Tensor
  File "/usr/lib/python3/dist-packages/torch/__init__.py", line 2222, in <module>
    from torch import quantization as quantization  # usort: skip
  File "/usr/lib/python3/dist-packages/torch/quantization/__init__.py", line 2, in <module>
    from .fake_quantize import *  # noqa: F403
  File "/usr/lib/python3/dist-packages/torch/quantization/fake_quantize.py", line 10, in <module>
    from torch.ao.quantization.fake_quantize import (
  File "/usr/lib/python3/dist-packages/torch/ao/quantization/__init__.py", line 12, in <module>
    from .pt2e._numeric_debugger import (  # noqa: F401
  File "/usr/lib/python3/dist-packages/torch/ao/quantization/pt2e/_numeric_debugger.py", line 9, in <module>
    from torch.export import ExportedProgram
  File "/usr/lib/python3/dist-packages/torch/export/__init__.py", line 68, in <module>
    from .decomp_utils import CustomDecompTable
  File "/usr/lib/python3/dist-packages/torch/export/decomp_utils.py", line 5, in <module>
    from torch._export.utils import (
  File "/usr/lib/python3/dist-packages/torch/_export/__init__.py", line 48, in <module>
    from .wrappers import _wrap_submodules
  File "/usr/lib/python3/dist-packages/torch/_export/wrappers.py", line 7, in <module>
    from torch._higher_order_ops.strict_mode import strict_mode
  File "/usr/lib/python3/dist-packages/torch/_higher_order_ops/__init__.py", line 1, in <module>
    from torch._higher_order_ops.cond import cond
  File "/usr/lib/python3/dist-packages/torch/_higher_order_ops/cond.py", line 9, in <module>
    import torch._subclasses.functional_tensor
  File "/usr/lib/python3/dist-packages/torch/_subclasses/functional_tensor.py", line 45, in <module>
    class FunctionalTensor(torch.Tensor):
  File "/usr/lib/python3/dist-packages/torch/_subclasses/functional_tensor.py", line 275, in FunctionalTensor
    cpu = _conversion_method_template(device=torch.device("cpu"))
/usr/lib/python3/dist-packages/torch/_subclasses/functional_tensor.py:275: UserWarning: Failed to initialize NumPy: _ARRAY_API not found (Triggered internally at ./torch/csrc/utils/tensor_numpy.cpp:81.)
  cpu = _conversion_method_template(device=torch.device("cpu"))
/usr/lib/python3/dist-packages/scipy/__init__.py:146: UserWarning: A NumPy version >=1.17.3 and <1.25.0 is required for this version of SciPy (detected version 2.2.5
  warnings.warn(f"A NumPy version >={np_minversion} and <{np_maxversion}"

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.2.5 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/ubuntu/aibrix/benchmarks/generator/dataset-generator/multiturn_prefix_sharing_dataset.py", line 4, in <module>
    from transformers import AutoTokenizer
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1956, in __getattr__
    value = getattr(module, name)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1955, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1967, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 38, in <module>
    from .auto_factory import _LazyAutoMapping
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 40, in <module>
    from ...generation import GenerationMixin
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1955, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1967, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/generation/utils.py", line 30, in <module>
    from transformers.generation.candidate_generator import AssistantVocabTranslatorCache
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/generation/candidate_generator.py", line 27, in <module>
    from sklearn.metrics import roc_curve
  File "/usr/lib/python3/dist-packages/sklearn/__init__.py", line 80, in <module>
    from .base import clone
  File "/usr/lib/python3/dist-packages/sklearn/base.py", line 21, in <module>
    from .utils import _IS_32BIT
  File "/usr/lib/python3/dist-packages/sklearn/utils/__init__.py", line 20, in <module>
    from scipy.sparse import issparse
  File "/usr/lib/python3/dist-packages/scipy/sparse/__init__.py", line 267, in <module>
    from ._csr import *
  File "/usr/lib/python3/dist-packages/scipy/sparse/_csr.py", line 10, in <module>
    from ._sparsetools import (csr_tocsc, csr_tobsr, csr_count_blocks,
AttributeError: _ARRAY_API not found
Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1967, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/generation/utils.py", line 30, in <module>
    from transformers.generation.candidate_generator import AssistantVocabTranslatorCache
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/generation/candidate_generator.py", line 27, in <module>
    from sklearn.metrics import roc_curve
  File "/usr/lib/python3/dist-packages/sklearn/__init__.py", line 80, in <module>
    from .base import clone
  File "/usr/lib/python3/dist-packages/sklearn/base.py", line 21, in <module>
    from .utils import _IS_32BIT
  File "/usr/lib/python3/dist-packages/sklearn/utils/__init__.py", line 20, in <module>
    from scipy.sparse import issparse
  File "/usr/lib/python3/dist-packages/scipy/sparse/__init__.py", line 267, in <module>
    from ._csr import *
  File "/usr/lib/python3/dist-packages/scipy/sparse/_csr.py", line 10, in <module>
    from ._sparsetools import (csr_tocsc, csr_tobsr, csr_count_blocks,
ImportError: numpy.core.multiarray failed to import

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1967, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 38, in <module>
    from .auto_factory import _LazyAutoMapping
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 40, in <module>
    from ...generation import GenerationMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1955, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1969, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
numpy.core.multiarray failed to import

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/ubuntu/aibrix/benchmarks/generator/dataset-generator/multiturn_prefix_sharing_dataset.py", line 4, in <module>
    from transformers import AutoTokenizer
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1956, in __getattr__
    value = getattr(module, name)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1955, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1969, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto.tokenization_auto because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
numpy.core.multiarray failed to import
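
The warning embedded in the log already points at the likely workaround: torch, scipy, and scikit-learn under /usr/lib/python3/dist-packages were built against NumPy 1.x, so pinning NumPy below 2.0 should let them import again. A minimal, untested sketch of that workaround:

# pin NumPy to a 1.x release, as the warning itself suggests, then retry
pip3 install --user "numpy<2"
./benchmark.sh all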

Steps to Reproduce

git clone https://github.com/vllm-project/aibrix.git
cd aibrix/benchmarks
./benchmark.sh all
pip3 install transformers
./benchmark.sh all

It failed again after installing transformers.
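
To narrow down whether this is an environment mismatch, it may help to check which NumPy/SciPy/torch builds the interpreter actually resolves, since the traceback mixes /usr/lib/python3/dist-packages with ~/.local/lib/python3.10/site-packages. A rough diagnostic, assuming python3 is the same interpreter benchmark.sh invokes:

# print the versions the interpreter picks up, then list the relevant installed packages
python3 -c "import numpy, scipy, torch; print(numpy.__version__, scipy.__version__, torch.__version__)"
python3 -m pip list | grep -iE 'numpy|scipy|torch|transformers|scikit'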

Expected behavior

It should work without any errors.

Environment

Python 3.10
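
A possible way to avoid mixing the system dist-packages (torch, scipy, scikit-learn built against NumPy 1.x) with the user-site transformers install could be to run the generators from a clean virtual environment. A rough sketch; the package list below is guessed from the imports in the traceback, not taken from the repo's own requirements, which would be the better source if present:

# create an isolated environment so only one NumPy is visible
python3 -m venv .venv
source .venv/bin/activate
pip install "numpy<2" torch transformers scikit-learn
./benchmark.sh all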

Jeffwan, Apr 30 '25 00:04

This is probably an issue with my environment. I'd just like to double-check whether others have run into similar problems.

Jeffwan, Apr 30 '25 16:04