FlagAI
[Question]: Any requirements.txt for AquilaChat & AquilaCode?
Description
from torch._six import inf
ModuleNotFoundError: No module named 'torch._six'
You can update to the latest version of FlagAI; this problem happens with torch >= 2.0, where torch._six was removed.
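For readers who can't upgrade yet, one common workaround (a sketch, not FlagAI's official fix; it assumes the failing import is the only torch._six usage) is a version-guarded import, since PyTorch >= 2.0 exposes inf directly:

# Hedged sketch: torch._six was removed in PyTorch 2.0.
# This guard keeps the import working on both old and new torch.
try:
    from torch._six import inf  # PyTorch < 2.0
except ImportError:
    from torch import inf       # PyTorch >= 2.0: torch.inf is available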
Thanks! But there is another problem in my new conda env, created just for FlagAI (when running pip install -U flagai):
python = 3.11
torch = 1.13.1
flagai = 1.7.2
Is this tokenizers the Hugging Face tokenizers? If so, can I use a prebuilt wheel from PyPI (and which version)?
Upgrading pip and installing Rust via https://rustup.rs/ didn't help:
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [51 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-cpython-311
creating build/lib.linux-x86_64-cpython-311/tokenizers
copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers
creating build/lib.linux-x86_64-cpython-311/tokenizers/models
copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/models
creating build/lib.linux-x86_64-cpython-311/tokenizers/decoders
copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
creating build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
creating build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
creating build/lib.linux-x86_64-cpython-311/tokenizers/processors
copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
creating build/lib.linux-x86_64-cpython-311/tokenizers/trainers
copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
creating build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
creating build/lib.linux-x86_64-cpython-311/tokenizers/tools
copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/tools
copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-cpython-311/tokenizers/tools
copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers
copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/models
copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-cpython-311/tokenizers/tools
running build_ext
running build_rust
error: can't find Rust compiler
If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
To update pip, run:
pip install --upgrade pip
and then retry package installation.
If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
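For anyone landing here with the same failure: pip's message above suggests two remedies, which also bear on the earlier prebuilt-wheel question. A hedged sketch, assuming a Linux shell and rustup's default install location (nothing below is confirmed by this thread):

# Remedy 1: prefer a prebuilt wheel. A newer pip resolves manylinux wheels
# better; --only-binary refuses to build from source at all.
pip install --upgrade pip
pip install --only-binary :all: tokenizers

# Remedy 2: build from source. After installing Rust via https://rustup.rs/,
# the toolchain lives in ~/.cargo/bin and must be on PATH in this shell.
source "$HOME/.cargo/env"
rustc --version        # should print a version, not "command not found"
pip install -U flagai  # retry the original install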
The tokenizer here is not the Hugging Face tokenizer.
Closing this for now; please reopen the issue if the problem persists. Thanks!