tts-generation-webui
xformers | Triton is not available, some optimizations will not be enabled.
2023-09-19 10:54:00 | WARNING | xformers | WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.0.0+cu118 with CUDA 1108 (you have 2.0.0+cpu)
Python 3.10.11 (you have 3.10.13)
Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
Memory-efficient attention, SwiGLU, sparse and more won't be available.
Set XFORMERS_MORE_DETAILS=1 for more details
2023-09-19 10:54:01 | WARNING | xformers | Triton is not available, some optimizations will not be enabled.
This is just a warning: No module named 'triton'
(base) D:\AITech\tts-generation-webui>conda activate ttsgen
(ttsgen) D:\AITech\tts-generation-webui>pip list
Package Version
------------------------- ------------
absl-py 1.4.0
accelerate 0.23.0
aiofiles 23.2.1
aiohttp 3.8.5
aiosignal 1.3.1
altair 5.1.1
annotated-types 0.5.0
antlr4-python3-runtime 4.8
anyio 3.7.1
async-timeout 4.0.3
attrs 23.1.0
audiocraft 1.0.0
audiolm-pytorch 1.2.28
audioread 3.0.0
av 10.0.0
bark-hubert-quantizer 0.0.5
beartype 0.15.0
bitarray 2.8.1
blis 0.7.10
boto3 1.28.49
botocore 1.31.49
cachetools 5.3.1
catalogue 2.0.9
certifi 2023.7.22
cffi 1.15.1
charset-normalizer 3.2.0
click 8.1.7
cloudpickle 2.2.1
colorama 0.4.6
colorlog 6.7.0
confection 0.1.3
contourpy 1.1.1
cycler 0.11.0
cymem 2.0.8
Cython 0.29.36
decorator 5.1.1
demucs 4.0.1
docopt 0.6.2
dora-search 0.1.12
einops 0.6.1
ema-pytorch 0.2.3
encodec 0.1.1
exceptiongroup 1.1.3
fairseq 0.12.4
faiss-cpu 1.7.4
fastapi 0.103.1
ffmpeg-python 0.2.0
ffmpy 0.3.1
filelock 3.12.4
flashy 0.0.2
fonttools 4.42.1
frozenlist 1.4.0
fsspec 2023.9.1
functorch 2.0.0
funcy 2.0
future 0.18.3
google-auth 2.23.0
google-auth-oauthlib 1.0.0
gradio 3.35.2
gradio_client 0.5.0
grpcio 1.58.0
h11 0.14.0
httpcore 0.18.0
httpx 0.25.0
huggingface-hub 0.17.1
hydra-colorlog 1.2.0
hydra-core 1.0.7
idna 3.4
inflect 7.0.0
Jinja2 3.1.2
jmespath 1.0.1
joblib 1.3.2
json5 0.9.14
jsonschema 4.19.0
jsonschema-specifications 2023.7.1
julius 0.2.7
kiwisolver 1.4.5
lameenc 1.6.1
langcodes 3.3.0
lazy_loader 0.3
librosa 0.9.2
lightning-utilities 0.9.0
linkify-it-py 2.0.2
lion-pytorch 0.1.2
llvmlite 0.39.0
local-attention 1.8.6
lxml 4.9.3
Markdown 3.4.4
markdown-it-py 2.2.0
MarkupSafe 2.1.3
matplotlib 3.8.0
matplotlib-inline 0.1.6
mdit-py-plugins 0.3.3
mdurl 0.1.2
mpmath 1.3.0
msgpack 1.0.5
multidict 6.0.4
murmurhash 1.0.10
mypy-extensions 1.0.0
networkx 3.1
num2words 0.5.12
numba 0.56.4
numpy 1.23.5
oauthlib 3.2.2
omegaconf 2.0.6
openunmix 1.2.1
orjson 3.9.7
packaging 23.1
pandas 2.1.0
pathy 0.10.2
Pillow 9.3.0
pip 23.2.1
platformdirs 3.10.0
pooch 1.7.0
portalocker 2.8.2
praat-parselmouth 0.4.3
preshed 3.0.9
progressbar 2.5
protobuf 4.24.3
psutil 5.9.5
pyasn1 0.4.8
pyasn1-modules 0.2.8
pycparser 2.21
pydantic 1.10.12
pydantic_core 2.6.3
pydub 0.25.1
Pygments 2.16.1
pyparsing 3.1.1
pyre-extensions 0.0.29
python-dateutil 2.8.2
python-dotenv 1.0.0
python-multipart 0.0.6
pytz 2023.3.post1
pywin32 306
pyworld 0.3.4
PyYAML 6.0.1
referencing 0.30.2
regex 2023.8.8
requests 2.31.0
requests-oauthlib 1.3.1
resampy 0.4.2
retrying 1.3.4
rotary-embedding-torch 0.3.0
rpds-py 0.10.3
rsa 4.9
rvc-beta 0.1.1
s3transfer 0.6.2
sacrebleu 2.3.1
safetensors 0.3.1
scikit-learn 1.3.0
scipy 1.9.3
semantic-version 2.10.0
sentencepiece 0.1.99
setuptools 68.0.0
six 1.16.0
smart-open 6.4.0
sniffio 1.3.0
soundfile 0.12.1
soxr 0.3.6
spacy 3.5.2
spacy-legacy 3.0.12
spacy-loggers 1.0.5
srsly 2.4.7
starlette 0.27.0
submitit 1.4.5
suno-bark 0.1.0
sympy 1.12
tabulate 0.9.0
tensorboard 2.14.0
tensorboard-data-server 0.7.1
tensorboard-plugin-wit 1.8.1
tensorboardX 2.6.2.2
thinc 8.1.12
threadpoolctl 3.2.0
tokenizers 0.13.3
toolz 0.12.0
torch 2.0.0
torchaudio 2.0.1
torchcrepe 0.0.20
torchgen 0.0.1
torchmetrics 1.1.2
tornado 6.3.3
TorToiSe 2.8.0
tqdm 4.66.1
traitlets 5.10.0
transformers 4.31.0
treetable 0.2.5
typer 0.7.0
typing_extensions 4.8.0
typing-inspect 0.9.0
tzdata 2023.3
uc-micro-py 1.0.2
Unidecode 1.3.6
urllib3 1.26.16
uvicorn 0.21.1
vector-quantize-pytorch 1.7.1
vocos 0.0.2
wasabi 1.1.2
websockets 11.0.3
Werkzeug 2.3.7
wheel 0.38.4
xformers 0.0.19
yarl 1.9.2
(ttsgen) D:\AITech\tts-generation-webui>python server.py
Loading extensions:
Loaded extension: callback_save_generation_ffmpeg
Loaded extension: callback_save_generation_musicgen_ffmpeg
Loaded extension: empty_extension
Loaded 2 callback_save_generation extensions.
Loaded 1 callback_save_generation_musicgen extensions.
2023-09-19 10:54:00 | WARNING | xformers | WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.0.0+cu118 with CUDA 1108 (you have 2.0.0+cpu)
Python 3.10.11 (you have 3.10.13)
Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
Memory-efficient attention, SwiGLU, sparse and more won't be available.
Set XFORMERS_MORE_DETAILS=1 for more details
2023-09-19 10:54:01 | WARNING | xformers | Triton is not available, some optimizations will not be enabled.
This is just a warning: No module named 'triton'
Starting Gradio server...
Gradio interface options:
inline: False
inbrowser: True
share: False
debug: False
enable_queue: True
max_threads: 40
auth: None
auth_message: None
prevent_thread_lock: False
show_error: False
server_name: 0.0.0.0
server_port: None
show_tips: False
height: 500
width: 100%
favicon_path: None
ssl_keyfile: None
ssl_certfile: None
ssl_keyfile_password: None
ssl_verify: True
quiet: True
show_api: True
file_directories: None
_frontend: True
Running on local URL: http://0.0.0.0:7860
You can still run the project even with xformers disabled.
However, I see that you have PyTorch for CPU. I highly recommend installing torch via conda, as is done in my one-click installers, and also installing ffmpeg and Python 3.10.11.
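As a quick generic check (not from the original comment, just a sketch), you can confirm which torch build is active inside the env:
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
A CPU-only install prints something like 2.0.0+cpu False, while a cu118 pip wheel prints something like 2.0.0+cu118 True.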
> You can still run the project even with xformers disabled.
> However, I see that you have PyTorch for CPU. I highly recommend installing torch via conda, as is done in my one-click installers, and also installing ffmpeg and Python 3.10.11.
My GPU is an Nvidia 4070 Ti, and I don't know why the CPU version of torch got installed. I'll try uninstalling and reinstalling torch.
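For reference, a minimal reinstall sketch (assuming the cu118 wheels are the target; it mirrors the pip command that appears later in this thread):
rem remove the CPU wheels first, then pull the CUDA 11.8 builds
pip uninstall -y torch torchvision torchaudio
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118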
You can check https://github.com/rsxdalv/one-click-installers-tts for the conda setup that is recommended/verified to work.
Thank you!
I reinstalled python3.10.11 and torch for cu118, but the WARNING still exists.
Loading extensions:
Loaded extension: callback_save_generation_ffmpeg
Loaded extension: callback_save_generation_musicgen_ffmpeg
Loaded extension: empty_extension
Loaded 2 callback_save_generation extensions.
Loaded 1 callback_save_generation_musicgen extensions.
2023-09-20 13:40:50 | WARNING | xformers | A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
2023-09-20 13:40:50 | WARNING | xformers | Triton is not available, some optimizations will not be enabled.
This is just a warning: No module named 'triton'
Starting Gradio server...
Gradio interface options:
inline: False
inbrowser: True
share: False
debug: False
enable_queue: True
max_threads: 40
auth: None
auth_message: None
prevent_thread_lock: False
show_error: False
server_name: 0.0.0.0
server_port: None
show_tips: False
height: 500
width: 100%
favicon_path: None
ssl_keyfile: None
ssl_certfile: None
ssl_keyfile_password: None
ssl_verify: True
quiet: True
show_api: True
file_directories: None
_frontend: True
Running on local URL: http://0.0.0.0:7860
> I reinstalled python3.10.11 and torch for cu118, but the WARNING still exists.
Ok, this is much better. As for triton, there are ways to install it but I never found it necessary, the warning is just there. I would say you can use the UI now.
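Side note, and an assumption rather than something verified in this thread: on Linux the Triton warning can usually be cleared with the official wheel, while no official Windows wheel was published at the time, which is why it is normally just ignored on Windows:
pip install triton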
> Ok, this is much better. As for triton, there are ways to install it but I never found it necessary, the warning is just there.
> I would say you can use the UI now.
Thank you!
> As for triton, there are ways to install it but I never found it necessary, the warning is just there.
> I would say you can use the UI now.
May I ask how you replaced these packages in conda? I used the command
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.8 -c pytorch -c nvidia
but it only shows errors about conflicts: "ClobberError: This transaction has incompatible packages due to a shared path."
git clone https://github.com/rsxdalv/tts-generation-webui.git
conda create -n ttsgen python=3.10.11
conda activate ttsgen
(ttsgen) D:\AITech\tts-generation-webui> pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
(ttsgen) D:\AITech\tts-generation-webui> pip install -r requirements.txt
(ttsgen) D:\AITech\tts-generation-webui> python server.py
You need to install Microsoft C++ Build Tools in advance and add ffmpeg to the system PATH variable. In fact, the author of this repository has explained the installation steps completely in the README.
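As a quick sanity check (generic commands, not part of the original instructions), you can confirm the Python version and that ffmpeg is visible on PATH from the activated env:
(ttsgen) D:\AITech\tts-generation-webui> python --version
(ttsgen) D:\AITech\tts-generation-webui> ffmpeg -version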
> pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
> (ttsgen) D:\AITech\tts-generation-webui> pip install -r requirements.txt
> (ttsgen) D:\AITech\tts-generation-webui> python server.py
Thank you for your help. The thing is, I used the automatic installation (start_windows.bat); it installed everything, but apparently it used the CPU version even though I specified that I have an Nvidia card. I think that's what happened in your case too, so I ended up with the same issue you described in your initial post and had to reinstall the packages for the CUDA version. Looks like I'll have to redo everything manually then, thanks.
> conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.8 -c pytorch -c nvidia
> but it only shows errors about conflicts: "ClobberError: This transaction has incompatible packages due to a shared path."
That's a new error for me, are there any similar reports on the net about it?
Searching the net suggests this error is conda-related. Either something is wrong with the commands, or the setup has to be changed for some users due to differences in their software.
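A hedged suggestion, not verified against this setup: ClobberError is generally reported as a conda package-cache/solver issue, so clearing the cache and retrying, or skipping conda for torch and using the pip cu118 wheels shown earlier in this thread, are common workarounds:
rem clear the conda package cache, then retry the conda install
conda clean --all -y
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.8 -c pytorch -c nvidia
rem or, instead, fall back to the pip wheels used earlier in this thread
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118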