CUDA import error while running `demo_gr.py`
Description of Issue
I am trying to run inference with the FLUX.1 Schnell model by running demo_gr.py, but I am getting a CUDA import error, even though torch.cuda.is_available() returns True.
I am using Python 3.10 and CUDA 11.8 on a remote Ubuntu-based instance. Could you please guide me on how to resolve this?
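For reference, this is a minimal sketch of the check I ran (it only assumes the torch and cuda module names that appear in the traceback below); PyTorch detects the GPU, but the cuda Python bindings are not importable:

import torch

# PyTorch ships its own CUDA runtime, so this only confirms the GPU is visible
print(torch.cuda.is_available())   # prints True here
print(torch.version.cuda)          # CUDA version PyTorch was built against

# flux.cli separately imports the cuda-python bindings, which are not installed
try:
    from cuda import cudart
    print("cuda-python bindings found")
except ModuleNotFoundError as exc:
    print("missing module:", exc.name)   # prints "cuda"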
Steps to Reproduce Error
- The environment was set up with the following commands:
git clone https://github.com/black-forest-labs/flux
conda create -n flux python=3.10
conda activate flux
cd flux
pip install -e ".[all]"
- The inference script was run with the following command:
python demo_gr.py --name flux-schnell --share
Complete Error Output
Traceback (most recent call last):
File "/home/flux/demo_gr.py", line 12, in <module>
from flux.cli import SamplingOptions
File "/home/flux/src/flux/cli.py", line 8, in <module>
from cuda import cudart
ModuleNotFoundError: No module named 'cuda'
Update
I ran pip install cuda-python, and now it fails with ModuleNotFoundError: No module named 'tensorrt'.
Please guide me on what to do, and confirm whether I need to follow the TensorRT setup method instead of the first one to run the Gradio app. Could you also mention the GPU requirements for running the FLUX.1 Schnell inference script? Thanks!
You can try running pip install -e ".[tensorrt]". It seems to work without doing all the enroot setup mentioned in the README.
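As a quick post-install check, here is a minimal sketch (it only assumes the cuda-python and tensorrt module names from the errors above) to confirm both imports resolve before launching the Gradio app:

# Smoke test after pip install -e ".[tensorrt]": both previously missing modules should import.
import tensorrt
from cuda import cudart   # provided by the cuda-python package

print("tensorrt version:", tensorrt.__version__)
err, count = cudart.cudaGetDeviceCount()   # cuda-python returns (status, value) tuples
print("cudaGetDeviceCount:", err, "-> devices:", count)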