LLaMA-Factory
TF-TRT Warning: Could not find TensorRT
Reminder
- [X] I have read the README and searched the existing issues.
Reproduction
```
root@dsw-60082-56c79f4648-hdxv7:/mnt/workspace/LLaMA-Factory# CUDA_VISIBLE_DEVICES=0 USE_MODELSCOPE_HUB=1 python src/webui.py
2024-05-10 11:33:14.068255: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX512F FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-05-10 11:33:15.185131: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
[2024-05-10 11:33:19,738] [INFO] [real_accelerator.py:161:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Running on local URL: http://0.0.0.0:7860

To create a public link, set `share=True` in `launch()`.
```
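The warning is printed by TensorFlow itself rather than by LLaMA-Factory: when TensorFlow is imported somewhere in the launch chain and the TensorRT shared libraries are not installed, its TF-TRT integration logs this message at import time. A hypothetical minimal reproduction, assuming TensorFlow is installed in the environment but TensorRT is not:

```python
# Hypothetical minimal reproduction: the message comes from TensorFlow's
# TF-TRT integration when the TensorRT shared libraries cannot be found;
# LLaMA-Factory itself does not use TensorRT.
import tensorflow as tf  # prints "TF-TRT Warning: Could not find TensorRT"

print(tf.__version__)  # the import above has already triggered the warning
```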
Expected behavior
How can I stop this warning from being reported?
System Info
No response
Others
No response
Just ignore it.
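For anyone who would rather hide the message than ignore it, a minimal sketch: `TF_CPP_MIN_LOG_LEVEL` is TensorFlow's environment switch for its C++-side logging, and setting it before TensorFlow is imported should suppress this warning (level 2 hides INFO and WARNING, level 3 hides ERROR as well).

```python
# Sketch: suppress TensorFlow's C++-side log output, including the TF-TRT
# warning, by setting TF_CPP_MIN_LOG_LEVEL before TensorFlow is imported.
# 0 = all messages, 1 = hide INFO, 2 = hide INFO + WARNING, 3 = hide ERROR too.
import os

os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"  # must run before `import tensorflow`

# ... continue with the normal launch code after this point.
```

The same effect should be achievable from the shell by prefixing the launch command, e.g. `TF_CPP_MIN_LOG_LEVEL=2 CUDA_VISIBLE_DEVICES=0 USE_MODELSCOPE_HUB=1 python src/webui.py`.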
thank you~