LTX-Video
Bug - Setting "offload_to_cpu" to True causes Error
If the offload_to_cpu parameter is True and torch.cuda.is_available() is False, the script throws the following error:
TypeError: '<' not supported between instances of 'NoneType' and 'int'
The error is caused by line 282 in inference.py:
offload_to_cpu = False if not args.offload_to_cpu else get_total_gpu_memory() < 30
To fix this, get_total_gpu_memory() should probably return 0 instead of None when CUDA is not available.
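A minimal sketch of that fix, assuming get_total_gpu_memory() reads the device memory via torch.cuda (the actual implementation in inference.py may differ):

```python
def get_total_gpu_memory():
    """Return total GPU memory in GiB, or 0 if CUDA is unavailable."""
    try:
        import torch
        if torch.cuda.is_available():
            return torch.cuda.get_device_properties(0).total_memory / (1024 ** 3)
    except ImportError:
        pass
    # No CUDA: return 0 instead of None so `get_total_gpu_memory() < 30`
    # is a valid int comparison rather than a TypeError.
    return 0

# The guarded line from inference.py then evaluates safely on CPU-only hosts:
offload_requested = True  # stand-in for args.offload_to_cpu
offload_to_cpu = False if not offload_requested else get_total_gpu_memory() < 30
```

With this change, a CPU-only machine yields 0 < 30, so offload_to_cpu resolves to True whenever the user requested offloading.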
Thanks for reporting this issue. We will fix this.