
Bug - Setting "offload_to_cpu" to True causes Error

Open · Angel996 opened this issue 11 months ago · 1 comment

If the offload_to_cpu parameter is True and torch.cuda.is_available() is False, the script throws an error:

"TypeError '<' not supported between instances of 'NoneType' and 'int'

The error is caused by line 282 in inference.py:

offload_to_cpu = False if not args.offload_to_cpu else get_total_gpu_memory() < 30

To fix this, get_total_gpu_memory() should probably return 0 instead of None if CUDA is not available.
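
A minimal sketch of that suggested fix, assuming the helper reads total device memory via torch.cuda (the exact implementation in inference.py may differ):

```python
import torch


def get_total_gpu_memory() -> float:
    """Return total GPU memory in GB, or 0 when no CUDA device is available."""
    if torch.cuda.is_available():
        return torch.cuda.get_device_properties(0).total_memory / (1024 ** 3)
    # Returning 0 keeps the `get_total_gpu_memory() < 30` comparison valid
    # on CPU-only machines instead of comparing None with an int.
    return 0
```

With this change, `get_total_gpu_memory() < 30` evaluates to `0 < 30` on CPU-only machines, so `offload_to_cpu` resolves to True rather than raising a TypeError.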

Angel996 · Jan 18 '25

Thanks for reporting this issue. We will fix this.

yoavhacohen · Jan 21 '25