taming-transformers
Not finding GPUs
I keep getting this error in WSL, even though I do have a GPU:
```
Working with z of shape (1, 256, 16, 16) = 65536 dimensions.
loaded pretrained LPIPS loss from taming/modules/autoencoder/lpips/vgg.pth
VQLPIPSWithDiscriminator running with hinge loss.
/usr/local/lib/python3.8/dist-packages/pytorch_lightning/utilities/distributed.py:45: UserWarning: ModelCheckpoint(save_last=True, monitor=None) is a redundant configuration. You can save the last checkpoint with ModelCheckpoint(save_top_k=None, monitor=None).
  warnings.warn(*args, **kwargs)
Traceback (most recent call last):
  File "main.py", line 527, in <module>
    trainer = Trainer.from_argparse_args(trainer_opt, **trainer_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pytorch_lightning/trainer/properties.py", line 124, in from_argparse_args
    return argparse_utils.from_argparse_args(cls, args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pytorch_lightning/utilities/argparse_utils.py", line 50, in from_argparse_args
    return cls(**trainer_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pytorch_lightning/trainer/connectors/env_vars_connector.py", line 41, in overwrite_by_env_vars
    return fn(self, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pytorch_lightning/trainer/trainer.py", line 333, in __init__
    self.accelerator_connector.on_trainer_init(
  File "/usr/local/lib/python3.8/dist-packages/pytorch_lightning/accelerators/accelerator_connector.py", line 111, in on_trainer_init
    self.trainer.data_parallel_device_ids = device_parser.parse_gpu_ids(self.trainer.gpus)
  File "/usr/local/lib/python3.8/dist-packages/pytorch_lightning/utilities/device_parser.py", line 76, in parse_gpu_ids
    gpus = _sanitize_gpu_ids(gpus)
  File "/usr/local/lib/python3.8/dist-packages/pytorch_lightning/utilities/device_parser.py", line 134, in _sanitize_gpu_ids
    raise MisconfigurationException(f"""
pytorch_lightning.utilities.exceptions.MisconfigurationException:
    You requested GPUs: [0]
    But your machine only has: []
```
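The error means PyTorch itself reports zero CUDA devices, so Lightning has nothing to hand out. Before touching the `--gpus` flag, it's worth confirming whether the WSL environment can see the GPU at all. A minimal diagnostic sketch (the `cuda_status` helper is just for illustration; it assumes a CUDA-enabled PyTorch build and the Windows NVIDIA driver with WSL support):

```python
import importlib.util

def cuda_status():
    """Return (torch_installed, cuda_available, device_count).

    If this reports 0 devices inside WSL, the MisconfigurationException
    above is expected: Lightning sees an empty GPU list.
    """
    if importlib.util.find_spec("torch") is None:
        return (False, False, 0)
    import torch
    return (True, torch.cuda.is_available(), torch.cuda.device_count())

print(cuda_status())
```

If this prints a device count of 0, the fix is at the driver/CUDA level (WSL GPU support), not in the training command.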
I ran into the same problem.
You can pass `--gpus 0` instead of `--gpus 0,`.
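The trailing comma matters here: in PyTorch Lightning's device parsing of that era, a bare integer names a *number* of GPUs (so `0` means "train on CPU"), while a comma-separated string names GPU *indices* (so `0,` means "use GPU index 0"). Dropping the comma avoids the error only because it stops requesting a GPU. A rough illustrative sketch of that parsing logic (`parse_gpus` is a hypothetical stand-in, not Lightning's actual code):

```python
def parse_gpus(value):
    """Sketch of how a --gpus string is interpreted (illustrative only).

    "0,"  -> [0]     use the GPU with index 0
    "0"   -> None    zero GPUs requested, i.e. CPU training
    "2"   -> [0, 1]  use the first two GPUs
    """
    if "," in value:
        # Comma-separated values are GPU indices.
        return [int(x) for x in value.split(",") if x.strip() != ""]
    n = int(value)
    # A bare integer is a GPU count; 0 means no GPUs at all.
    return None if n == 0 else list(range(n))

print(parse_gpus("0"))   # None  -> CPU
print(parse_gpus("0,"))  # [0]   -> GPU index 0
```

So if the goal is actually to train on the GPU, `--gpus 0,` is the correct spelling; it only fails here because WSL isn't exposing the device in the first place.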