LM-VTON
transforms.ToTensor() Problems
Hi @lgqfhwy, thanks for your impressive work. I want to re-train the model myself.
However, when I follow environments.yml
to set up the same environment as yours, I get this error:
img = torch.from_numpy(pic.transpose((2, 0, 1)))
ValueError: axes don't match array
This problem occurs in the xxx_dataset.py
file when applying a transform to densepose_shape_array to turn it into a tensor. The same error occurs for parse_array.
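For context, here is a minimal reproduction of the failure, assuming the error comes from ToTensor receiving a 2-D single-channel array (newer torchvision releases insert the missing channel axis automatically, so the exact behaviour depends on the installed version):

```python
import numpy as np

# Minimal reproduction (sketch): older torchvision ToTensor implementations
# assume an H x W x C array and call pic.transpose((2, 0, 1)), which needs
# three axes. A single-channel map loaded as a 2-D array therefore fails.
parse_array = np.zeros((256, 192), dtype=np.uint8)  # H x W, no channel axis

try:
    parse_array.transpose((2, 0, 1))
except ValueError as err:
    print(err)  # "axes don't match array"

# Adding an explicit channel axis lets the same transpose succeed:
chw = parse_array[:, :, np.newaxis].transpose((2, 0, 1))
print(chw.shape)  # (1, 256, 192)
```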
Following some suggestions, I changed the code in the xxx_dataset.py
file,
from:
densepose_shape_tensor = self.transform_one(densepose_shape_array)
to:
densepose_shape_tensor = torch.Tensor(densepose_shape_array/255.0)
densepose_shape_tensor = densepose_shape_tensor.unsqueeze(0)
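For reference, here is a sketch (in plain NumPy, assuming a uint8 input in [0, 255]) of what that replacement computes: it scales the values to [0, 1] and adds a leading channel axis. One caveat: if self.transform_one also applies a Normalize step (e.g. mapping to [-1, 1]), bypassing it changes the value range the network sees, which could plausibly contribute to the distorted warps.

```python
import numpy as np

def single_channel_to_chw(arr):
    # Stand-in for ToTensor on a single-channel array (sketch): scale
    # uint8 [0, 255] to float [0, 1] and add a leading channel axis,
    # matching the 1 x H x W shape ToTensor produces for H x W x 1 input.
    return (arr.astype(np.float32) / 255.0)[np.newaxis, :, :]

mask = np.full((4, 3), 255, dtype=np.uint8)
chw = single_channel_to_chw(mask)
print(chw.shape, chw.max())  # (1, 4, 3) 1.0
```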
With this change, training can start. However, as training progresses, another problem appears (the scale of the generated warped cloth looks wrong):
And when I finish training and test with the latest checkpoint, I see the same problem on all warped target cloths:
they all have mismatched scales and a strange orientation.
Hence, I come here for help. Could you kindly shed some light on how to tackle these issues?
Best regards and many thanks,
@Amazingren Sorry for the late reply. It seems that you have an environment problem. You don't need to change the code; you could first set up your environment to match cp-vton (https://github.com/sergeywong/cp-vton) or ACGPN (https://github.com/minar09/ACGPN). The dataset code is the same. Hope this helps.
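To illustrate the suggestion above, a minimal sketch of recreating a matching environment. The Python and package versions below are assumptions, not the repos' official requirements; check environments.yml in this repo and the cp-vton / ACGPN READMEs for the versions they actually pin.

```shell
# Sketch only: the version pins below are assumptions --
# consult environments.yml and the reference repos' READMEs.
conda create -n lm-vton python=3.6 -y
conda activate lm-vton
pip install torch==1.0.0 torchvision==0.2.1  # older to_tensor expects H x W x C
```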