
How do I use a trained model? I trained on my dataset, and after training there are four '.pth' files {100000_net_D.pth, 100000_net_G.pth, 100000_optim_D.pth, 100000_optim_G.pth}. How can I use these model files?

Open · doctorcui opened this issue 3 years ago • 19 comments

doctorcui · Apr 26 '22 05:04

I put the newly trained .pth files in ./checkpoints/simswap512. When I use python test_video_swapsingle.py --crop_size 224 --use_mask --name simswap512 --Arc_path arcface_model/arcface_checkpoint.tar --pic_a_path ./demo_file/Iron_man.jpg --video_path ./demo_file/multi_people_1080p.mp4 --output_path ./output/multi_test_swapsingle.mp4 --temp_path ./temp_results, it does not work.

doctorcui · Apr 26 '22 05:04

What size did you train, 512 or 224? If it was 224, try adding --which_epoch. In your case: python test_video_swapsingle.py --crop_size 224 --use_mask --which_epoch 100000 --name simswap512 --Arc_path arcface_model/arcface_checkpoint.tar --pic_a_path ./demo_file/Iron_man.jpg --video_path ./demo_file/multi_people_1080p.mp4 --output_path ./output/multi_test_swapsingle.mp4 --temp_path ./temp_results

netrunner-exe · Apr 26 '22 07:04
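For context, the four files listed above follow the naming pattern <iteration>_<kind>.pth, and --which_epoch just selects that prefix. A minimal stdlib sketch of the layout (checkpoint_paths is a hypothetical helper, not part of SimSwap):

```python
import os

def checkpoint_paths(checkpoints_dir, name, which_epoch):
    # Build the four paths training leaves behind for one iteration label,
    # e.g. 100000_net_G.pth inside ./checkpoints/simswap512.
    base = os.path.join(checkpoints_dir, name)
    return {kind: os.path.join(base, f"{which_epoch}_{kind}.pth")
            for kind in ("net_G", "net_D", "optim_G", "optim_D")}

paths = checkpoint_paths("checkpoints", "simswap512", 100000)
print(paths["net_G"])
```

Only net_G is needed for inference; the net_D and optim_* files are for resuming training.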

If you trained 512, try commenting out or removing lines 49 and 50 in test_video_swapsingle.py. Then everything is the same as in the example above, except --crop_size: change 224 to 512.

netrunner-exe · Apr 26 '22 07:04

What dataset and GPU did you use for training?

netrunner-exe · Apr 26 '22 07:04

Thank you for your answer. Another error appears: Pretrained network G has fewer layers; The following are not initialized: ['down0', 'first_layer', 'last_layer', 'up0']

doctorcui · Apr 26 '22 10:04
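That "fewer layers" warning is PyTorch's non-strict state_dict loading reporting parameters that exist in the network being built but not in the checkpoint. A toy reproduction (the layer names mirror the error message; this is not the real SimSwap generator):

```python
import torch.nn as nn

# A "shallow" net (what was trained) vs. a "deep" net (what the test script
# builds): the deep one has the extra down0/up0 stages from the error message.
shallow = nn.ModuleDict({"down1": nn.Conv2d(3, 8, 3),
                         "up1":   nn.Conv2d(8, 3, 3)})
deep = nn.ModuleDict({"down0": nn.Conv2d(3, 3, 3),
                      "down1": nn.Conv2d(3, 8, 3),
                      "up1":   nn.Conv2d(8, 3, 3),
                      "up0":   nn.Conv2d(3, 3, 3)})

# Non-strict loading succeeds but leaves down0/up0 randomly initialized,
# which is exactly why the swap result degrades instead of crashing.
result = deep.load_state_dict(shallow.state_dict(), strict=False)
uninitialized = sorted({k.split(".")[0] for k in result.missing_keys})
print(uninitialized)  # ['down0', 'up0']
```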

I downloaded 8000 pictures from the internet, divided them into 80 groups of 100 images each, and followed "Generate the HQ dataset by yourself. (If you want to do so)" to make my dataset.

doctorcui · Apr 26 '22 10:04

Try #246, and set --crop_size to exactly what you trained with.

netrunner-exe · Apr 26 '22 10:04

8000 is very little for training. The cropped and aligned VGGFace2 dataset contains around 600,000 images, and it is recommended to train for about 400k-600k iterations.

netrunner-exe · Apr 26 '22 11:04

Thanks for your advice. I used a 512*512 dataset to train, but when I use --crop_size 512 to test, an error occurs: Pretrained network G has fewer layers; The following are not initialized: ['down0', 'first_layer', 'last_layer', 'up0']

doctorcui · Apr 26 '22 11:04

It's strange that when I use --crop_size 224 that error does not appear, but the result is really bad.

doctorcui · Apr 26 '22 11:04

Did you train 512 or 224? Not the dataset, the --crop_size of the command you used for training.

netrunner-exe · Apr 26 '22 11:04

python train.py --name simswap512_test --batchSize 16 --gpu_ids 0 --dataset /path/to/VGGFace2HQ --Gdeep True → train.py does not have a --crop_size 512/224 parameter.

doctorcui · Apr 26 '22 13:04

Between 224 and 512, the ONLY difference is --Gdeep True or False.

doctorcui · Apr 26 '22 13:04

Honestly, I don't understand what you mean. @neuralchen wrote very clear instructions: if you train 224, use command 1 and a dataset cropped to 224x224; if 512, use command 2 and a dataset cropped to 512x512. The error you get occurs if you trained incorrectly or used the --crop_size option incorrectly; I mean it occurs if you trained 224 and put --crop_size 512 at inference, and vice versa. At least that's how it appeared to me.

netrunner-exe · Apr 26 '22 13:04
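In other words, --crop_size is not just a resize: it decides which generator definition the test script builds, so it has to match what was trained. A hedged sketch of that dispatch (pick_generator is illustrative; the module names are the ones discussed in this thread):

```python
def pick_generator(crop_size):
    # The test script builds a different G per size; loading a 224-trained
    # checkpoint into the 512 network leaves its extra layers uninitialized.
    if crop_size == 224:
        return "models.fs_networks_fix"   # shallower generator
    if crop_size == 512:
        return "models.fs_networks_512"   # deeper generator with down0/up0
    raise ValueError("crop_size must be 224 or 512")

print(pick_generator(224))  # models.fs_networks_fix
```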

Thank you again for your advice! Yes, I know what you mean, but I used a 512x512 dataset, followed command 2 from the instructions to train, and set --crop_size 512 when testing, and the error still happens.
Did you successfully complete the 512 training?

doctorcui · Apr 26 '22 15:04

It's strange that @neuralchen gives two train commands whose only difference is {Gdeep}. Training initializes the G-model with fs_networks_fix.py, but when I test, the G-model is initialized with fs_networks_512.py. That's wrong, because they have different nn.Sequential levels.
We should use fs_networks.py to initialize the trained model; all we need to change is setting {deep} to True or False.

doctorcui · Apr 27 '22 03:04

Perhaps this is necessary in order to train the 512 model correctly, so that the result will be better than the previously published 512 beta.

netrunner-exe · Apr 27 '22 05:04

Maybe. So if you follow command 2 to train a 512 G-model, running the test with --crop_size 512 breaks with a model error. So I tried --crop_size 224 and set fs_model.py line 59 to [deep=True], and finally it works. (PS: I'm very sure I used train command 2 {python train.py --name simswap512_test --batchSize 16 --gpu_ids 0 --dataset /path/to/VGGFace2HQ --Gdeep True}; I trained a 512 model but got 224.)

doctorcui · Apr 27 '22 12:04

The --Gdeep True or False option is designed to optionally add one downscaling layer and one upsampling layer. This design increases the receptive field of the backbone when processing large images, e.g., 512.

neuralchen · May 03 '22 14:05
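A minimal sketch of that idea (TinyGenerator is illustrative, not the real SimSwap network): deep=True prepends an extra stride-2 'down0' and appends a matching 'up0', so the backbone runs at half resolution and sees a larger effective receptive field on 512px input.

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    def __init__(self, deep=False):
        super().__init__()
        self.deep = deep
        if deep:
            # Extra downscale/upscale pair toggled by --Gdeep.
            self.down0 = nn.Conv2d(3, 3, 3, stride=2, padding=1)
            self.up0 = nn.Sequential(nn.Upsample(scale_factor=2),
                                     nn.Conv2d(3, 3, 3, padding=1))
        self.body = nn.Conv2d(3, 3, 3, padding=1)  # stand-in for the backbone

    def forward(self, x):
        if self.deep:
            x = self.down0(x)      # 512 -> 256 before the backbone
        x = self.body(x)
        if self.deep:
            x = self.up0(x)        # back to 512 after the backbone
        return x

out = TinyGenerator(deep=True)(torch.randn(1, 3, 512, 512))
print(out.shape)  # torch.Size([1, 3, 512, 512])
```

The mismatch reported earlier in this thread follows directly from this design: the down0/up0 parameters exist only in the deep variant, so a checkpoint from one variant cannot fully populate the other.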