
My own data test is very poor

Open pengweixiang opened this issue 6 years ago • 27 comments

I used my own data for testing, and the results show the strange situation below. May I ask what is going on? (attached result images: 1_frame30290face0_1_frame30268face0, 2_frame163116face0_2_frame166662face0)

pengweixiang avatar May 16 '19 10:05 pengweixiang

Hi @pengweixiang, did you load the pretrained weights provided by this project? If yes, how did you crop the face and extract AU vectors? As mentioned here, this project uses face_recognition to extract face bounding boxes and OpenFace to obtain AU vectors. Kindly follow the same settings if you want to test on your own dataset.

donydchen avatar May 16 '19 11:05 donydchen

Yes, except that I did not use face_recognition to extract the face bounding box. Could you share that part of the code for my reference? Thank you very much!

pengweixiang avatar May 16 '19 11:05 pengweixiang

@pengweixiang, the main function is as below. Note that you need to install the face_recognition package first.

import face_recognition
from PIL import Image

def crop_face(img_path, size=(128, 128)):
    # face_recognition returns bounding boxes as (top, right, bottom, left)
    face_im = face_recognition.load_image_file(img_path)
    bboxs = face_recognition.face_locations(face_im)

    im = None
    if len(bboxs) > 0:
        im = Image.fromarray(face_im)
        bbox = bboxs[0]
        # reorder to PIL's (left, top, right, bottom) and crop the first face
        im = im.crop((bbox[3], bbox[0], bbox[1], bbox[2]))
        # downsample in place, keeping the aspect ratio (never upsamples)
        im.thumbnail(size, Image.ANTIALIAS)

    return im
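
For reference, a minimal usage sketch (the file names here are hypothetical examples):

face = crop_face('my_face.jpg')
if face is not None:
    face.save('my_face_cropped.jpg')
else:
    print('No face detected.')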

donydchen avatar May 16 '19 11:05 donydchen

Thank you very much for your answer, I will try first.

pengweixiang avatar May 16 '19 11:05 pengweixiang

It did not work; the problem still exists. Very strange. (attached result image: N_0000000376_N_0000000283)

pengweixiang avatar May 16 '19 12:05 pengweixiang

Maybe you can try to train the model with your own dataset?

donydchen avatar May 16 '19 12:05 donydchen

I suspect it is an OpenFace issue. I used the data you provided to re-generate the expression AUs with version 2.0.5 and compared the two sets of values, and found a certain deviation. After retesting, the results were also worse. May I ask which version of OpenFace you are using?

pengweixiang avatar May 17 '19 06:05 pengweixiang

The version of OpenFace I used for this project is 2.0.4.

donydchen avatar May 17 '19 08:05 donydchen

I can't find this version online. Can you provide the related links? I also want to retrain the model here. Can I use the data you provide to train a model that has the same effect as the one you provided? Are there any requirements for the training parameters?

pengweixiang avatar May 17 '19 10:05 pengweixiang

The source code of OpenFace v2.0.4 can be downloaded from https://github.com/TadasBaltrusaitis/OpenFace/releases/tag/OpenFace_2.0.4, and refer to https://github.com/TadasBaltrusaitis/OpenFace/wiki/Unix-Installation for the installation guide. Good luck.

donydchen avatar May 17 '19 10:05 donydchen

The training results are not good... It seems that luck is very important.

pengweixiang avatar May 20 '19 03:05 pengweixiang

(attached result images: 191942_195985, 197408_200048) The results look like this, and I don't know why. Any advice?

pengweixiang avatar May 20 '19 04:05 pengweixiang

Sorry, but I don't have much insight into your code, settings, or experiment environment, so I'm afraid I can't provide any effective suggestions for your case. But if you use the code and dataset provided by this project, it should be able to yield results similar to those shown in the README. I trained and tested it on several different machines before, and they all worked out fine.

donydchen avatar May 20 '19 07:05 donydchen

I am testing in Google Colab and trying to get this set up there, so the environment will not matter. I am still having trouble with action units. If anyone is interested in setting this project up to train and test in Google Colab, feel free to contact me, and then we should all be getting the same results. @donydchen I can share my Google Colab notebook, or do you plan to set this up in Google Colab to make it easy to reproduce and use in any environment?

ak9250 avatar May 20 '19 21:05 ak9250

Hi @ak9250, many thanks for your suggestion. However, I'm busy doing research on other topics these days, so I'm afraid I don't have time to update the project for the time being. For extracting Action Units, you can check out https://github.com/donydchen/ran_replicate/blob/master/tools/extract_au.py for some reference. For using another dataset, you'll need to create a specific dataset class by inheriting base_dataset.py. Basically, you can just copy celeba.py and modify a few lines of code to adapt it to your own dataset, then call your dataset class in data_loader.py. A hypothetical sketch is given below. Hope it helps.
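
To make the idea concrete, here is a minimal, self-contained PyTorch-style sketch of such a dataset. It is not this repo's actual base_dataset.py API (copy the real method names from celeba.py), and the pickle is assumed to be a dict mapping image names to AU vectors, as in the provided CelebA annotations:

import os
import pickle

from PIL import Image
import torch.utils.data as data

class MyFaceDataset(data.Dataset):
    """Pairs each cropped face image with its OpenFace AU vector."""
    def __init__(self, imgs_dir, aus_pkl, names):
        self.imgs_dir = imgs_dir
        self.names = names  # list of image file names to use
        with open(aus_pkl, 'rb') as f:
            self.aus = pickle.load(f)  # assumed dict: image name -> AU array

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        img = Image.open(os.path.join(self.imgs_dir, name)).convert('RGB')
        return img, self.aus[name]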

donydchen avatar May 21 '19 02:05 donydchen

Found the problem and successfully solved it. Thank you very much!

pengweixiang avatar May 22 '19 02:05 pengweixiang

Hi @pengweixiang, I just want to know the details of how you fixed the problem. Mine just didn't work after fine-tuning on the CelebA dataset (the output does not change at all), and I don't know why. Thank you.

xrtbuaa avatar Jun 03 '19 12:06 xrtbuaa

Hi. I used CelebA data, the face_recognition package, and OpenFace to test the consistency of the AU values, and I found that the alignment method really affects those values. The call im.thumbnail(size, Image.ANTIALIAS) produces (in place; it does not actually return anything) an image with height and width smaller than 128, which happens to be a patch of the corresponding cropped face image that the author provides. Maybe there is some padding trick, or does the version of the face_recognition package matter?

plutoyuxie avatar Jul 23 '19 07:07 plutoyuxie

Hi @plutoyuxie, im.thumbnail is self-explanatory; it downsamples a given image. That is, if an input image is larger than 128x128, it will be downsampled to fit within 128x128, while if it is smaller than 128x128, its original size will be retained.

If you'd like to make sure the size of an image is resized to 128x128, kindly check im.resize.

Note that before being fed to the training network, an image will always be resized to a specific shape, e.g. 128x128, so the image size after pre-processing may not really matter.
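
The difference is easy to verify with standard Pillow calls:

from PIL import Image

im = Image.new('RGB', (178, 218))   # a CelebA-sized dummy image
th = im.copy()
th.thumbnail((128, 128), Image.ANTIALIAS)  # Image.LANCZOS in newer Pillow
print(th.size)   # about (105, 128): fits within 128x128, aspect ratio kept
rs = im.resize((128, 128))
print(rs.size)   # (128, 128): forced to the exact target size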

donydchen avatar Jul 23 '19 08:07 donydchen

Hi @donydchen, when I use size=(128, 128) as the parameter, im.thumbnail makes the image size equal to something like (107, 108), so I'm confused. I want to test my own data, but the face alignment is different (after using im.thumbnail and im.resize, my handmade CelebA images differ from yours), so the results turn out not that good.

I fixed my preprocessing problem by simply deleting the line im = im.crop((bbox[3], bbox[0], bbox[1], bbox[2])). Then I get the same cropped face images as the author provides; a sketch follows below.
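
In function form, the fix described above amounts to something like this (detection is kept only as a gate for keeping the image; the crop itself is skipped):

import face_recognition
from PIL import Image

def prepare_image(img_path, size=(128, 128)):
    face_im = face_recognition.load_image_file(img_path)
    # detection only decides whether to keep the image; no cropping
    if not face_recognition.face_locations(face_im):
        return None
    im = Image.fromarray(face_im)
    im.thumbnail(size, Image.ANTIALIAS)
    return im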

plutoyuxie avatar Jul 23 '19 08:07 plutoyuxie

me too

guozhongluo avatar Sep 02 '19 11:09 guozhongluo

Hi @pengweixiang, I just want to know the details of how you fixed the problem. Mine just didn't work after fine-tuning on the CelebA dataset (the output does not change at all), and I don't know why. Thank you.

18810251126 avatar Jan 03 '20 05:01 18810251126

Check whether the default setting of your dataset is 'none'. If yes, set it to resize.

pongkun avatar Apr 05 '20 14:04 pongkun

The expression AU parameters generated on each system are different, so you need to retrain and adjust; you cannot use the demo directly.

(Replying by email to AndyWang's comment: "@pengweixiang @donydchen same problem, how is it solved? Thank you")

pengweixiang avatar Apr 10 '20 00:04 pengweixiang

Can you provide the OpenFace code to extract Action Units?

sssssshf avatar Apr 01 '22 09:04 sssssshf

How are these AU values obtained? Could you provide the OpenFace code? Thanks.

sssssshf avatar Apr 01 '22 09:04 sssssshf

I met the same problem in testing: I succeeded with the CelebA data from Google Drive but failed on my own dataset. I found a way to figure it out. The reason was that I was using wrong parameters extracted by OpenFace. Here is my procedure:

  1. Download and install OpenFace from https://github.com/TadasBaltrusaitis/OpenFace/releases/tag/OpenFace_2.0.4. Crop the images to 128x128.
  2. Extract the AUs with the command: ./build/bin/FaceLandmarkImg -fdir ../val_set/img_128/ -out_dir ../val_set/aus/ -aus
  3. Use the code at https://github.com/albertpumarola/GANimation/blob/master/data/prepare_au_annotations.py to extract columns [2:19], as the README says (see the sketch after this list). If you don't add '-aus' in step 2, you will get wrong AU parameters here.
  4. Prepare the dataset the same way as the CelebA dataset in Google Drive, then test. (attached result image)
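
For orientation, a sketch of what step 3 boils down to. It assumes OpenFace's CSV layout (17 intensity columns ending in _r, i.e. AU01_r ... AU45_r) and a dict-of-vectors pickle like the project's aus_openface.pkl; verify both against prepare_au_annotations.py before relying on it:

import csv
import glob
import os
import pickle

import numpy as np

aus = {}
for path in glob.glob('../val_set/aus/*.csv'):  # OpenFace output from step 2
    with open(path) as f:
        # OpenFace headers may carry leading spaces, so strip the keys
        row = {k.strip(): v for k, v in next(csv.DictReader(f)).items()}
    cols = sorted(k for k in row if k.endswith('_r'))  # the 17 AU intensities
    name = os.path.splitext(os.path.basename(path))[0]
    aus[name] = np.array([float(row[k]) for k in cols])

with open('aus_openface.pkl', 'wb') as f:
    pickle.dump(aus, f)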

The work is interesting and the pre-trained weights are helpful. Thanks!

yuangan avatar Oct 14 '22 08:10 yuangan