
Got extremely large face when performing data augmentation

Open gerwang opened this issue 5 years ago • 7 comments

I got an extremely large face when I implemented the data augmentation method described in the paper. Here is my code:

import config
import numpy as np
from lib_dr import get_dr, get_mesh
from tqdm import tqdm
import open3d as o3d
import openmesh as om


def generate_coefficients(m):
    r = np.random.uniform(0.5, 1.2)
    thetas = np.random.uniform(0, np.pi / 2, m - 1)
    c = np.repeat(r, m)
    for i in range(len(thetas)):
        c[m - i - 1] *= np.sin(thetas[i])
        c[:m - i - 1] *= np.cos(thetas[i])
    return c


n_aug = 500
m = 5

data_path = 'data/CoMA/data/FW_140/train.npy'
result_path = 'data/CoMA/data/FW_140/train_dr_{}.npy'.format(n_aug)

train_np = np.load(data_path)

everyone_template = 'data/FWH/Tester_{}/Blendshape/shape_0.obj'
mean_face_path = 'DR-Learning-for-3D-Face/data/disentangle/Mean_Face.obj'

features = []

for i in tqdm(range(1, 141)):
    feat = get_dr.get_dr(mean_face_path, everyone_template.format(i))
    features.append(feat)

features = np.array(features)

template_mesh = om.read_trimesh(mean_face_path)
mesh = o3d.geometry.TriangleMesh()
mesh.vertices = o3d.utility.Vector3dVector(template_mesh.points())
mesh.triangles = o3d.utility.Vector3iVector(template_mesh.face_vertex_indices())
mesh.compute_vertex_normals()

aug_res = []
for i in tqdm(range(n_aug)):
    c = generate_coefficients(m)
    ids = np.random.choice(features.shape[0], m, replace=False)
    samples = features[ids]
    tmp = get_mesh.get_mesh(mean_face_path, np.tensordot(samples, c, axes=[0, 0]))
    tmp = tmp.reshape(-1, 3)

    # update the template mesh with the generated vertices before visualizing
    mesh.vertices = o3d.utility.Vector3dVector(tmp)
    mesh.compute_vertex_normals()
    o3d.visualization.draw_geometries([mesh])
    aug_res.append(tmp)

aug_res = np.array(aug_res)
aug_res = np.concatenate([train_np, aug_res], axis=0)
print(aug_res.shape)
np.save(result_path, aug_res)

Is it a normal phenomenon?
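(Side note for readers: the `generate_coefficients` sampler above is a spherical-coordinate parameterization, so the squared coefficients always sum to r² and each sample lies on a sphere of radius r in coefficient space. A minimal self-contained check of that property, assuming only NumPy:)

```python
import numpy as np


def generate_coefficients(m):
    # same sampler as above: spherical parameterization of a point
    # on a sphere of radius r, restricted to the positive orthant
    r = np.random.uniform(0.5, 1.2)
    thetas = np.random.uniform(0, np.pi / 2, m - 1)
    c = np.repeat(r, m)
    for i in range(len(thetas)):
        c[m - i - 1] *= np.sin(thetas[i])
        c[:m - i - 1] *= np.cos(thetas[i])
    return c


c = generate_coefficients(5)
# sin^2 + cos^2 telescopes, so sum(c_i^2) == r^2, somewhere in [0.5^2, 1.2^2]
print(np.sum(c ** 2))
```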

gerwang avatar Sep 19 '19 13:09 gerwang

[image]

gerwang avatar Sep 19 '19 14:09 gerwang

I don't think meshes interpolated by ACAP should be misaligned. Is there anything wrong with my code?

gerwang avatar Sep 19 '19 14:09 gerwang

[image] This is the "large" case.

gerwang avatar Sep 19 '19 14:09 gerwang

Even when I applied one person's DR feature to the mean face and compared the result to that person's own neutral face, it was misaligned. [image]

gerwang avatar Sep 21 '19 04:09 gerwang

Hi, thanks for your interest in our work. I added a few things to your code to match our implementation in practice; you can give it a try. For the spatial misalignment, see Sec. 3.3.1 of Alive Caricature from 2D to 3D for details. It is common in our experiments, and you can use a rotation and translation to align the meshes.

import numpy as np
from lib_dr import get_dr, get_mesh
from tqdm import tqdm
import open3d as o3d
import openmesh as om


def generate_coefficients(m):
    r = np.random.uniform(0.5, 1.2)
    thetas = np.random.uniform(0, np.pi / 2, m - 1)
    c = np.repeat(r, m)
    for i in range(len(thetas)):
        c[m - i - 1] *= np.sin(thetas[i])
        c[:m - i - 1] *= np.cos(thetas[i])
    return c


n_aug = 500
m = 5

data_path = 'data/CoMA/data/FW_140/train.npy'
result_path = 'data/CoMA/data/FW_140/train_dr_{}.npy'.format(n_aug)

train_np = np.load(data_path)

everyone_template = 'data/FWH/Tester_{}/Blendshape/shape_0.obj'
mean_face_path = 'DR-Learning-for-3D-Face/data/disentangle/Mean_Face.obj'

features = []
# DR feature of the mean face w.r.t. itself (the "identity transformation")
cross_id = get_dr.get_dr(mean_face_path, mean_face_path)
for i in tqdm(range(1, 141)):
    # subtract the identity feature so interpolation is done around the mean face
    feat = get_dr.get_dr(mean_face_path, everyone_template.format(i)) - cross_id
    features.append(feat)

features = np.array(features)

template_mesh = om.read_trimesh(mean_face_path)
mesh = o3d.geometry.TriangleMesh()
mesh.vertices = o3d.utility.Vector3dVector(template_mesh.points())
mesh.triangles = o3d.utility.Vector3iVector(template_mesh.face_vertex_indices())
mesh.compute_vertex_normals()

aug_res = []
for i in tqdm(range(n_aug)):
    c = generate_coefficients(m)
    ids = np.random.choice(features.shape[0], m, replace=False)
    samples = features[ids]
    # add the identity feature back before reconstructing the mesh
    tmp = get_mesh.get_mesh(mean_face_path, np.tensordot(samples, c, axes=[0, 0]) + cross_id)
    tmp = tmp.reshape(-1, 3)
    # update the template mesh with the generated vertices before visualizing
    mesh.vertices = o3d.utility.Vector3dVector(tmp)
    mesh.compute_vertex_normals()
    o3d.visualization.draw_geometries([mesh])
    aug_res.append(tmp)

aug_res = np.array(aug_res)
aug_res = np.concatenate([train_np, aug_res], axis=0)
print(aug_res.shape)
np.save(result_path, aug_res)
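(The rotation-and-translation alignment mentioned above could be done with the Kabsch algorithm — that choice is an assumption on this editor's part, not the authors' stated method; any rigid-alignment routine works. A minimal sketch, given two (N, 3) vertex arrays in correspondence:)

```python
import numpy as np


def rigid_align(src, dst):
    """Find the rotation R and translation t minimizing ||R @ src_i + t - dst_i||
    via the Kabsch algorithm, then return the aligned copy of src."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t
```

For example, each augmented `tmp` could be passed through `rigid_align(tmp, template_mesh.points())` before being appended to `aug_res`, so all samples share the template's pose.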

zihangJiang avatar Sep 21 '19 11:09 zihangJiang

Thanks. I found out that I need to subtract the "identity transformation" before doing linear interpolation.

Section 3.3.1 mainly describes how get_mesh works. Does it mean that we can fix the position of one point to obtain a unique solution? If so, the mesh output by get_mesh would only have an extra translation, not a rotation.
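(If only a translation remains, realignment reduces to subtracting the residual offset at the fixed point. A tiny sketch, assuming vertex 0 is the anchored point and both arrays are in correspondence — `remove_translation` is a hypothetical helper, not part of the repo:)

```python
import numpy as np


def remove_translation(generated, template, anchor=0):
    # shift the generated mesh so its anchor vertex coincides with the template's
    return generated - (generated[anchor] - template[anchor])
```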

gerwang avatar Sep 22 '19 04:09 gerwang

I also found an interesting point: if you use get_mesh on the same "base mesh", the results seem to be properly aligned with each other.

gerwang avatar Sep 22 '19 05:09 gerwang