Got extremely large face when performing data augmentation #12

Open

gerwang opened this issue Sep 19, 2019 · 7 comments

gerwang commented Sep 19, 2019

I got extremely large faces when I implemented the data augmentation method described in the paper. Here is my code:

import config
import numpy as np
from lib_dr import get_dr, get_mesh
from tqdm import tqdm
import open3d as o3d
import openmesh as om


def generate_coefficients(m):
    # Sample m interpolation coefficients via hyperspherical angles:
    # the squared coefficients sum to r ** 2, with r drawn from [0.5, 1.2].
    r = np.random.uniform(0.5, 1.2)
    thetas = np.random.uniform(0, np.pi / 2, m - 1)
    c = np.repeat(r, m)
    for i in range(len(thetas)):
        c[m - i - 1] *= np.sin(thetas[i])
        c[:m - i - 1] *= np.cos(thetas[i])
    return c


n_aug = 500
m = 5

data_path = 'data/CoMA/data/FW_140/train.npy'
result_path = 'data/CoMA/data/FW_140/train_dr_{}.npy'.format(n_aug)

train_np = np.load(data_path)

everyone_template = 'data/FWH/Tester_{}/Blendshape/shape_0.obj'
mean_face_path = 'DR-Learning-for-3D-Face/data/disentangle/Mean_Face.obj'

features = []

for i in tqdm(range(1, 141)):
    # DR (deformation representation) feature of each of the 140 FaceWarehouse
    # identities, computed relative to the mean face
    feat = get_dr.get_dr(mean_face_path, everyone_template.format(i))
    features.append(feat)

features = np.array(features)

template_mesh = om.read_trimesh(mean_face_path)
mesh = o3d.geometry.TriangleMesh()
mesh.vertices = o3d.utility.Vector3dVector(template_mesh.points())
mesh.triangles = o3d.utility.Vector3iVector(template_mesh.face_vertex_indices())
mesh.compute_vertex_normals()

aug_res = []
for i in tqdm(range(n_aug)):
    # randomly pick m identities and blend their DR features with the sampled coefficients
    c = generate_coefficients(m)
    ids = np.random.choice(features.shape[0], m, replace=False)
    samples = features[ids]
    tmp = get_mesh.get_mesh(mean_face_path, np.tensordot(samples, c, axes=[0, 0]))
    tmp = tmp.reshape(-1, 3)

    o3d.visualization.draw_geometries([mesh])  # note: this displays the template mesh, not tmp
    aug_res.append(tmp)

aug_res = np.array(aug_res)
aug_res = np.concatenate([train_np, aug_res], axis=0)
print(aug_res.shape)
np.save(result_path, aug_res)
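
For reference, a quick sanity check of what the sampling above produces (a small sketch of my own, not part of the pipeline): the coefficients are built from hyperspherical angles, so their squared sum equals r ** 2 and every augmented sample lies on a sphere of radius r (with r in [0.5, 1.2]) in DR feature space.

c = generate_coefficients(5)
print(c, np.sum(c ** 2))  # the squared sum should fall in [0.25, 1.44]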

Is this a normal phenomenon?

gerwang (Author) commented Sep 19, 2019

[screenshot]

gerwang (Author) commented Sep 19, 2019

I don't think meshes interpolated in ACAP feature space should be misaligned. Is there anything wrong with my code?

gerwang (Author) commented Sep 19, 2019

[screenshot]
This is the "large" case.

gerwang (Author) commented Sep 21, 2019

Even when I apply a single person's DR feature to the mean face and compare the result with that person's own neutral face, the two are misaligned.
[screenshot]

zihangJiang (Owner) commented

Hi, thanks for your interest in our work.
I added a few lines to your code to match our implementation in practice; you can give it a try.
For the spatial misalignment, please refer to Sec. 3.3.1 of Alive Caricature from 2D to 3D for details. It is common in our experiments, and you can use a rotation and a translation to align the meshes.

import numpy as np
from lib_dr import get_dr, get_mesh
from tqdm import tqdm
import open3d as o3d
import openmesh as om


def generate_coefficients(m):
    r = np.random.uniform(0.5, 1.2)
    thetas = np.random.uniform(0, np.pi / 2, m - 1)
    c = np.repeat(r, m)
    for i in range(len(thetas)):
        c[m - i - 1] *= np.sin(thetas[i])
        c[:m - i - 1] *= np.cos(thetas[i])
    return c


n_aug = 500
m = 5

data_path = 'data/CoMA/data/FW_140/train.npy'
result_path = 'data/CoMA/data/FW_140/train_dr_{}.npy'.format(n_aug)

train_np = np.load(data_path)

everyone_template = 'data/FWH/Tester_{}/Blendshape/shape_0.obj'
mean_face_path = 'DR-Learning-for-3D-Face/data/disentangle/Mean_Face.obj'

features = []
# DR feature of the "identity" deformation (the mean face mapped to itself)
cross_id = get_dr.get_dr(mean_face_path, mean_face_path)
for i in tqdm(range(1, 141)):
    # subtract the identity feature so the interpolation is done around zero
    feat = get_dr.get_dr(mean_face_path, everyone_template.format(i)) - cross_id
    features.append(feat)

features = np.array(features)

template_mesh = om.read_trimesh(mean_face_path)
mesh = o3d.geometry.TriangleMesh()
mesh.vertices = o3d.utility.Vector3dVector(template_mesh.points())
mesh.triangles = o3d.utility.Vector3iVector(template_mesh.face_vertex_indices())
mesh.compute_vertex_normals()

aug_res = []
for i in tqdm(range(n_aug)):
    c = generate_coefficients(m)
    ids = np.random.choice(features.shape[0], m, replace=False)
    samples = features[ids]
    # add the identity feature back before reconstructing the mesh
    tmp = get_mesh.get_mesh(mean_face_path, np.tensordot(samples, c, axes=[0, 0]) + cross_id)
    tmp = tmp.reshape(-1, 3)
    o3d.visualization.draw_geometries([mesh])
    aug_res.append(tmp)

aug_res = np.array(aug_res)
aug_res = np.concatenate([train_np, aug_res], axis=0)
print(aug_res.shape)
np.save(result_path, aug_res)
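
To remove the residual rotation and translation mentioned above, one option is a standard Kabsch/Procrustes rigid alignment of each reconstructed face onto the template. A minimal sketch (rigid_align is a hypothetical helper, not part of this repo, and it assumes a one-to-one vertex correspondence):

import numpy as np

def rigid_align(src, dst):
    # Rigidly align src (N, 3) onto dst (N, 3) with a rotation and a translation.
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    p, q = src - src_mean, dst - dst_mean
    u, _, vt = np.linalg.svd(p.T @ q)           # SVD of the 3x3 cross-covariance
    d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T     # optimal rotation (Kabsch)
    return p @ r.T + dst_mean                   # rotate, then translate onto dst

# e.g. align each augmented face before saving:
# tmp = rigid_align(tmp, np.asarray(template_mesh.points()))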

gerwang (Author) commented Sep 22, 2019

Thanks. I found out that I need to subtract the "identity transformation" before doing the linear interpolation.

Section 3.3.1 mainly describes how get_mesh works. Does it mean that we can fix the position of one point and obtain a unique solution? If so, the mesh output by get_mesh should only carry an extra translation, not a rotation.
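
If that is the case, a translation-only correction would already be enough. A minimal sketch of that idea (vertex index 0 is an arbitrary anchor chosen for illustration; mean_face_path and tmp are the variables from the script above):

import numpy as np
import openmesh as om

template_pts = om.read_trimesh(mean_face_path).points()  # (N, 3) reference vertices
offset = template_pts[0] - tmp[0]                         # match one anchor vertex
tmp_aligned = tmp + offset                                # pure translation, no rotation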

gerwang (Author) commented Sep 22, 2019

I also found an interesting point: if you use get_mesh with the same "base mesh", the results seem to be properly aligned with each other.
