Example of Use in a TensorFlow NN

Open staffhorn opened this issue 6 years ago • 6 comments

Do you have an example of dirt's use as (say) the output layer of a neural net? Thanks

staffhorn avatar Nov 19 '18 08:11 staffhorn

I don't have any self-contained examples of this currently, but I agree it'd be good to add something. What task are you trying to solve? Different things could be predicted by the neural network -- vertices? poses? camera/lighting parameters? In each case, you need to parameterise suitably (e.g. if predicting vertex locations, fix the mesh topology, and restrict the motion of the vertices, produced by the output layer of the neural network, to keep the shape well-behaved). Dirt's rasterise op is then just another 'layer' that appears in your model. If you give some more info on what you're interested in doing, I can give some more specific guidance!
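To make the parameterisation point concrete: one common way to keep predicted vertices well-behaved is to treat the network's raw outputs as bounded offsets from a fixed reference mesh, so no vertex can stray far from its rest position. A minimal framework-free sketch in plain Python (the function and names here are illustrative assumptions, not part of dirt's API):

```python
import math

def parameterise_vertices(reference, raw_outputs, max_offset=0.1):
    """Bound the network's raw per-vertex outputs with tanh, so each
    vertex stays within max_offset of the fixed reference mesh."""
    return [
        [r + max_offset * math.tanh(o) for r, o in zip(ref_v, raw_v)]
        for ref_v, raw_v in zip(reference, raw_outputs)
    ]

# Even an extreme raw output cannot move a vertex further than max_offset:
reference = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
raw = [[1000.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
vertices = parameterise_vertices(reference, raw)
```

In a real model the same bounding would be applied with tensor ops (e.g. `tf.tanh`) on the output layer, but the idea is identical.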

pmh47 avatar Nov 19 '18 23:11 pmh47

Thanks for your reply. At present, I'm interested in predicting vertices, i.e. morphing a reference shape into a target shape. In the near future, I will want to hold shape fixed and predict poses. The camera/lighting are controlled by me at present.

staffhorn avatar Nov 20 '18 17:11 staffhorn

@pmh47 do you have any working layer? I am presently working on a custom keras layer and any example would be very helpful.

staffhorn avatar Dec 12 '18 15:12 staffhorn

Unfortunately I don't have the time to prepare an 'end to end' worked example of training this kind of thing just now. However, wrapping into a keras layer is very straightforward -- rasterise and rasterise_batch are just like any other multi-input tensorflow op, e.g. matmul. So your custom layer could just wrap rasterise_batch, taking vertices, faces, etc., as input, and having no trainable parameters. Even simpler to use Lambda; here is a sketch (untested):

rasterise_layer = Lambda(
  lambda args: dirt.rasterise_batch(*args)  # args = [background, vertices, vertex_colors, faces]
)

Then, for example, you could have a fully-connected layer producing the vertex locations and pass these to that custom rasterise layer (again, an untested sketch, and a trivial orthographic projection):

vertex_count = 4
image_size = 32
fc = Dense(vertex_count * 3, activation='sigmoid')(your_cnn_features)
vertices = Reshape((vertex_count, 3))(fc)
vertices = Concatenate(axis=2)([vertices, tf.ones([batch_size, vertex_count, 1])])  # convert to homogeneous coordinates by setting w = 1
vertices = vertices - [0.5, 0.5, 0., 0.]  # centre x and y in clip space
faces = [[0, 1, 2], [2, 3, 0]]  # a single quad, as two triangles
background = tf.zeros([batch_size, image_size, image_size, 3])  # black background
vertex_colours = tf.ones([batch_size, vertex_count, 3])  # white foreground
images = rasterise_layer([background, vertices, vertex_colours, faces])
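As an aside on the coordinate handling in that sketch: the sigmoid activation keeps each raw coordinate in (0, 1), so subtracting 0.5 centres x and y in the clip-space cube, and appending w = 1 makes each vertex homogeneous. A tiny framework-free illustration of that per-vertex transform (plain Python, not dirt code; `to_clip_space` is a hypothetical helper):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def to_clip_space(raw):
    """Map one raw network output to a centred homogeneous clip-space vertex."""
    x, y, z = (sigmoid(v) for v in raw)
    # sigmoid gives (0, 1); shift x and y by 0.5 to centre in clip space,
    # then append w = 1 for homogeneous coordinates
    return [x - 0.5, y - 0.5, z, 1.0]

v = to_clip_space([-4.0, 0.0, 4.0])
```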

pmh47 avatar Dec 12 '18 17:12 pmh47

Thanks for taking the time, Paul. No need for an end-to-end example; this sketch is a great start. I was making things much more complex than necessary.

staffhorn avatar Dec 12 '18 23:12 staffhorn

I have some working examples; I will put some simple ones up in the next few days.

staffhorn avatar Dec 26 '18 16:12 staffhorn