
Jacobians and expectations

zaqqwerty opened this issue Aug 18 '21 · 6 comments

In PR #70, I added a test that calls the jacobian method of a tf.GradientTape on the result of a TFQ calculation. The test passes if I do not use the @tf.function decorator. However, when I add the decorator, I get the following error at the end of a long traceback:

LookupError: No gradient defined for operation 'TfqAdjointGradient' (op type: TfqAdjointGradient)

This is strange because the test function contains only one gradient tape, so a gradient of 'TfqAdjointGradient' (which would be a second-order derivative) should never be attempted.

I have been trying to reproduce the error with a simplified scenario in this Colab notebook, but so far I have not been able to replicate it.
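For context, here is a minimal sketch of the kind of computation involved. It is not the test from PR #70; the circuit, operator, and symbol below are placeholder assumptions, and the Adjoint differentiator is passed explicitly so that the 'TfqAdjointGradient' op from the error appears in the backward pass.

```python
import cirq
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)
theta = sympy.Symbol("theta")
circuit = tfq.convert_to_tensor([cirq.Circuit(cirq.rx(theta)(qubit))])
op = tfq.convert_to_tensor([[cirq.Z(qubit)]])
values = tf.Variable([[0.5]])

# Expectation layer using the adjoint differentiator, whose backward pass
# is implemented by the 'TfqAdjointGradient' op named in the error.
expectation = tfq.layers.Expectation(
    differentiator=tfq.differentiators.Adjoint())

@tf.function  # removing this decorator is what makes the test pass
def jacobian_of_expectation():
    with tf.GradientTape() as tape:
        out = expectation(
            circuit, symbol_names=[theta], symbol_values=values, operators=op)
    # Only one tape is used, yet with the decorator the jacobian call is
    # where the reported LookupError for 'TfqAdjointGradient' would surface.
    return tape.jacobian(out, values)
```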

zaqqwerty (Aug 18 '21)

@jaeyoo, I was wondering if you might have ideas about what could cause this from the TF or TFQ side of things.

zaqqwerty (Aug 18 '21)

Yes, I've already been dealing with this issue. Please let me add a fix for it...

jaeyoo (Aug 18 '21)

@sahilpatelsp, can you provide a detailed explanation here of how you understand the issue and how you addressed it with your trivial None check? It would be good for the docs.

farice (Aug 29 '21)

Well, I am not entirely sure the issue is resolved in the case of RNN hypernetworks for QHBMs, since I am currently running into a gradient issue there, but it does seem to be working for the standard QMHL and VQT training routines. I believe the issue was that qhbm.trainable_variables are only recognized as variables by tf.custom_gradient when the function is decorated with tf.function. Therefore, if we do not decorate with tf.function, the variables argument of the inner grad function is None, and we should return only a single set of gradients for the inputs, which are qhbm.trainable_variables. Otherwise, we should also return a second set of gradients for the variables, which are again qhbm.trainable_variables.
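To make the None check concrete, here is a hedged, self-contained sketch of the tf.custom_gradient pattern being described. It is not the qhbm-library code; the function and variable names are made up, but the branch on `variables is None` mirrors the fix described above.

```python
import tensorflow as tf

w = tf.Variable(2.0)

@tf.custom_gradient
def scaled_square(x):
    # Forward pass captures the variable w: y = w * x^2.
    y = w * x * x

    # tf.custom_gradient passes `variables` to grad only when it recognizes
    # the captured variables; per the discussion above, that happened
    # reliably only under tf.function, hence the None check.
    def grad(upstream, variables=None):
        dx = upstream * 2.0 * w * x
        if variables is None:
            # No variables recognized: return only the input gradients.
            return dx
        # Variables recognized: also return gradients w.r.t. them (dy/dw = x^2).
        return dx, [upstream * x * x]

    return y, grad

@tf.function
def loss_and_grads(x):
    with tf.GradientTape() as tape:
        y = scaled_square(x)
    return y, tape.gradient(y, [w])

print(loss_and_grads(tf.constant(3.0)))
```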

sahilpatelsp (Aug 29 '21)

Maybe another lead from this link: "It should be noted tf.GradientTape is still watching the forward pass of a tf.custom_gradient, and will use the ops it watches. As a consequence, calling tf.function while the tape is still watching leads to a gradient graph being built. If an op is used in tf.function without registered gradient, a LookupError will be raised."
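As a rough illustration of the pattern that quote describes (reusing the circuit, operator, layer, and values from the sketch in the opening comment; whether this matches the failing test exactly is an assumption):

```python
@tf.function
def traced_expectation(values):
    return expectation(
        circuit, symbol_names=[theta], symbol_values=values, operators=op)

with tf.GradientTape() as tape:
    # Per the quote, tracing a tf.function while the tape is watching builds
    # a gradient graph; if that graph needs an op with no registered gradient
    # (such as 'TfqAdjointGradient'), a LookupError would be raised.
    out = traced_expectation(values)

jac = tape.jacobian(out, values)
```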

zaqqwerty (Jan 08 '22)

See https://github.com/tensorflow/quantum/issues/667

zaqqwerty (Feb 09 '22)