
Make GlobalPhase not differentiable

Tarun-Kumar07 opened this pull request 9 months ago • 5 comments

Context: When using the following state preparation methods (AmplitudeEmbedding, StatePrep, MottonenStatePreparation) with jit and grad, the error ValueError: need at least one array to stack was encountered.
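
For reference, a minimal sketch of the kind of code that triggered this (device, template, and state here are illustrative; see #5541 for the original report):

import jax
import jax.numpy as jnp
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(state):
    # AmplitudeEmbedding, StatePrep, and MottonenStatePreparation all
    # apply GlobalPhase under the hood and hit the same error.
    qml.MottonenStatePreparation(state, wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

state = jnp.array([0.5, 0.5, 0.5, 0.5])
jax.jit(jax.grad(circuit))(state)  # raised the ValueError before this PR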

Description of the Change: All state preparation strategies used GlobalPhase under the hood, which caused the above error. After this PR, GlobalPhase is no longer differentiable, as its grad_method is set to None.
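
For context, grad_method is the per-operation attribute that PennyLane's gradient transforms consult: "A" means analytic (parameter-shift), "F" means finite differences, and None means the parameter may not be differentiated. A minimal sketch of the idea on a hypothetical operation (this is not PennyLane's actual GlobalPhase implementation):

import numpy as np
import pennylane as qml

class MyGlobalPhase(qml.operation.Operation):
    num_params = 1
    num_wires = 1
    grad_method = None  # gradient transforms will skip this parameter

    @staticmethod
    def compute_matrix(phi):
        # a pure phase: e^{-i * phi} times the identity on one wire
        return np.exp(-1j * phi) * np.eye(2)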

Related GitHub Issues: Fixes #5541.

Tarun-Kumar07 avatar May 02 '24 01:05 Tarun-Kumar07

Thanks for this @Tarun-Kumar07

For the failures due to the error:

ValueError: Computing the gradient of circuits that return the state with the parameter-shift rule gradient transform is not supported, as it is a hardware-compatible method.

That is expected, and we should switch the measurement to expectation values.
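
Concretely, the change to those tests would look something like this (circuit and observable here are placeholders):

import pennylane as qml

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, diff_method="parameter-shift")
def circuit(theta):
    qml.RX(theta, wires=0)
    # return qml.state()  # not supported with parameter-shift
    return qml.expval(qml.PauliZ(0))  # hardware-compatible alternative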

For the failures due to:

FAILED tests/templates/test_state_preparations/test_mottonen_state_prep.py::test_jacobian_with_and_without_jit_has_same_output_with_high_shots[StatePrep] - AssertionError: assert Array(False, dtype=bool)
 +  where Array(False, dtype=bool) = <function allclose at 0x7fdb77481940>(Array([-0.0003,  0.0153, -0.0153,  0.0003], dtype=float64), Array([ 1.0187, -0.9953, -1.0047,  0.9813], dtype=float64), atol=0.02)

Those are legitimately different results, so we can safely say we are getting wrong results in that case 😢 I'll investigate.

albi3ro avatar May 02 '24 21:05 albi3ro

Those are legitimately different results, so we can safely say we are getting wrong results in that case 😢 I'll investigate.

@Tarun-Kumar07 @albi3ro Not sure whether you got to this yet, but it seems that the decomposition of those state preparation methods handles special parameter values differently from generic ones. This makes the derivative wrong at those special values, because param_shift is handed a tape that does not contain the general decomposition, so it will not shift all operations that need shifting. As JITting does not allow such special-casing, we only make this mistake without JITting, hence the difference in the results within those tests.

Basically, the decomposition does something like the following for RZ:

import pennylane as qml

def compute_decomposition(theta, wires):
    # Skip the gate for a concrete zero angle; under JIT theta is
    # abstract, so this special case is never taken.
    if not qml.math.is_abstract(theta) and qml.math.isclose(theta, 0):
        return []
    return [qml.RZ(theta, wires)]

The decomposition itself is correct, but it does not have the correct parameter-shift derivative at 0.
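
To make the failure mode concrete: param_shift differentiates by evaluating the tape at shifted parameter values, so if the tape recorded at theta = 0 contains no RZ at all, nothing gets shifted and the derivative comes out as 0, even when the true derivative is nonzero. A minimal sketch (circuit and observable chosen for illustration):

import pennylane as qml
from pennylane import numpy as pnp

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, diff_method="parameter-shift")
def circuit(theta):
    qml.Hadamard(wires=0)
    qml.RZ(theta, wires=0)  # the general decomposition keeps this gate
    return qml.expval(qml.PauliY(0))

# <Y> = sin(theta), so the true derivative at theta = 0 is 1.
# A decomposition that drops RZ when theta == 0 would report 0 instead.
print(qml.grad(circuit)(pnp.array(0.0, requires_grad=True)))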

This looks like an independent bug to me, and like one that could be hiding across the codebase for other ops as well, theoretically.

dwierichs avatar May 03 '24 07:05 dwierichs

@Tarun-Kumar07 Sorry for taking so long with this! We decided to move ahead with this PR as you originally drafted it (with grad_method=None). Before this PR can be merged, though, the fix in #5774 needs to go in first. I took the liberty of using a test you wrote here for that PR; your work is much appreciated! :pray:

dwierichs avatar May 31 '24 14:05 dwierichs

Hey @dwierichs, once PR #5774 is merged, I will revert the changes back to grad_method=None. Thanks for updating :)

Tarun-Kumar07 avatar May 31 '24 15:05 Tarun-Kumar07

@Tarun-Kumar07 It is merged :) I took the liberty of merging the master branch into your branch, updating the tests you wrote (and that made it into #5774) accordingly.

dwierichs avatar Jun 13 '24 00:06 dwierichs

Hi @Tarun-Kumar07, do you know when you might have time to get back to this? :) Or would you prefer us to finalize it?

dwierichs avatar Jul 24 '24 12:07 dwierichs

Hi @dwierichs,

I am currently tied up until August 17th. After that, I will be able to work on this. If this is a priority, I suggest assigning someone else to take over in the meantime.

Thank you for understanding.

Tarun-Kumar07 avatar Jul 24 '24 15:07 Tarun-Kumar07

Thanks @Tarun-Kumar07. That is very understandable, of course! I'll try and push your PR through, as it is close to finished anyway, I think. Thank you for your contribution, looking forward to the next one :star_struck:

dwierichs avatar Jul 25 '24 13:07 dwierichs

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 99.65%. Comparing base (20eed81) to head (a7d2aeb).

Additional details and impacted files
@@           Coverage Diff            @@
##           master    #5620    +/-   ##
========================================
  Coverage   99.65%   99.65%            
========================================
  Files         430      430            
  Lines       41505    41210   -295     
========================================
- Hits        41362    41069   -293     
+ Misses        143      141     -2     


codecov[bot] avatar Jul 28 '24 10:07 codecov[bot]

Thank you so much for this contribution @Tarun-Kumar07!! 🚀

Alex-Preciado avatar Jul 30 '24 17:07 Alex-Preciado