[WIP][Return-types #7] QNode integration with custom gradient and autograd
### Before submitting

Please complete the following checklist when submitting a PR:

- [ ] All new features must include a unit test. If you've fixed a bug or added code that should be tested, add a test to the test directory!
- [ ] All new functions and code must be clearly commented and documented. If you do make documentation changes, make sure that the docs build and render correctly by running `make docs`.
- [ ] Ensure that the test suite passes, by running `make test`.
- [ ] Add a new entry to the `doc/releases/changelog-dev.md` file, summarizing the change, and including a link back to the PR.
- [ ] The PennyLane source code conforms to PEP8 standards. We check all of our code against Pylint. To lint modified files, simply `pip install pylint`, and then run `pylint pennylane/path/to/file.py`.

When all the above are checked, delete everything above the dashed line and fill in the pull request template.
Context:
Description of the Change:
Benefits:
Possible Drawbacks:
Related GitHub Issues:
[sc-25813]
Hello. You may have forgotten to update the changelog!
Please edit doc/releases/changelog-dev.md with:
- A one-to-two sentence description of the change. You may include a small working example for new features.
- A link back to this PR.
- Your name (or GitHub username) in the contributors section.
Codecov Report
Merging #3041 (26d8324) into master (37a4024) will increase coverage by 0.00%. The diff coverage is 100.00%.
```diff
@@           Coverage Diff            @@
##           master    #3041    +/-  ##
========================================
  Coverage   99.69%   99.69%
========================================
  Files         275      275
  Lines       24012    24170   +158
========================================
+ Hits        23938    24097   +159
+ Misses         74       73     -1
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| pennylane/_grad.py | 100.00% <ø> (ø) | |
| pennylane/gradients/__init__.py | 100.00% <100.00%> (ø) | |
| pennylane/gradients/finite_difference.py | 100.00% <100.00%> (ø) | |
| pennylane/gradients/parameter_shift.py | 100.00% <100.00%> (ø) | |
| pennylane/gradients/vjp.py | 100.00% <100.00%> (ø) | |
| pennylane/interfaces/autograd.py | 100.00% <100.00%> (ø) | |
| pennylane/interfaces/execution.py | 100.00% <100.00%> (ø) | |
| pennylane/qnode.py | 100.00% <100.00%> (ø) | |
| pennylane/math/utils.py | 100.00% <0.00%> (+0.88%) | :arrow_up: |
@eddddddy @antalszava It is ready for a new round of reviews!
Two things I could not solve are:
- @antalszava Why NumPy arrays with shape `()` are converted to float in the single-measurement, single-parameter case. I would appreciate your input here.
- @eddddddy The sum with `axis=0` is not working with ArrayBox and differentiation, see my comment above. I could not simplify `compute_vjp` further.

Also, please go through the open comments and close the ones where my answer was sufficient.
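On the shape-`()` question, one possible culprit worth checking is plain NumPy behavior rather than anything PennyLane-specific: many reductions and indexing operations on a 0-d array return a NumPy scalar (`np.float64`) instead of a 0-d `ndarray`. A minimal sketch of that behavior (pure NumPy; the actual conversion in the QNode pipeline may happen elsewhere):

```python
import numpy as np

x = np.array(1.5)  # 0-d array, shape ()
print(x.shape)     # ()

# Reductions on a 0-d array return a NumPy scalar, not a 0-d array
s = np.sum(x)
print(type(s))                    # <class 'numpy.float64'>
print(isinstance(s, np.ndarray))  # False

# Indexing with an empty tuple also unwraps to a scalar
print(type(x[()]))                # <class 'numpy.float64'>
```

If a reduction like this sits on the single-measurement, single-parameter code path, the `()`-shaped result would silently become a float.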
[sc-28429]