Add tests for density evaluations
Steps
- [x] Write a good description of what the PR does.
- [x] Add tests for each unit of code added (e.g. function, class)
- [x] Update documentation
- [x] Squash commits that can be grouped together
- [x] Rebase onto master
Description
There is already a test that compares density evaluation with HORTON, but the density matrix used was an identity matrix. As an extra check, this PR compares the density evaluations for both spherical and Cartesian contractions using a random symmetric density matrix.
There were some suspicions that either the density evaluations or the orbital evaluations were giving wrong results. This PR confirms that the density evaluations are correct (at least in comparison to HORTON).
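A minimal sketch of the kind of check this PR adds. The random symmetric matrix construction here is illustrative only; in the actual test, this matrix would be passed to `evaluate_density` and the result compared against HORTON's output:

```python
import numpy as np

rng = np.random.default_rng(42)

# Build a random symmetric matrix to stand in for a density matrix
# (illustrative only -- a real density matrix has more structure).
n_basis = 5
mat = rng.uniform(-1.0, 1.0, size=(n_basis, n_basis))
density_matrix = (mat + mat.T) / 2

# Symmetry is the property the test relies on; the actual comparison is then
# along the lines of:
#   np.allclose(evaluate_density(density_matrix, basis, grid), horton_density)
assert np.allclose(density_matrix, density_matrix.T)
```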
Type of Changes
| Type | |
|---|---|
| ✓ | :bug: Bug fix |
Related Issue
Depends on PR #93.
Codecov Report
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 99.64%. Comparing base (03eff2c) to head (e04378d). Report is 172 commits behind head on master.
Additional details and impacted files
@@ Coverage Diff @@
## master #94 +/- ##
=======================================
Coverage 99.64% 99.64%
=======================================
Files 28 28
Lines 1408 1408
Branches 324 324
=======================================
Hits 1403 1403
Misses 4 4
Partials 1 1
:umbrella: View full report in Codecov by Sentry.
@maximilianvz I just invited you to the gbasis team; until you accept the invitation, I won't be able to assign you as a reviewer. Please review this PR after PR #140 is merged, and please commit any changes you believe are needed to approve it (instead of adding comments). Thanks.
@FarnazH, I have taken a look at this PR, and while there are obvious, easy changes to be made regarding `coord_types` as per PR #140, there seems to be a problem with the assertions here:
In the state of the repo at the time this PR was made, the assertions pass. In the repo's current state, however, `evaluate_density` returns zeros that weren't present at the time of this PR, causing the assertion to fail. For example, in this assertion (after making the appropriate API changes), `evaluate_density(density_matrix, cartesian_basis, grid_3d)` (the non-highlighted array) and `horton_density` (the highlighted array) have differing values at indices [18, 22, 23, 35, 40, 41, 42, 51, 52, 53, 60, 65, 66, 75, 76, 77, 80, 82, 100, 101, 105, 106, 107, 110]:
I'm guessing that, at some point in the 4.5 years since this PR was created, something about `evaluate_density` changed such that this assertion no longer passes. Why that is the case is beyond me, so I encourage others to look into it.
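For reference, mismatching indices like those quoted above can be located with a comparison along these lines (the arrays here are hypothetical stand-ins, not the actual test data):

```python
import numpy as np

# Hypothetical stand-ins for the computed and reference density arrays.
computed = np.array([0.0, 0.5, 0.0, 1.2])
reference = np.array([-0.3, 0.5, -0.1, 1.2])

# Indices where the two arrays disagree beyond a tolerance.
diff_idx = np.where(~np.isclose(computed, reference, atol=1e-8))[0]
print(diff_idx.tolist())  # [0, 2]: both mismatches have a negative reference value
```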
I looked into this and, if I checked correctly, this test fails because the correct output has negative values and we now set those to zero (see https://github.com/theochem/gbasis/blob/466ec1fe3c65c00ff6a09405ce3d8567a19fde61/gbasis/evals/density.py#L96). You can see in @maximilianvz's screenshot that all of those differing values are negative. Based on the PR, the reference data comes from HORTON, using a "random symmetric density matrix". I am not sure what "random" means here, i.e. whether the matrix is physically relevant; if a non-random symmetric density matrix were used, we would presumably not get these large negative values. As far as I know, the density cannot be negative, which is why we were filtering those numbers out of the output. @FarnazH, @PaulWAyers, any insight into this would be good, to decide whether we should keep this test or not.
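A sketch of the filtering described above, assuming it simply zeroes negative entries (the exact implementation lives in `gbasis/evals/density.py` at the line linked above; the values here are hypothetical):

```python
import numpy as np

# Hypothetical density values, including the negative entries that the
# HORTON reference data contains but gbasis now filters out.
density = np.array([0.2, -1e-3, 0.7, -0.5])

# Zero out negative values, as the linked line in density.py is described
# as doing; this is why the old assertion against HORTON no longer passes.
clamped = np.where(density < 0, 0.0, density)
print(clamped.tolist())  # [0.2, 0.0, 0.7, 0.0]
```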
I think the test should use a density matrix from a data file. We should also only test positive density values, though we should test that the correct behavior is observed for negative density values too.
The other choice is to build an ANO-centric example along the lines I recommended elsewhere.
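One way to implement the split comparison suggested above: mask to the points where the reference density is positive, and separately check that negative reference values come back clamped to zero (illustrative arrays and an assumed clamp-to-zero behavior, not the actual test data):

```python
import numpy as np

computed = np.array([0.2, 0.0, 0.7, 0.0])      # stand-in for evaluate_density output
reference = np.array([0.2, -1e-3, 0.7, -0.5])  # stand-in for HORTON reference values

positive = reference > 0
# Compare only where the reference density is positive...
assert np.allclose(computed[positive], reference[positive])
# ...and check the clamping behavior where it is negative.
assert np.all(computed[~positive] == 0.0)
```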
@marco-2023 there is an $\ce{HHe}$ test referred to here. It may be what you are looking for, so you may be able to use the same example, but I would use $\ce{HHe+}$, as $\ce{HHe}$ is sort of boring (and is harder, besides).