Adds `gradient_squared` method to `FiniteVolume`
Description
Fixes #2979
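For context, this is roughly the shape of the new method — a minimal sketch assuming it follows the pattern of PyBaMM's other spatial methods; the body shown here is an illustration, not necessarily the exact code added in this PR:

```python
# A minimal sketch, assuming gradient_squared reuses the existing gradient
# discretisation; this mirrors how other spatial methods in PyBaMM expose
# grad(u) . grad(u), but is not necessarily the code added in this PR.
def gradient_squared(self, symbol, discretised_symbol, boundary_conditions):
    """Discretise grad(u) . grad(u) by squaring the discretised gradient."""
    grad = self.gradient(symbol, discretised_symbol, boundary_conditions)
    return grad ** 2
```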
Type of change
Please add a line in the relevant section of CHANGELOG.md to document the change (include PR #) - note reverse order of PR #s. If necessary, also add to the list of breaking changes.
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Optimization (back-end change that speeds up the code)
- [ ] Bug fix (non-breaking change which fixes an issue)
Key checklist:
- [x] No style issues: `$ pre-commit run` (or `$ nox -s pre-commit`) (see CONTRIBUTING.md for how to set this up to run automatically when committing locally, in just two lines of code)
- [ ] All tests pass: `$ python run-tests.py --all` (or `$ nox -s tests`)
- [ ] The documentation builds: `$ python run-tests.py --doctest` (or `$ nox -s doctests`)

You can run integration tests, unit tests, and doctests together at once, using `$ python run-tests.py --quick` (or `$ nox -s quick`).
Further checks:
- [ ] Code is commented, particularly in hard-to-understand areas
- [ ] Tests added that prove fix is effective or that feature works
Codecov Report
:x: Patch coverage is 7.14286% with 13 lines in your changes missing coverage. Please review.
:white_check_mark: Project coverage is 98.72%. Comparing base (df2dbbd) to head (48cfa9a).
:warning: Report is 1 commit behind head on develop.
| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/pybamm/spatial_methods/finite_volume.py | 7.14% | 13 Missing :warning: |
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##           develop    #4540      +/-   ##
===========================================
- Coverage    98.77%   98.72%    -0.05%
===========================================
  Files          321      321
  Lines        27717    27741       +24
===========================================
+ Hits         27377    27388       +11
- Misses         340      353       +13
```
:umbrella: View full report in Codecov by Sentry.
@medha-14 This PR should also add tests and use the `gradient_squared` function where appropriate.
I have written a test for the `gradient_squared` method (a self-contained sketch of the property it checks is below). Could you please advise on where I should incorporate its use?
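This is illustrative only, in plain NumPy/SciPy rather than PyBaMM's actual test harness; the mesh size `n`, spacing `dx`, and slope `a` are made-up values:

```python
# Illustrative only, not the actual PyBaMM test: for a linear profile
# u(x) = a * x, the finite volume gradient squared should equal a**2
# on every interior edge.
import numpy as np
from scipy import sparse

n, dx, a = 10, 0.1, 3.0
nodes = (np.arange(n) + 0.5) * dx    # cell centres of a uniform mesh
u = a * nodes                        # linear profile evaluated at the centres
grad = sparse.diags([-1.0, 1.0], [0, 1], shape=(n - 1, n)) / dx
np.testing.assert_allclose((grad @ u) ** 2, a**2)
```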
I noticed that in the `gradient_matrix` method above the lines you touched, we could try building the gradient matrix directly in sparse format instead of converting to it later, as a potential further improvement – by pre-allocating all indices at once and passing them to `scipy.sparse.block_diag` instead of the current Kronecker product. Part of the time there is spent in array creation alongside the matrix construction, so we might get a 2-4x improvement for larger matrices. Would you like to try that (not in this PR, but in a separate one)? I'll be able to help, should you need it.
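To make the idea concrete, here is a rough sketch of the two constructions for a uniform 1D mesh. This is one reading of the suggestion, under assumptions: the function names are hypothetical, `n` is the number of cells, `n_repeats` the number of copies along the secondary dimension, and the direct assembly shown here pre-allocates all indices into a single `scipy.sparse.coo_matrix` rather than going through `block_diag`:

```python
# Hypothetical sketch, not PyBaMM code: compare the Kronecker-product
# construction with direct COO assembly from pre-allocated indices.
import numpy as np
from scipy import sparse

def build_gradient_matrix_kron(n, n_repeats, dx):
    # Current-style approach: build one bidiagonal block, then replicate
    # it along the diagonal with a Kronecker product against an identity.
    sub = sparse.diags([-1.0, 1.0], [0, 1], shape=(n - 1, n)) / dx
    return sparse.kron(sparse.eye(n_repeats), sub).tocsr()

def build_gradient_matrix_coo(n, n_repeats, dx):
    # Suggested approach: pre-allocate all row/column indices at once and
    # assemble a single COO matrix, skipping the intermediate product.
    rows_block = np.repeat(np.arange(n - 1), 2)          # 0,0,1,1,...
    cols_block = rows_block + np.tile([0, 1], n - 1)     # 0,1,1,2,...
    vals_block = np.tile([-1.0 / dx, 1.0 / dx], n - 1)
    # Offset the block indices for each repeat along the secondary dimension.
    offsets_r = (np.arange(n_repeats) * (n - 1))[:, None]
    offsets_c = (np.arange(n_repeats) * n)[:, None]
    rows = (rows_block + offsets_r).ravel()
    cols = (cols_block + offsets_c).ravel()
    vals = np.tile(vals_block, n_repeats)
    shape = (n_repeats * (n - 1), n_repeats * n)
    return sparse.coo_matrix((vals, (rows, cols)), shape=shape).tocsr()
```

As a quick sanity check, `(build_gradient_matrix_kron(5, 3, 0.1) != build_gradient_matrix_coo(5, 3, 0.1)).nnz == 0` should hold; any speedup would come from skipping the intermediate arrays for larger `n` and would need to be benchmarked.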
@agriyakhetarpal, I would like to give this a go and explore the implementation further. I’d be grateful for any additional insights or recommendations you might have as I move forward.
Any recommendations on how to improve the tests I have written for this one, or anything else I should change?
Thanks, @medha-14! The code looks fine to me as long as coverage passes – I'm happy to approve it after that. I just have one additional comment.
I will prioritize fixing the coverage as soon as possible.
@medha-14 Is this still being worked on?
Yes, I’m still on it. I’ve had a bit of a gap but will be picking it up again.