Adding exception classes
Before submitting a pull request (PR), please read the contributing guide.
Description
Begin to implement custom exceptions, as mentioned in brainglobe/brainreg#82, starting with a custom exception for failed file loading. Currently it checks whether the user is trying to pass a directory containing a single .tiff (e.g. a 3D .tiff), in which case it gives a message to pass the path including the filename. Otherwise, it prints the original traceback with an additional line indicating that the problem is with file loading.
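For context, a minimal sketch of the approach described above; the names (`LoadFileException`, `is_single_tiff_dir`, `load_with_custom_error`, the `loader` callable) are illustrative and not necessarily the identifiers used in this PR:

```python
from pathlib import Path


class LoadFileException(Exception):
    """Raised when brainreg fails to load the input image data.

    Keeps a reference to the original error so its traceback is
    preserved, while adding a message pointing at file loading.
    """

    def __init__(self, base_error=None, message=None):
        self.base_error = base_error
        super().__init__(
            message
            or "File loading failed. Ensure all image files contain the "
            "same number of pixels. See the full traceback above."
        )


def is_single_tiff_dir(path):
    """Return True if `path` is a directory holding exactly one .tif/.tiff."""
    path = Path(path)
    if not path.is_dir():
        return False
    tiffs = [p for p in path.iterdir() if p.suffix.lower() in (".tif", ".tiff")]
    return len(tiffs) == 1


def load_with_custom_error(path, loader):
    """Call `loader(path)` and translate failures into LoadFileException."""
    if is_single_tiff_dir(path):
        raise LoadFileException(
            message=f"{path} is a directory containing a single .tiff file. "
            "If this is a 3D .tiff, pass the path including the filename."
        )
    try:
        return loader(path)
    except Exception as err:
        raise LoadFileException(base_error=err) from err
```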
Some points to discuss:
- It could be worth pre-checking inputs and raising an error there, rather than waiting to fail on file loading, although this might come with a performance penalty.
- In the traceback, all attempted image files could be loaded, e.g. with PIL (this requires a new dependency), to check their size; this way a tailored traceback can be given (a rough sketch follows this list).
- A couple of functions in this class could be moved to a more general utils module.
- It would be useful to have a config of accepted file extensions (e.g. .tif, .tiff).
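To illustrate the PIL idea from the second point above (a sketch only; `report_image_sizes` is a hypothetical helper, and this would add Pillow as a dependency):

```python
from pathlib import Path

from PIL import Image  # would add Pillow as a new dependency


def report_image_sizes(image_dir):
    """Collect the (width, height) of each .tif/.tiff in a directory.

    If the sizes differ, the returned mapping can be used to build a
    tailored error message naming the offending file(s).
    """
    sizes = {}
    for path in sorted(Path(image_dir).glob("*.tif*")):
        with Image.open(path) as im:
            sizes[path.name] = im.size  # (width, height) in pixels
    if len(set(sizes.values())) > 1:
        return sizes
    return {}
```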
What is this PR
- [ ] Bug fix
- [x] Addition of a new feature
- [ ] Other
Why is this PR needed? See issue brainglobe/brainreg#82.
What does this PR do? Implements a custom exception class for file-loading errors.
References
Please reference any existing issues/PRs that relate to this PR. brainglobe/brainreg#82
How has this PR been tested?
pytest suite
Please explain how any new code has been tested, and how you have ensured that no existing functionality has changed. pytest suite, and the new code is only called when file loading fails (i.e. when there is already an issue)
Is this a breaking change?
No
If this PR breaks any existing functionality, please explain how and why. NA
Does this PR require an update to the documentation?
No
If any features have changed or have been added, please explain how the documentation has been updated (and link to the associated PR).
Checklist:
- [x] The code has been tested locally
- [ ] Tests have been added to cover all new functionality (unit & integration)
- [ ] The documentation has been updated to reflect any changes
- [x] The code has been formatted with pre-commit
Kudos, SonarCloud Quality Gate passed! 
0 Bugs
0 Vulnerabilities
0 Security Hotspots
0 Code Smells
No Coverage information
0.0% Duplication
@JoeZiminski is this ready to review?
Codecov Report
Base: 92.74% // Head: 92.97% // Increases project coverage by +0.22% :tada:
Coverage data is based on head (333da9c) compared to base (701fa7f). Patch coverage: 100.00% of modified lines in pull request are covered.
Additional details and impacted files
@@ Coverage Diff @@
## master #93 +/- ##
==========================================
+ Coverage 92.74% 92.97% +0.22%
==========================================
Files 13 14 +1
Lines 524 555 +31
==========================================
+ Hits 486 516 +30
- Misses 38 39 +1
| Impacted Files | Coverage Δ | |
|---|---|---|
| brainreg/exceptions.py | 100.00% <100.00%> (ø) | |
| brainreg/main.py | 100.00% <100.00%> (ø) | |
| brainreg/utils/preprocess.py | 95.83% <0.00%> (-4.17%) | :arrow_down: |
| brainreg/backend/niftyreg/run.py | 94.44% <0.00%> (ø) | |
| brainreg/cli.py | 98.80% <0.00%> (+0.07%) | :arrow_up: |
> @JoeZiminski is this ready to review?
Hey @adamltyson, nearly ready now. The outstanding points are:
- I updated the tests, but test coverage is still reduced. Do you know why this might be? (Though for some reason this seems to have disappeared from the tests below, so maybe it's okay.)
- In the tests (test_exceptions.py, get_default_brainreg_args()), the syntax is very close to what you already have in fixture format, but because only the path changes I use a function here. There might be a nice way to combine these (a possible approach is sketched below), but maybe the use cases are too distinct (since we are expecting an error, we don't really care about the other args).

If these points are more suitable to address at the 'ready to review' stage, I'm happy to mark it ready now, as I don't think much more code needs to be added.
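One hypothetical way to combine them is a factory fixture, so only the path changes per test; the name `make_brainreg_args` and the argument values below are illustrative, not the actual fixture in the test suite:

```python
import pytest


@pytest.fixture
def make_brainreg_args(tmp_path):
    """Factory fixture returning default brainreg args with an overridable input path."""

    def _make(input_path=None, **overrides):
        args = {
            "image_path": str(input_path or tmp_path / "input"),
            "output_dir": str(tmp_path / "output"),
            "voxel_sizes": [50, 50, 50],
            "orientation": "psl",
        }
        args.update(overrides)
        return args

    return _make


def test_exception_args(make_brainreg_args):
    args = make_brainreg_args("tests/data/input/exceptions/one_2d_tiff")
    # the real test would run brainreg with these args and assert that
    # the custom file-loading exception is raised
    assert args["image_path"].endswith("one_2d_tiff")
```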
> I updated the tests, but test coverage is still reduced. Do you know why this might be? (Though for some reason this seems to have disappeared from the tests below, so maybe it's okay.)
The comment from the bot suggests that the coverage is no longer being reported, so we don't know what it is:
> ❗ Current head https://github.com/brainglobe/brainreg/commit/2a629a88feeeb0522d2fae1449439c2068eed90c differs from pull request most recent head https://github.com/brainglobe/brainreg/commit/8dc3b963b6431e7edf8ec53ccf79b662d4a12237. Consider uploading reports for the commit https://github.com/brainglobe/brainreg/commit/8dc3b963b6431e7edf8ec53ccf79b662d4a12237 to get more accurate results.
> In the tests (test_exceptions.py, get_default_brainreg_args()), the syntax is very close to what you already have in fixture format, but because only the path changes I use a function here. There might be a nice way to combine these, but maybe the use cases are too distinct (since we are expecting an error, we don't really care about the other args).
If you can reuse the fixtures, I would, but don't worry if it's a hassle.
I'll review properly once it's ready, but one thing I noticed: the new tests are in test_integration. These aren't really integration tests (i.e. testing the whole run of the software). Could you put them in a new directory, `test_unit`?
I've left a couple of comments. I also think that the `test_unit` dir should contain an `__init__.py` file. I seem to remember this was needed for reporting code coverage. We should be consistent anyway (`test_integration` has it).
Thanks, I've pushed a commit fixing those issues. I'll check out that failing test case; I'm unsure why that is happening.
In the dataset used in the tests, when the single tiff is loaded, imio does not fail, but the size of the output == 0 (which is checked). In the tutorial dataset, the image is loaded successfully as a (1, 154, 217) array. Would it be sensible to a) check for cases where the first dimension == 1 and also raise an error, or b) try/except around niftyreg, and then in the exception class check whether the first dimension == 1? I guess the first option is neater, but maybe there are cases where we want to load a single-dimension tiff?
I'd go with the first option. brainreg is a 3D registration library; I'm happy not to try to support 2D data in this way.
Added the single-dimension case as `np.min(target_brain.shape) == 1`, as I didn't want to assume the singleton is on the first dimension; I guess we never want a case where a dimension has N = 1 anyway. But if the volume dimension is always in the first position, it might be neater just to check that explicitly, `target_brain.shape[0] == 1`. It turns out that with the Allen 100 um atlas the files load with array size == 0, but for the Allen 50 um they load as a 3D array (one singleton dimension), I guess because of the scaling.
For the new test case, I used the Allen 50 um atlas to make sure both conditions are tested. However, I suppose this adds another test dependency (downloading the 50 um atlas).
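A minimal sketch of the check described above, assuming `target_brain` is the loaded numpy array; the function name and error type are illustrative, not the actual brainreg code:

```python
import numpy as np


def check_loaded_volume(target_brain):
    """Raise if the loaded image is empty or effectively 2D."""
    target_brain = np.asarray(target_brain)
    if target_brain.size == 0:
        # e.g. the Allen 100 um case mentioned above
        raise ValueError("Loaded image is empty (size == 0).")
    if np.min(target_brain.shape) == 1:
        # dimension-agnostic singleton check; the stricter alternative
        # discussed above would be `target_brain.shape[0] == 1`
        raise ValueError(
            "Loaded image has a singleton dimension; brainreg needs 3D data."
        )
```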
@JoeZiminski when I run these with example data, I don't think I get the correct errors:
`brainreg tests/data/input/exceptions/one_2d_tiff/loads_1_dim/image_0000.tif ~/Desktop -v 50 50 50 --orientation psl`
gives:
> File at tests/data/input/exceptions/one_2d_tiff/loads_1_dim/image_0000.tif failed to load. Ensure all image files contain the same number of pixels. Full traceback above.
and
`brainreg tests/data/input/exceptions/one_2d_tiff/loads_no_dim/ ~/Desktop -v 50 50 50 --orientation psl`
produces a NiftyReg error.
As discussed, this now just checks whether the user is attempting to load a directory containing a single file; if so, an error is raised.
Otherwise, imio is used to try to load the data. If this fails, the user is informed that this is where it failed, and told to pass a folder of .tiff files in which all images contain the same number of pixels.
Hopefully this will avoid the issues above and be much more robust. I tested this on the files I have, but please let me know if anything doesn't look right, or if the exception messages are not clear.
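A rough sketch of that flow, assuming `imio.load_any` is the loader used (check the actual call in brainreg) and with an illustrative function name and error message:

```python
from pathlib import Path

import imio  # the loader brainreg already depends on


def safe_load(image_path):
    """Pre-check for a single-file directory, then try to load with imio."""
    path = Path(image_path)
    if path.is_dir() and len(list(path.iterdir())) == 1:
        raise ValueError(
            f"{path} contains a single file. If this is a 3D image, "
            "pass the full path to the file itself."
        )
    try:
        # imio.load_any is assumed here; check the call actually used in brainreg
        return imio.load_any(str(path))
    except Exception as err:
        raise ValueError(
            f"File at {path} failed to load. Pass a folder of .tiff files "
            "in which all images contain the same number of pixels. "
            "Full traceback above."
        ) from err
```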
Currently there are two functions in exceptions.py that could probably go somewhere else; I think we did discuss putting them elsewhere in the brainglobe ecosystem.
Thanks @JoeZiminski.
At some point I think these functions could go into imio, but for now, I'm happy to merge.