Refactor: add transforms and transformed emulators
Closes #348.
Codecov Report
Attention: Patch coverage is 91.40271% with 38 lines in your changes missing coverage. Please review.
Project coverage is 78.11%. Comparing base (5e90347) to head (ef26951).
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##             main     #474      +/-   ##
==========================================
+ Coverage   77.51%   78.11%   +0.59%
==========================================
  Files         121      130       +9
  Lines        8758     9198     +440
==========================================
+ Hits         6789     7185     +396
- Misses       1969     2013      +44
```
Coverage report
This report was generated by python-coverage-comment-action
| File | Lines missing |
|------|---------------|
| autoemulate/experimental/emulators/__init__.py | |
| autoemulate/experimental/emulators/base.py | |
| autoemulate/experimental/emulators/neural_processes/conditional_neural_process.py | |
| autoemulate/experimental/emulators/transformed/base.py | 180-182, 351-352, 375-378, 383-388, 392-396 |
| autoemulate/experimental/transforms/__init__.py | |
| autoemulate/experimental/transforms/base.py | 35-36, 43-44 |
| autoemulate/experimental/transforms/pca.py | 53-55 |
| autoemulate/experimental/transforms/standardize.py | 45-47 |
| autoemulate/experimental/transforms/utils.py | 60, 67-75 |
| autoemulate/experimental/transforms/vae.py | 90, 118-122, 142-144 |
| tests/test_compare.py | |
| tests/experimental/conftest.py | |
| tests/experimental/test_experimental_conditional_neural_process.py | |
| tests/experimental/test_experimental_transformed.py | |
| tests/experimental/transforms/test_transforms.py | |
| Project Total | |
Following discussion:
- It would be good to double check the implications of using a torch `Transform` for a `VAE`, since the call and inverse methods do not guarantee $f^{-1}(f(x)) = x$ (see the sketch after this list)
- `make_positive_definite` might be useful to pass params to as well, but we can revisit this if issues arise in use
- Subclassing `ComposeTransform` to `AutoEmulateComposeTransform` might benefit the API, but this could potentially be considered as part of #531 if it seems useful in the top-level API?
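To illustrate the first point, here is a minimal sketch assuming only torch's `Transform` base class; the class and the encoder/decoder names are hypothetical, not autoemulate's API. A VAE encoder/decoder pair is a learned, lossy mapping, so round-tripping through the transform is not the identity:

```python
# Sketch only: why a VAE-backed torch Transform cannot guarantee
# f^{-1}(f(x)) == x. Names here are hypothetical.
from torch.distributions import Transform, constraints


class VAETransformSketch(Transform):
    """Wraps a trained VAE encoder/decoder pair as a torch Transform."""

    bijective = False  # the learned mapping is lossy, not an exact bijection
    domain = constraints.real_vector
    codomain = constraints.real_vector

    def __init__(self, encoder, decoder, cache_size=0):
        super().__init__(cache_size=cache_size)
        self.encoder = encoder  # x -> latent (e.g. the mean of q(z|x))
        self.decoder = decoder  # latent -> reconstruction x_hat

    def _call(self, x):
        return self.encoder(x)

    def _inverse(self, y):
        return self.decoder(y)


# With any trained pair, decoder(encoder(x)) only approximates x:
#   t = VAETransformSketch(vae_encode, vae_decode)
#   err = (x - t.inv(t(x))).abs().max()   # generally > 0
```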
One last minor thought I had was about whether we want the input args to the inverse transform methods to be `z` rather than `x` or `y`? It might just be easier to track what's what, but I am also more than happy to be told that's wrong.
Yeah I think this could definitely be clearer, perhaps the following refactor would be useful (a sketch follows the list):
- Since torch's `Transform` refers to `x` in the domain and `y` in the codomain, I think `AutoEmulateTransform` might be best kept consistent with this, updating the input API to `x` for call and `y` for inverses
- For the `TransformedEmulator`, since this works with `x` for emulator inputs and `y` for targets, using `z` to refer to inputs in the codomain of the transforms for `x` seems like a good option.
- Some of the method names in `TransformedEmulator` might be worth revisiting too (e.g. some refer to `y`, like `_inv_transform_y_mvn`, while the y transforms in the attribute are called `target_transforms`) - perhaps we could rename the `y` in method names to `target` for consistency. Or alternatively rename `target_transforms` to `y_transforms`.
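A rough sketch of how those conventions could look, with hypothetical signatures for illustration only (not the merged API):

```python
# Hypothetical signatures illustrating the naming proposal above;
# these are assumptions, not autoemulate's merged API.


class AutoEmulateTransform:
    """Follows torch's Transform convention: x in the domain, y in the codomain."""

    def _call(self, x):  # forward map: domain -> codomain
        ...

    def _inverse(self, y):  # inverse map: codomain -> domain
        ...


class TransformedEmulator:
    """x = emulator inputs, y = targets, z = x mapped into the codomain
    of the input transforms."""

    def _transform_x(self, x):  # returns z
        ...

    def _inv_transform_target_mvn(self, y):  # "target" rather than "y" in names
        ...
```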
@radka-j: thanks for the review comments! I've aimed to address these now and also more generally:
- Added docstrings for all `_` methods in `TransformedEmulator` and `AutoEmulateTransform`
- Renamed function arguments / attributes in `AutoEmulateTransform` and `TransformedEmulator`
- Renamed methods in `AutoEmulateTransform` for clarity
Thanks @radka-j!