
[ENH] Introducing inverse_transform into StandardScaler

Open dguijo opened this issue 1 year ago • 3 comments

What does this implement/fix? Explain your changes.

Adds the ability to apply the inverse transform, recovering the original representation of the data.

Does your contribution introduce a new dependency? If yes, which one?

Nope

dguijo avatar Feb 20 '24 10:02 dguijo

Thank you for contributing to aeon

I have added the following labels to this PR based on the changes made: [ $\color{#41A8F6}{\textsf{transformations}}$ ]. Feel free to change these if they do not properly represent the PR.

The Checks tab will show the status of our automated tests. You can click on individual test runs in the tab or "Details" in the panel below to see more information if there is a failure.

If our pre-commit code quality check fails, any trivial fixes will automatically be pushed to your PR unless it is a draft.

Don't hesitate to ask questions on the aeon Slack channel if you have any.

aeon-actions-bot[bot] avatar Feb 20 '24 10:02 aeon-actions-bot[bot]

Hello, Matt: there is some point in having the inverse transform, but maybe not with the way this function is designed at the moment.

Imagine that you have created new synthetic patterns from the training set after it was already standardised, and you want to get the original representation of these synthetic patterns to do some plotting or explainability, or whatever. For this, you need the inverse transform.

However, this can't be achieved with the transformer as it currently works, because it fits a StandardScaler per time series (n_channels x n_timepoints) instead of per channel (n_timeseries x n_timepoints).

It can be used in time series reconstruction (get the original representation of the reconstructed time series), time series averaging (get the original representation of the averaged time series), and so on. I must admit it is not a typical behaviour in standard machine learning tasks, but I find it useful :smile:.

So, for me, I think we have to give the current design some thought and decide whether the other one (one StandardScaler per channel) is equally useful, and then either: 1) have both of them, or 2) decide which one to keep.
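The difference between the two designs can be sketched in plain NumPy (a hypothetical illustration, not aeon's actual implementation; the array shapes and variable names are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical collection: (n_timeseries, n_channels, n_timepoints)
X = rng.normal(loc=5.0, scale=2.0, size=(10, 3, 50))

# Current design (as described above): one scaler per time series,
# i.e. separate statistics for every (series, channel) pair.
per_series_mean = X.mean(axis=2, keepdims=True)   # shape (10, 3, 1)

# Per-channel alternative: one set of statistics per channel, shared
# across the whole collection, so it can be stored once at fit time
# and applied to *new* (e.g. synthetic) series for the inverse.
mean = X.mean(axis=(0, 2), keepdims=True)         # shape (1, 3, 1)
std = X.std(axis=(0, 2), keepdims=True)

X_scaled = (X - mean) / std
X_back = X_scaled * std + mean
print(np.allclose(X, X_back))  # True
```

With per-series statistics there is no single stored mean/std to invert a new synthetic pattern with, which is the limitation described above.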

Thanks!

dguijo avatar Feb 22 '24 08:02 dguijo

I think to do this properly you would have to introduce a `_fit` method and change the `fit_is_empty` tag (perhaps dynamically, like you are doing with inverse transform?).
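A minimal sketch of the idea, with a real fit step that stores the statistics needed for inversion (a hypothetical standalone class for illustration, not the aeon `BaseTransformer` API or its tag mechanism):

```python
import numpy as np

class ChannelStandardScaler:
    """Hypothetical per-channel scaler with a non-empty fit.

    Because the statistics are computed and stored in fit (so fit is
    no longer empty), inverse_transform can later recover the original
    representation, including for new series such as synthetic patterns.
    """

    def fit(self, X):
        # X: (n_timeseries, n_channels, n_timepoints)
        self.mean_ = X.mean(axis=(0, 2), keepdims=True)
        self.std_ = X.std(axis=(0, 2), keepdims=True)
        return self

    def transform(self, X):
        return (X - self.mean_) / self.std_

    def inverse_transform(self, X):
        return X * self.std_ + self.mean_

rng = np.random.default_rng(42)
X = rng.normal(3.0, 1.5, size=(8, 2, 30))
scaler = ChannelStandardScaler().fit(X)
Xt = scaler.transform(X)
print(np.allclose(scaler.inverse_transform(Xt), X))  # True
```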

MatthewMiddlehurst avatar Mar 04 '24 13:03 MatthewMiddlehurst

@dguijo closing this if it's ok; it will now be covered here: https://github.com/aeon-toolkit/aeon/issues/2075

TonyBagnall avatar Nov 01 '24 20:11 TonyBagnall