James Chapman

Showing 100 comments by James Chapman

Yep you're right!
```
def main():
    from sklearn.pipeline import Pipeline
    from mvlearn.compose import SimpleSplitter
    from mvlearn.embed import MCCA

    n_samples = 100
    n_features = [2, 3, 4]
    rng = np.random.RandomState(0)
    Xs...
```
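For context, here is a minimal, self-contained sketch of the kind of pipeline the truncated snippet above appears to build. The argument names (SimpleSplitter(n_features), MCCA(n_components=...)) and the shape of the output are assumptions based on mvlearn's public API rather than on the original comment, and may differ between versions.

```python
# Hedged sketch: wiring mvlearn's SimpleSplitter and MCCA into an sklearn Pipeline.
import numpy as np
from sklearn.pipeline import Pipeline
from mvlearn.compose import SimpleSplitter
from mvlearn.embed import MCCA

n_samples = 100
n_features = [2, 3, 4]
rng = np.random.RandomState(0)

# Simulate three views and stack them column-wise: an sklearn Pipeline expects
# a single 2D array, so the splitter re-creates the views inside the pipeline.
Xs = [rng.normal(size=(n_samples, p)) for p in n_features]
X_stacked = np.hstack(Xs)

pipeline = Pipeline([
    ("splitter", SimpleSplitter(n_features)),  # split the columns back into views
    ("mcca", MCCA(n_components=2)),            # multiview CCA on those views
])
scores = pipeline.fit_transform(X_stacked)
print(np.asarray(scores).shape)  # per-view canonical scores
```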

First! I should have added: great work on this package :) On inverse retraction, a couple of examples (though I must admit I didn't know it wasn't a standard method...

In principle I'm also happy to take a look at implementing the invretr myself for the ones I'm interested in, and can put in a PR if it works for...
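To illustrate what an inverse retraction does, here is a generic numerical sketch (not code from the package under discussion): on the unit sphere with the projection retraction R_x(v) = (x + v)/||x + v||, the inverse retraction maps a point y back to the tangent vector at x that retracts onto it. The function names below are purely illustrative.

```python
import numpy as np

def retr(x, v):
    """Projection retraction on the unit sphere: move from x along tangent vector v."""
    y = x + v
    return y / np.linalg.norm(y)

def invretr(x, y):
    """Inverse of the projection retraction: the tangent v at x with retr(x, v) == y."""
    return y / (x @ y) - x

rng = np.random.default_rng(0)
x = rng.normal(size=5)
x /= np.linalg.norm(x)          # a point on the sphere
v = rng.normal(size=5)
v -= (x @ v) * x                # project onto the tangent space at x

y = retr(x, v)
v_rec = invretr(x, y)
print(np.allclose(v_rec, v))        # True: the inverse recovers the tangent vector
print(np.isclose(x @ v_rec, 0.0))   # True: the result lies in the tangent space at x
```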

@W-L-W Was about to raise an issue describing our thoughts on visualisations. Then I discovered that the previous time I thought about this, @LegrandNico raised biplots. There's definitely value in this. As discussed...

I have actually moved the explained variance out of the BaseModel class (it is available to models with 'PLS'-style constraints). This is because the CCA models in this package...

In fact this strikes me as a good idea. And I agree with @JohannesWiesner's description of explained variance, having been through a similar struggle in the literature a while back...

This will be tacked onto BaseModel in the current version of the package:
```
def explained_variance(self, views: Iterable[np.ndarray]):
    """
    Calculates the explained variance for each latent dimension.

    Parameters
    ----------
    views...
```
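Since the docstring above is cut off, here is a hedged, standalone sketch of one way per-dimension explained variance can be computed for a multi-view model from its weight matrices. It only illustrates the idea; the helper name and the normalisation are assumptions, not the package's actual implementation.

```python
from typing import Iterable, List

import numpy as np


def explained_variance_sketch(weights: List[np.ndarray],
                              views: Iterable[np.ndarray]) -> np.ndarray:
    """Variance captured by each latent dimension, per view.

    weights : list of (n_features_i, k) weight matrices, one per view
    views   : list of (n_samples, n_features_i) data matrices
    Returns an array of shape (n_views, k).
    """
    out = []
    for W, X in zip(weights, views):
        Xc = X - X.mean(axis=0)             # centre each view
        Wn = W / np.linalg.norm(W, axis=0)  # unit-norm projection directions
        scores = Xc @ Wn                    # (n_samples, k) projections
        out.append(scores.var(axis=0))      # variance along each latent dimension
    return np.asarray(out)
```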

Have now added explained_variance, explained_variance_ratio and explained_variance_cumulative, as well as explained_covariance, explained_covariance_ratio and explained_covariance_cumulative, in the latest release. This has been achieved by adding a new class attribute 'loadings' which are...
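As a rough usage sketch of the methods listed above: the import path, model class and constructor argument name below are assumptions about the current cca-zoo API and may differ between releases; only the method names come from the comment itself.

```python
import numpy as np
from cca_zoo.linear import MCCA  # import path is an assumption; older releases used a different module

rng = np.random.default_rng(0)
views = [rng.standard_normal((100, p)) for p in (4, 6, 8)]

# latent_dimensions is an assumed constructor argument name.
model = MCCA(latent_dimensions=2).fit(views)

print(model.explained_variance(views))             # per view, per latent dimension
print(model.explained_variance_ratio(views))       # normalised by total view variance
print(model.explained_variance_cumulative(views))  # running total over dimensions
print(model.explained_covariance_cumulative(views))
```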

Thanks for the encouragement to fix this up @WantongLi123! It's a nice change that I think has a lot of value in post-hoc analysis of these kinds of models.

Apologies for never having responded! Must have caught me at a busy time. Not done anything myself; as ever, I welcome any contributions that are of practical use to people :)