CAVE
MDS for Configurator footprint only with randomly sampled configurations
Please train the MDS mapping only on randomly sampled configurations (not on the ones optimized wrt EI). Of course, please still plot all configurations.
(As a reminder of what we discussed last week).
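To make the request concrete, the intended workflow would look roughly like this (a sketch only; `X_random`, `X_all` and the parameter values are made up, not actual CAVE code):

```python
import numpy as np
from sklearn.manifold import MDS

X_random = np.random.rand(100, 10)  # randomly sampled configurations (placeholder data)
X_all = np.random.rand(150, 10)     # all configurations, incl. the ones optimized w.r.t. EI

mds = MDS(n_components=2)
mds.fit(X_random)                # train the mapping on the random sample only...
# Y_all = mds.transform(X_all)   # ...then project *all* configurations for plotting
```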
sklearn's MDS does not provide a transform() method, though there is an open PR: https://github.com/scikit-learn/scikit-learn/pull/9834
I could try to work on the open PR in sklearn to fix this, but I will first look for another implementation or an alternative algorithm, since I have never committed to sklearn before.
Looking further into this revealed a more fundamental problem. The MDS algorithm reduces dimensionality by iteratively shifting points around until their pairwise distances approximately reproduce the distance matrix of the higher-dimensional space. Inserting new points afterwards is not easily feasible, since all points would have to be shifted again. We could, maybe, use a kernel function to approximate the mapping (following this idea).
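A rough sketch of what that could look like, using kernel ridge regression as a stand-in for whatever kernel-based mapping we end up with (names and hyperparameters here are illustrative, not the method from the linked idea):

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.kernel_ridge import KernelRidge

rng = np.random.RandomState(0)
X_random = rng.rand(100, 10)   # randomly sampled configurations (training set for the MDS)
X_all = rng.rand(150, 10)      # all configurations that should end up in the plot

# Embed only the randomly sampled configurations with MDS.
mds = MDS(n_components=2, random_state=0)
Y_random = mds.fit_transform(X_random)

# Approximate the (implicit) mapping X -> Y with a kernel regressor and
# apply it to every configuration, including the EI-optimized ones.
mapper = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-3)
mapper.fit(X_random, Y_random)
Y_all = mapper.predict(X_all)  # 2-d coordinates for plotting all configurations
```

That way the EI-optimized configurations would no longer influence the embedding itself; they would only be projected into it.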
Sounds interesting. Is an implementation available? If not, how much time do you expect you will need to implement it?
@shukon the paper mentioned was the basis for https://github.com/scikit-learn/scikit-learn/pull/9834 by @webdrone, which switches the method of computing MDS (no longer using SMACOF). You may find the code in that PR useful.