Franck Charras
Here is a relevant gist of what could be a pytorch drop-in replacement for the `kneighbors` method: https://gist.github.com/fcharras/82772cf7651e087b3b91b99105a860dd Quoting myself from the [k-means thread](https://gist.github.com/fcharras/ce1f1df7d15675268827e1fb9b65265b): > to my knowledge the...
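To give a rough idea of what such a drop-in does, here is a minimal brute-force sketch (pairwise distances plus a top-k selection); this is only an illustration with made-up shapes, not the gist's actual implementation:

```python
# A minimal brute-force k-nearest-neighbors query in pytorch, illustrating the
# general idea only (the gist linked above handles chunking, devices, etc.).
import torch

def kneighbors(query, data, n_neighbors=5):
    # query: (n_queries, n_features), data: (n_samples, n_features)
    distances = torch.cdist(query, data)  # (n_queries, n_samples)
    # Smallest distances first, hence largest=False.
    return torch.topk(distances, n_neighbors, dim=1, largest=False)

data = torch.randn(1000, 16)
query = torch.randn(10, 16)
dist, idx = kneighbors(query, data, n_neighbors=5)
```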
Here is a gist with what I think is a rather comprehensive implementation of the Lloyd algorithm in pytorch: https://gist.github.com/fcharras/ce1f1df7d15675268827e1fb9b65265b Scroll down to the bottom of the file for a quick tester....
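For readers who just want the shape of the algorithm, here is a minimal sketch of a single Lloyd iteration in pytorch (the linked gist is far more complete; the shapes and number of clusters below are placeholders):

```python
# One Lloyd iteration: assign each sample to its nearest centroid, then move
# each centroid to the mean of its assigned samples.
import torch

def lloyd_step(X, centroids):
    # Assignment step: index of the nearest centroid for each sample.
    labels = torch.cdist(X, centroids).argmin(dim=1)
    # Update step: mean of the samples assigned to each centroid
    # (empty clusters keep their previous centroid).
    n_clusters = centroids.shape[0]
    new_centroids = torch.stack([
        X[labels == k].mean(dim=0) if (labels == k).any() else centroids[k]
        for k in range(n_clusters)
    ])
    return new_centroids, labels

X = torch.randn(500, 8)
centroids = X[torch.randperm(X.shape[0])[:4]].clone()
for _ in range(10):
    centroids, labels = lloyd_step(X, centroids)
```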
I came across this same issue today, did you happen to find a workaround?
There is actually an error in my initial snippet, in that it imports `NearestNeighbors` estimators before calling `patch_sklearn`; it should read:
```
import numpy as np
import sklearn

device =...
```
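For reference, here is a minimal self-contained sketch of the corrected ordering (the data and parameters below are placeholders, not the original snippet's):

```python
# `patch_sklearn` must be called before importing the estimators it is meant
# to patch, otherwise the stock scikit-learn implementations are used.
import numpy as np
from sklearnex import patch_sklearn

patch_sklearn()

# Import estimators only *after* patching, so the accelerated versions are picked up.
from sklearn.neighbors import NearestNeighbors

X = np.random.rand(1000, 16)
nn = NearestNeighbors(n_neighbors=5).fit(X)
distances, indices = nn.kneighbors(X[:10])
```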
So I found out that I had a version mismatch in the conda dependency tree when I didn't install everything from the `-c intel` channel. It does not change the performance...
I am a bit confused by the performance of the latest releases: should we look at `scikit-learn-intelex==2023.2.1` or `scikit-learn-intelex==20230725.122141`? It seems that the latter is more recent, but it has a different versioning...
I re-ran the scikit-learn-intelex kmeans benchmark while carefully installing `scikit-learn-intelex==2023.2.1` from pip, and it looks much better now :thinking:; the `20230725` build apparently comes from conda. (the sheet will be synchronized in...
Adding to @betatim:

> This issue made me wonder about converting from one namespace to another. Say from PyTorch to Numpy. This works:
>
> ```python
> x = array_api_compat.torch.asarray([1,2,3])...
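For reference, a sketch of the PyTorch-to-NumPy conversion pattern the quote refers to (the quoted snippet is truncated, so the calls below are an illustration rather than the original code):

```python
# Building an array through the array_api_compat torch namespace, then
# converting it to NumPy.
import numpy as np
import array_api_compat
import array_api_compat.torch

x = array_api_compat.torch.asarray([1, 2, 3])

# array_namespace recovers the compat namespace of an array at runtime.
xp = array_api_compat.array_namespace(x)

# A CPU torch tensor can be converted to NumPy with np.asarray
# (torch tensors expose __array__), or via DLPack with a recent NumPy.
x_np = np.asarray(x)
x_np_dlpack = np.from_dlpack(x)
```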
@GaelVaroquaux suggests `RandomProjection` to transform the current array into a dense array. The dimension of the dense array is a parameter, to be set low enough that the allocated memory...
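A minimal sketch of that idea, using scikit-learn's `SparseRandomProjection` as one possible concrete choice; the estimator, shapes, and `n_components` value are assumptions for illustration, since the comment only names `RandomProjection`:

```python
# Project a large sparse matrix onto a low-dimensional dense array whose width
# is chosen so that the result fits in memory.
import scipy.sparse as sp
from sklearn.random_projection import SparseRandomProjection

X_sparse = sp.random(10_000, 100_000, density=1e-4, format="csr", random_state=0)

# n_components sets the dimension of the dense output: keep it low enough that
# n_samples * n_components values fit comfortably in memory.
projector = SparseRandomProjection(n_components=256, dense_output=True, random_state=0)
X_dense = projector.fit_transform(X_sparse)  # shape (10_000, 256), dense ndarray
```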
Hello, I'm the contributor who proposed https://github.com/joblib/joblib/pull/1485, and I'd be interested to hear more feedback about real-world use cases for this kind of feature in joblib. Would you mind giving more...