FeatureTransforms.jl

Can this be integrated with MLJ?

PyDataBlog opened this issue 3 years ago • 12 comments

It would be really neat if this idea were integrated into the MLJ ecosystem in some form, so Julia could have something like what Python has with scikit-learn: custom transformers supported by the wider ML ecosystem. Is this possible?

PyDataBlog avatar May 25 '21 06:05 PyDataBlog

Hope not. MLJ is pretty heavy; we need a lighter alternative.

xiaodaigh avatar Jul 17 '21 10:07 xiaodaigh

> Hope not. MLJ is pretty heavy; we need a lighter alternative.

Which part do you want stripped?

PyDataBlog avatar Jul 17 '21 10:07 PyDataBlog

With MLJ, I need to make a machine before I can do OneHotEncoding, so it's too much boilerplate for my liking.
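For context, the boilerplate being described looks roughly like this (a minimal sketch assuming MLJ's standard machine workflow; the toy table and column names are purely illustrative):

```julia
using MLJ  # re-exports OneHotEncoder, machine, fit!, transform, coerce, Multiclass

# toy table; the :grade column needs a categorical scitype for OneHotEncoder to act on it
X = coerce((grade = ["A", "B", "A"], score = [1.0, 2.0, 3.0]),
           :grade => Multiclass)

# the "machine" boilerplate: wrap model + data, fit, then transform
hot  = OneHotEncoder()
mach = machine(hot, X)
fit!(mach)
Xt = MLJ.transform(mach, X)
```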

xiaodaigh avatar Jul 17 '21 10:07 xiaodaigh

> With MLJ, I need to make a machine before I can do OneHotEncoding, so it's too much boilerplate for my liking.

ML is rarely done in a silo; it is mostly an end-to-end pipeline. Who would want to use OHE on its own from an ML library? So it makes sense to tie everything together, I think.

PyDataBlog avatar Jul 17 '21 11:07 PyDataBlog

I see, you are meant to pipeline with a machine.

xiaodaigh avatar Jul 17 '21 11:07 xiaodaigh

> I see, you are meant to pipeline with a machine.

Exactly! That design logic flows through most ML libraries now. This issue would complete MLJ as a fully-fledged, end-to-end ML library. Imagine shipping one binary pipeline object to a production system, like everyone does with Scikit-Learn, instead of multiple objects from separate feature-engineering steps.
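As a rough sketch of that idea, assuming MLJ's `Pipeline` constructor and machine serialization via `MLJ.save`; the data, column names, and choice of classifier are purely illustrative:

```julia
using MLJ

# toy data purely for illustration
X = coerce((grade = ["A", "B", "A", "B"], score = [1.0, 2.0, 3.0, 4.0]),
           :grade => Multiclass)
y = coerce(["yes", "no", "yes", "no"], Multiclass)

# requires the DecisionTree interface package to be installed
Tree = @load DecisionTreeClassifier pkg=DecisionTree

# feature engineering + model as one composite object
pipe = Pipeline(OneHotEncoder(), Standardizer(), Tree())
mach = machine(pipe, X, y)
fit!(mach)

# ship a single artifact, analogous to pickling a sklearn Pipeline
MLJ.save("pipeline.jls", mach)
mach2 = machine("pipeline.jls")  # restore elsewhere
predict(mach2, X)
```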

PyDataBlog avatar Jul 17 '21 11:07 PyDataBlog

I think the issue of having pipelines here is separate from the issue of integrating with MLJ.jl. I personally find it more attractive to implement pipelines here as a standalone concept. MLJ.jl could then see if there is value in refactoring or supporting the pipelines from here.

juliohm avatar Oct 25 '21 10:10 juliohm

Also, from a community standpoint, it is much nicer to focus efforts on transforms in a separate hub, detached from the huge MLJ.jl ecosystem, which is already hard to follow even for experienced Julia programmers. If someone wants to add a new transform here, it is easy. To do that in MLJ.jl, the person would first have to find out which package is the appropriate one, which API should be implemented, etc.
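To make "adding a transform is easy" concrete, here is an illustrative sketch. The public `apply` verb and the internal `_apply` hook are assumptions based on the package's documented pattern of subtyping `Transform`; check the FeatureTransforms.jl docs for the exact extension signature:

```julia
import FeatureTransforms

# Hypothetical custom transform: clamp values to a range.
# NOTE: the `_apply` extension hook below is assumed from the package's
# documented pattern of subtyping `Transform`; verify the exact signature
# against the docs before relying on this.
struct ClampTo <: FeatureTransforms.Transform
    lo::Float64
    hi::Float64
end

FeatureTransforms._apply(x, t::ClampTo; kwargs...) = clamp.(x, t.lo, t.hi)

# usage via the package's public apply verb
FeatureTransforms.apply([0.1, 5.0, -3.0], ClampTo(0.0, 1.0))
```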

juliohm avatar Oct 25 '21 10:10 juliohm

The points raised by @juliohm are spot on and largely why this package exists.

  1. MLJ is a sprawling ecosystem that our internal codebase is not set up to adopt. Refactoring our code for MLJ and training our researchers to use it would take substantially more effort, with far less certain benefit.
  2. We wanted to make something lightweight that users could extend and adapt to suit their own use-cases (see #102). This was also motivated by the need for our own internal feature-engineering packages to extend the API easily.
  3. We want to interface this with our other packages, like FeatureDescriptors.jl and AxisSets.jl, which are starting to form Invenia's "feature-engineering ecosystem", an alternative (at least for us) to the hegemony of MLJ.

That being said, because of (2), MLJ should be able to extend this API or integrate it into its packages.

glennmoy avatar Oct 25 '21 12:10 glennmoy

FWIW, it looks like we could support that entire ecosystem by:

  1. Depending only on MLJModelInterface.jl, which is a pretty minimal package
  2. Putting the MLJModelInterface.Unsupervised wrappers in a separate submodule

I don't think we'd need to change anything else about how our package works.
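A rough sketch of what such a wrapper submodule might contain, assuming MLJModelInterface's contract for `Unsupervised` models (`fit` returning `(fitresult, cache, report)` and `transform` applying the fitresult to new data); the module and type names below are hypothetical:

```julia
# Sketch only: none of these names exist in FeatureTransforms.jl today.
module MLJInterface

import MLJModelInterface as MMI
import FeatureTransforms

# Hypothetical wrapper turning a stateless FeatureTransforms transform
# into an MLJ-compatible unsupervised model.
mutable struct TransformWrapper{T<:FeatureTransforms.Transform} <: MMI.Unsupervised
    transform::T
end

# Nothing to learn for a stateless transform: the fitresult is the transform itself.
MMI.fit(model::TransformWrapper, verbosity, X) = (model.transform, nothing, NamedTuple())

# Delegate to FeatureTransforms' own apply verb on new data.
MMI.transform(::TransformWrapper, fitresult, Xnew) = FeatureTransforms.apply(Xnew, fitresult)

end # module
```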

rofinn avatar Dec 23 '21 20:12 rofinn

> FWIW, it looks like we could support that entire ecosystem by:
>
> 1. Depending only on MLJModelInterface.jl, which is a pretty minimal package
> 2. Putting the MLJModelInterface.Unsupervised wrappers in a separate submodule
>
> I don't think we'd need to change anything else about how our package works.

That would be awesome and significantly improve the Julia ML ecosystem!

PyDataBlog avatar Dec 23 '21 20:12 PyDataBlog

Just pointing out in case someone missed it: we addressed a couple of this package's design issues in a fresh new package called TableTransforms.jl, which supports composable, revertible pipelines: https://github.com/JuliaML/TableTransforms.jl
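For reference, the composable/revertible pipeline idea there looks roughly like this (a sketch assuming the `→` composition operator and the `apply`/`revert` verbs from the TableTransforms.jl README; the toy table is purely illustrative):

```julia
using TableTransforms

table = (a = [1.0, 2.0, 3.0], b = [10.0, 20.0, 30.0])

# compose transforms with → (\to<TAB>), then apply; the cache makes the pipeline revertible
pipeline = Center() → Scale()
newtable, cache = apply(pipeline, table)
original = revert(pipeline, newtable, cache)
```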

juliohm avatar Dec 23 '21 23:12 juliohm