Surrogates.jl
Gradient-enhanced surrogates
- [x] Gradient-enhanced kriging (https://smt.readthedocs.io/en/latest/_src_docs/surrogate_models/gekpls.html)
- [ ] Gradient-enhanced neural networks (https://smt.readthedocs.io/en/latest/_src_docs/surrogate_models/genn.html)
Nice, thanks. These could be good assignments for the MLH students.
I would like to work on Gradient Enhanced Kriging.
@ChrisRackauckas and @ranjanan - I've coded up a very rough version of GEKPLS as a bunch of functions here.
All of my code is a translation of the SMT Python code. It is not fully tested, and there are still some kinks and bugs that I'm working out, but I thought I'd share this early and get feedback :)
In the example that I have provided in the gist, the underlying function simply returns x1^2 + x2^2 + x3^2 (given an array with components x1, x2, and x3).
If this looks okay, I can begin to add it as a surrogate and begin refining and optimizing the code.
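As a sketch (function names here are illustrative, not taken from the gist), the test function above and its analytic gradient are the kind of value/gradient pairs a gradient-enhanced surrogate like GEKPLS trains on:

```julia
# Sphere test function: f(x) = x1^2 + x2^2 + x3^2
sphere(x) = sum(xi^2 for xi in x)

# Its analytic gradient: ∇f(x) = (2x1, 2x2, 2x3)
sphere_grad(x) = 2 .* x

x = [1.0, 2.0, 3.0]
sphere(x)        # 14.0
sphere_grad(x)   # [2.0, 4.0, 6.0]
```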
Apart from the above RFC, I also have a question:
I'm using ScikitLearn's PLS (`@sk_import cross_decomposition: PLSRegression`). This will require us to add the following line to the main Surrogates.jl file: `__precompile__(false)`
This is needed because of this issue in ScikitLearn.jl.
I hope this is okay?
It's a start. We won't want the final version to use ScikitLearn, as that would cause some packaging issues (PyCall is hard to build into sysimages, for example). But to get a working version, that's a good way to start: then add some tests and replace pieces one by one.
OK. I'll start cleaning this up and searching for an alternative to ScikitLearn's PLS. Thanks!
@ChrisRackauckas - I've created a draft pull request for GEKPLS with some basic tests added in. This still uses the scikit-learn PLS regressor, which I plan to replace. With regard to a Julia PLS regressor, there is one called PartialLeastSquaresRegressor.jl, but it has a few issues (e.g., it does not expose an `x_rotations` attribute, which is what we use from scikit-learn's PLS).
Hence, I'm now planning to write our own PLS function based on the scikit-learn PLS, taking only the parts that are needed for GEKPLS. Is this approach of writing our own PLS function okay?
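A minimal pure-Julia PLS that returns only the rotations matrix could be sketched via the NIPALS algorithm. This is a rough illustration under my own assumptions (the function name `pls_x_rotations` and its exact shape are hypothetical, not from the PR); it computes the same `x_rotations = W (PᵀW)⁻¹` quantity that scikit-learn's `PLSRegression` exposes:

```julia
using LinearAlgebra, Statistics

# Hypothetical sketch of a minimal PLS1 (single response) via NIPALS,
# keeping only the x-rotations matrix needed by GEKPLS.
function pls_x_rotations(X::AbstractMatrix, y::AbstractVector, ncomp::Int)
    Xc = X .- mean(X, dims=1)   # center predictors
    yc = y .- mean(y)           # center response
    p = size(Xc, 2)
    W = zeros(p, ncomp)         # x-weights
    P = zeros(p, ncomp)         # x-loadings
    for k in 1:ncomp
        w = Xc' * yc            # direction of max covariance with y
        w ./= norm(w)
        t = Xc * w              # scores
        pk = Xc' * t / dot(t, t)            # loadings
        Xc -= t * pk'                       # deflate X
        yc -= t * (dot(t, yc) / dot(t, t))  # deflate y
        W[:, k] = w
        P[:, k] = pk
    end
    return W * inv(P' * W)      # x_rotations, as in scikit-learn's PLSRegression
end
```

Since PᵀW is unit upper triangular for NIPALS, the inverse is well-defined as long as each extracted component is nondegenerate.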
That sounds great.