PySR
Partial derivative loss
Hi Miles, I learned that regression losses in PySR work on the distance between targets and predictions. However, when the underlying formula is complex, the regression results may end up overly complex or may not conform to the essential physical laws. In the paper "Distilling Free-Form Natural Laws from Experimental Data", the principle proposed by Michael Schmidt for identifying nontrivial equations is interesting. I wonder if this error principle could be added to PySR. If so, PySR would be more powerful, and better, more physically meaningful results could be obtained for some regressions. Thanks.
What specific part of their paper are you referring to? [for reference, their paper is my all-time favorite :) ]
Yeah, in Figure 2 of the paper, they select the best equations by comparing predicted partial derivatives with numerical partial derivatives. In PySR, the comparison is between predicted values and training-data values. So I wonder if this method could be added to PySR.
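For concreteness, a sketch of the criterion from the Schmidt & Lipson paper (from memory, for a system relating two variables $x$ and $y$): a candidate equation $f$ is scored by the mean log error between partial-derivative ratios estimated numerically from the data, $\delta x / \delta y$, and the ratios implied by the candidate through implicit differentiation, $\Delta x / \Delta y = (\partial f / \partial y) / (\partial f / \partial x)$:

$$\mathrm{score}(f) = -\frac{1}{N}\sum_{i=1}^{N}\log\!\left(1 + \left|\frac{\delta x_i}{\delta y_i} - \frac{\Delta x}{\Delta y}(x_i, y_i)\right|\right)$$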
Good idea. I was thinking about adding this but haven't yet. It would be really interesting to try. The fact that the backend of PySR is written in Julia makes this really easy, since it already has autodiff set up!
Hi Miles, I found some code in LossFunction.jl: "(prediction, completion) = differentiableEvalTreeArray(tree, dataset.X, options)". What I am interested in is whether it is used for the partial derivative loss. If it is, I would like to know how to use it. Thanks!
This should be doable now, with custom objective functions implemented: https://astroautomata.com/PySR/examples/#9-custom-objectives. (eval_diff_tree_array is the differential operator; see https://astroautomata.com/SymbolicRegression.jl/stable/api/#Derivatives.)
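As a minimal sketch, assuming the targets in dataset.y hold numerical partial derivatives with respect to the first feature (the objective name derivative_loss is made up here, and depending on your PySR version the keyword may be full_objective or loss_function):

```python
from pysr import PySRRegressor

# Julia objective passed to the backend. eval_diff_tree_array returns
# (prediction, derivative w.r.t. the chosen feature, completion flag).
derivative_objective = """
function derivative_loss(tree, dataset::Dataset{T,L}, options) where {T,L}
    # Differentiate the candidate expression with respect to feature 1.
    prediction, d_prediction, complete = eval_diff_tree_array(
        tree, dataset.X, options, 1
    )
    !complete && return L(Inf)  # reject expressions that produced NaN/Inf

    # Mean squared error between predicted partial derivatives and the
    # numerical partial derivatives stored in dataset.y (an assumption).
    return sum(abs2, d_prediction .- dataset.y) / dataset.n
end
"""

model = PySRRegressor(
    binary_operators=["+", "-", "*", "/"],
    full_objective=derivative_objective,  # `loss_function` in newer PySR
)
```

One could also combine this term with the usual value-based loss from eval_tree_array to fit both the function and its derivatives at once.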
Cheers, Miles