
Partial derivative loss

Open nice-mon opened this issue 4 years ago • 4 comments

Hi Miles, I understand that the regression losses in PySR work on the distance between targets and predictions. However, when the underlying formula is complex, the regression results may end up overly complex or fail to conform to the essential physical laws. In the paper "Distilling Free-Form Natural Laws from Experimental Data", Michael Schmidt proposes an interesting principle for identifying nontrivial equations. I wonder if this error principle could be added to PySR. If so, PySR would be even more powerful, and better, more physically meaningful results could be obtained for some regressions. Thanks.

nice-mon avatar Apr 25 '21 02:04 nice-mon

What specific part of their paper are you referring to? [For reference, their paper is my all-time favorite :) ]

MilesCranmer avatar May 04 '21 19:05 MilesCranmer

Yes. In Figure 2 of the paper, they select the best equations by comparing predicted partial derivatives with numerical partial derivatives estimated from the data. In PySR, predictions are instead compared directly with the training targets. So I wonder if this method could be added to PySR.

nice-mon avatar May 05 '21 12:05 nice-mon
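As an aside, the selection principle being discussed can be illustrated outside of PySR. The following is a minimal NumPy sketch (not PySR code): score a candidate expression by how well its partial derivative matches the derivative estimated numerically from the data, rather than by prediction error alone. The names `numerical_partial` and `derivative_loss` are illustrative, not from any library.

```python
import numpy as np

def numerical_partial(y, x):
    # Central-difference estimate of dy/dx at the sampled points.
    return np.gradient(y, x)

def derivative_loss(model_f, x, y_data, h=1e-5):
    """Mean squared error between the candidate model's derivative
    (via central differences) and the derivative estimated from data,
    in the spirit of Schmidt & Lipson's Figure 2."""
    dmodel_dx = (model_f(x + h) - model_f(x - h)) / (2 * h)
    ddata_dx = numerical_partial(y_data, x)
    return np.mean((dmodel_dx - ddata_dx) ** 2)

# Example: data sampled from y = x^2.
x = np.linspace(0.0, 1.0, 200)
y = x ** 2

loss_good = derivative_loss(lambda t: t ** 2, x, y)  # matching derivative
loss_bad = derivative_loss(lambda t: t ** 3, x, y)   # mismatched derivative
```

Here `loss_good` is near zero while `loss_bad` is not, so the derivative-based criterion distinguishes the two candidates even though both could be tuned to fit the raw values.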

Good idea. I was thinking about adding this but haven't yet. It would be really interesting to try. The fact that the backend of PySR is written in Julia makes this easy, since it already has autodiff set up!

MilesCranmer avatar May 05 '21 13:05 MilesCranmer

Hi Miles, I found this line in LossFunction.jl: `(prediction, completion) = differentiableEvalTreeArray(tree, dataset.X, options)`. Is it used for the partial derivative loss? If so, I would like to know how to use it. Thanks!

nice-mon avatar Jul 05 '21 08:07 nice-mon

This should be doable now that custom objective functions are implemented: https://astroautomata.com/PySR/examples/#9-custom-objectives. (`eval_diff_tree_array` is the differentiation routine; see https://astroautomata.com/SymbolicRegression.jl/stable/api/#Derivatives.)

Cheers, Miles

MilesCranmer avatar Mar 27 '23 23:03 MilesCranmer
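To make the pointer above concrete, here is a hedged sketch of how such a custom objective might look. Per the linked docs, PySR custom objectives are written as Julia code and passed to `PySRRegressor` via its `loss_function` parameter, and `eval_diff_tree_array(tree, X, options, direction)` returns the evaluation, the derivative along `direction`, and a completion flag. The function name `derivative_loss` and the assumption that `dataset.y` holds numerical partial derivatives of the data (rather than raw targets) are both illustrative choices, not the documented setup.

```python
# Sketch of a derivative-matching objective for PySR. The Julia string below
# would be passed to PySRRegressor(loss_function=...); it is not executed here.
derivative_objective = """
function derivative_loss(tree, dataset::Dataset{T,L}, options)::L where {T,L}
    # Evaluate the candidate tree and its partial derivative w.r.t. feature 1.
    prediction, dpred_dx, completed =
        eval_diff_tree_array(tree, dataset.X, options, 1)
    !completed && return L(Inf)  # penalize trees that fail to evaluate

    # Assumption for this sketch: dataset.y stores numerical partial
    # derivatives estimated from the data, so we compare derivatives directly.
    return sum(abs2, dpred_dx .- dataset.y) / length(dataset.y)
end
"""

# Hypothetical usage (requires the Julia backend, so left commented out):
# from pysr import PySRRegressor
# model = PySRRegressor(loss_function=derivative_objective)
```

The design choice here is to precompute numerical derivatives of the data once (e.g. with finite differences) and store them as the regression target, so the search compares derivative against derivative, as in Schmidt & Lipson's selection criterion.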