
Defining a custom Gaussian function

CarvFS opened this issue 1 year ago · 3 comments

Hello,

I want to add a Gaussian function to the function_set. I have done so in the same way as the other functions already defined (e.g., sigmoid):

def gaussian(x1):
    return np.exp(-np.power(x1, 2))

However, I would like to have it in the form

def gaussian(x1):
    return np.exp(-np.power(a*x1 - b, 2))

with a and b being adjustable parameters. Using mul, sub, and const in the function_set I would eventually get a Gaussian in this form, but is there a way of enforcing this form for all Gaussians in the expression?

Sincerely yours, Felipe Silva Carvalho

CarvFS · Sep 15 '23

Sorry I missed this earlier. This is an interesting question. I don't see a simple solution, but it's definitely doable by implementing custom versions of DSO core abstractions.

I think it depends on how you want to obtain values for a and b. Do you want them to be optimized with respect to the reward function every time your expression is evaluated (similarly to how const is optimized), or do you want them to be learned by the algorithm? The advantage of optimizing them is that you'll likely get better values, but it'll be much more computationally expensive, as there's an inner optimization loop every time you evaluate the reward of an expression with a Gaussian.

I think either way would take a bit of coding on top of the current code, but I think it would make for a nice addition. Inner-loop optimization could mimic the const token. The easiest way might be to treat the Gaussian as a unary function, then pre-process the traversal into one with two const tokens (which have already been implemented). For example, the expression [add, x1, gaussian, sin, x2] would be transformed to [add, x1, exp, mul, -1, n2, sub, mul, const, sin, x2, const], and then each const would be optimized.
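
If it helps to make that pre-processing idea concrete, here is a minimal sketch of the traversal rewrite in plain Python. The token names, the arity table, and the literal "-1" token are assumptions for illustration; the real DSO code stores arity on its Token objects and handles traversals differently:

# Assumed arities for the tokens in this sketch; real DSO stores arity on Token objects.
ARITY = {
    "add": 2, "sub": 2, "mul": 2, "sin": 1, "exp": 1, "n2": 1,
    "gaussian": 1, "const": 0, "-1": 0, "x1": 0, "x2": 0,
}

def subtree_end(traversal, start):
    """Return the index one past the subtree rooted at traversal[start]."""
    i, pending = start, 1
    while pending > 0:
        pending += ARITY[traversal[i]] - 1
        i += 1
    return i

def expand_gaussians(traversal):
    """Rewrite each unary 'gaussian' token as exp(-(const * arg - const)^2)."""
    out, i = [], 0
    while i < len(traversal):
        if traversal[i] == "gaussian":
            arg_start = i + 1
            arg_end = subtree_end(traversal, arg_start)
            arg = expand_gaussians(traversal[arg_start:arg_end])
            # exp(mul(-1, n2(sub(mul(const, arg), const)))) in prefix notation
            out += ["exp", "mul", "-1", "n2", "sub", "mul", "const"] + arg + ["const"]
            i = arg_end
        else:
            out.append(traversal[i])
            i += 1
    return out

# Example:
# expand_gaussians(["add", "x1", "gaussian", "sin", "x2"])
# -> ["add", "x1", "exp", "mul", "-1", "n2", "sub", "mul", "const", "sin", "x2", "const"]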

Learning the values might be trickier. You could think of the Gaussian as a ternary function whose arguments are x, a, and b, and then force a and b to be constants via a custom Prior. Another tricky part is that DSO currently only supports up to binary operators, so you'd have to either settle for learning one of the parameters, or enforce a nested binary structure using an extra function and a custom Prior. In that last case, a traversal could look like [Gaussian, x1, ForcedDoubleConstants, value_for_a, value_for_b].
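
To make the "force a and b to be constants" part concrete, here is a hypothetical sketch of the constraint such a custom Prior would encode. It does not use DSO's actual Prior base class (whose interface differs), and ForcedDoubleConstants is a made-up token name; the point is just that, whenever the parent token is the helper, every child other than const gets zero probability:

import numpy as np

# Toy token library for illustration only; real DSO priors operate on
# Library/Token objects and on partially sampled traversals.
LIBRARY = ["add", "mul", "sin", "gaussian", "ForcedDoubleConstants", "const", "x1"]
CONST_IDX = LIBRARY.index("const")
FORCED_IDX = LIBRARY.index("ForcedDoubleConstants")

def forced_constants_prior(parent_idx):
    """Additive log-prior over the next token, given the parent token.

    If the parent is ForcedDoubleConstants, every token except const is
    disallowed (log-prior of -inf); otherwise the prior is flat (all zeros).
    """
    prior = np.zeros(len(LIBRARY))
    if parent_idx == FORCED_IDX:
        prior[:] = -np.inf
        prior[CONST_IDX] = 0.0
    return prior

# Example: children of ForcedDoubleConstants can only be const.
# forced_constants_prior(FORCED_IDX) -> [-inf, -inf, -inf, -inf, -inf, 0., -inf]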

brendenpetersen · Dec 17 '23

A third way would be to use the poly token. This is optimized analytically, meaning it is very fast and doesn't require an inner optimization loop. The function [gaussian, poly] (while configuring poly to have max degree 1) would then get you what you want. If you don't want to allow [gaussian, x1] (i.e., you always want to learn the coefficients), then you can add a custom Prior to prevent that. This might be the best way, and it doesn't actually require changing the code (except for adding gaussian to function_set.py).

A slight downside is that the poly token won't work if the enclosing function is non-invertible, e.g. [sin, gaussian, poly] is not allowed.
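
For concreteness, the additions for this third way might look roughly like the sketch below. The gaussian definition mirrors the existing unary ops; the Token registration line and the config keys (function_set, poly_optimizer_params) follow the patterns in the repo's examples but may differ in your version, so treat them as assumptions to check rather than exact code:

import numpy as np

# New unary op, defined alongside the existing ones (e.g. sigmoid).
def gaussian(x1):
    return np.exp(-np.power(x1, 2))

# Registered like the other unary tokens; exact constructor arguments
# may differ in your version of the code:
# Token(gaussian, "gaussian", arity=1, complexity=4)

# In the regression task config, allow the new token together with poly,
# and restrict poly to degree 1 so [gaussian, poly] can fit exp(-(a*x - b)^2):
config_fragment = {
    "task": {
        "task_type": "regression",
        "function_set": ["add", "sub", "mul", "gaussian", "poly"],
        "poly_optimizer_params": {"degree": 1},
    }
}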

brendenpetersen · Dec 17 '23

Thank you very much for your answer! I will try the third way you mentioned first and see what happens.

CarvFS · Dec 18 '23