
incoherent documentation for SimulatedAnnealing

Open etiennedeg opened this issue 3 years ago • 4 comments

Hi,

From the documentation:

The constructor takes 3 keywords:

  • neighbor = a!(x_proposed, x_current), a mutating function of the current x, and the proposed x
  • T = b(iteration), a function of the current iteration that returns a temperature
  • p = c(f_proposal, f_current, T), a function of the current temperature, current function value and proposed function value that returns an acceptance probability
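A minimal sketch of what these three keywords could look like for a continuous problem (function names are hypothetical, and the acceptance is the classical Kirkpatrick-style rule; the signatures follow the documentation quoted above, not the implementation):

```julia
# neighbor = a!(x_proposed, x_current): mutate x_proposed in place
function neighbor!(x_proposed, x_current)
    for i in eachindex(x_current)
        x_proposed[i] = x_current[i] + randn()  # random Gaussian step
    end
    return x_proposed
end

# T = b(iteration): a logarithmic cooling schedule
temperature(iteration) = 1.0 / log(iteration + 1)

# p = c(f_proposal, f_current, T): Kirkpatrick-style acceptance probability
acceptance(f_proposal, f_current, T) =
    f_proposal <= f_current ? 1.0 : exp(-(f_proposal - f_current) / T)
```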

In the implementation, T is denoted temperature, p is not implemented, and the constructor accepts the keyword keep_best, which is not documented.

Should the p keyword be implemented?

etiennedeg avatar Jun 02 '21 12:06 etiennedeg

Huh, I totally failed to see that in the original PR https://github.com/JuliaNLSolvers/Optim.jl/pull/650/files . You're right, it's a mess. p could be added, yes, but it should be called proposal instead. Is it something you'd find useful? Would you want to take a stab at implementing it?

pkofod avatar Jun 08 '21 07:06 pkofod

I can do that when I have the time. I don't think proposal is the most suitable term; a better name would be something like probability_of_acceptance, or just acceptance_function, since this function can be deterministic. This is useful: Kirkpatrick is the historic scheme, but other acceptance functions can give better results (threshold, Tsallis...). In any case, I already adapted the code to suit my needs, as I have a discrete problem (which can be circumvented by passing floats), but there was no way to update the cost efficiently. (In my case, knowing the neighbor I generate, I can update the cost more efficiently than recomputing it from scratch.) I did not find any library offering a truly generic implementation of metaheuristics, especially for discrete problems.
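For illustration, the alternatives mentioned could look roughly like this (hypothetical sketches; the threshold rule is deterministic, and the Tsallis-style rule recovers the classical exponential acceptance as q approaches 1):

```julia
# Threshold accepting: deterministic, returns 0 or 1 as a probability
threshold_acceptance(f_proposal, f_current, T) =
    f_proposal - f_current < T ? 1.0 : 0.0

# Generalized (Tsallis-style) acceptance with shape parameter q
function tsallis_acceptance(f_proposal, f_current, T; q = 1.5)
    Δ = f_proposal - f_current
    Δ <= 0 && return 1.0
    base = 1 + (q - 1) * Δ / T
    return base <= 0 ? 0.0 : min(1.0, base^(-1 / (q - 1)))
end
```

Both fit the same c(f_proposal, f_current, T) shape as the documented p keyword, which is why a generic acceptance_function slot would cover them.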

etiennedeg avatar Jun 08 '21 08:06 etiennedeg

> In any case, I already adapted the code to suit my needs, as I have a discrete problem (which can be circumvented by passing floats), but there was no way to update the cost efficiently. (In my case, knowing the neighbor I generate, I can update the cost more efficiently than recomputing it from scratch). I did not find any library dealing with a very generic implementation of meta heuristics, especially for discrete problems.

Can you show me a description of your problem?

pkofod avatar Jun 09 '21 08:06 pkofod

> Can you show me a description of your problem?

I have two graphs G1 and G2 (with an equal number of vertices). I want to find a mapping between the vertices of G1 and the vertices of G2 minimizing a cost function (here, maximizing the number of common edges under the mapping). Calculating this cost function from scratch takes time proportional to the number of edges. In the vector I pass as an argument, the i-th value is the vertex of G2 associated with the i-th vertex of G1. I generate a neighbor by swapping two elements. I can update the cost efficiently knowing the vertices I swap; I just need to consider the neighbors (in the graph) of the swapped vertices.
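A minimal sketch of that incremental update, assuming adjacency sets for both graphs and a permutation vector for the mapping (all names hypothetical):

```julia
# perm[u] is the vertex of G2 mapped to vertex u of G1;
# adj1 and adj2 are adjacency sets (Vector{Set{Int}}) of G1 and G2.
function common_edges(adj1, adj2, perm)
    c = 0
    for u in eachindex(adj1), v in adj1[u]
        u < v && perm[v] in adj2[perm[u]] && (c += 1)
    end
    return c
end

# Only G1 edges incident to i or j change when perm[i], perm[j] are swapped,
# so the contribution of those edges is all we need to recompute.
function local_count(adj1, adj2, perm, i, j)
    c = 0
    for u in (i, j), v in adj1[u]
        (u == j && v == i) && continue   # count a possible edge (i, j) once
        perm[v] in adj2[perm[u]] && (c += 1)
    end
    return c
end

# Apply the swap and return the cost delta, without a full recount.
function swap!(adj1, adj2, perm, i, j)
    before = local_count(adj1, adj2, perm, i, j)
    perm[i], perm[j] = perm[j], perm[i]
    return local_count(adj1, adj2, perm, i, j) - before
end
```

The delta computation touches only deg(i) + deg(j) edges instead of all edges, which is the optimization the current SimulatedAnnealing interface cannot express.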

I understand that this library is targeted at continuous optimization, where such optimizations of the cost function evaluation are probably less common, so I don't want to twist the design of this library towards discrete optimization; the better option would probably be a standalone library with very generic metaheuristics.

I took a look at the code and I have a few questions: keep_best is not implemented (as noted by a comment), so we should either remove it (breaking?) or implement it (not breaking, since the current behavior is not the expected one).

If we implement the acceptance function as a parameter, temperature feels weird, because it is just an intermediate variable of the Kirkpatrick scheme: we compute the temperature from the iteration number, then we compute the acceptance probability. However, removing it is breaking. Also, if a user wants to change just the cooling factor, they need to reimplement the whole acceptance function. We can help by putting the default implementation of the acceptance function in the documentation, but still.
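To make the tension concrete, here is a hypothetical sketch of how keeping temperature separate lets a user change only the cooling factor via a closure, while the Kirkpatrick-style acceptance stays untouched (all names are assumptions, not the current Optim.jl API):

```julia
# Acceptance written once, independent of the cooling schedule
kirkpatrick(f_proposal, f_current, T) =
    f_proposal <= f_current ? 1.0 : exp(-(f_proposal - f_current) / T)

# Changing the cooling factor only means building a new schedule closure
geometric_cooling(t0, α) = iteration -> t0 * α^(iteration - 1)
```

With this split, `T = geometric_cooling(10.0, 0.95)` tweaks the schedule without touching the acceptance function; folding temperature into the acceptance function would force users to rewrite both.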

There are two possible designs for the acceptance function: we can either ask for the acceptance_probability, with the random number generated on the Optim.jl side (returning 0 or 1 for deterministic schemes), or ask for the acceptance decision itself (and let the user deal with random number generation). I'm more inclined toward the first option.
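The two designs could be contrasted like this (hypothetical sketch; neither function exists in Optim.jl):

```julia
# Option 1: the user supplies a probability; the RNG stays inside Optim.jl.
# A deterministic scheme simply returns 0.0 or 1.0.
accept_by_probability(p, f_proposal, f_current, T) =
    rand() < p(f_proposal, f_current, T)

# Option 2: the user supplies the full accept/reject decision,
# including any randomness they want.
accept_by_decision(a, f_proposal, f_current, T) =
    a(f_proposal, f_current, T)
```

Option 1 keeps random number generation (and any future rng keyword) under the library's control, which is one argument for preferring it.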

etiennedeg avatar Jun 09 '21 20:06 etiennedeg