Gradient-Free-Optimizers

add Random search parameter: replacement

SimonBlanke opened this issue 1 year ago • 6 comments

This issue aims to add a replacement parameter to the random search optimizer. From what I can tell, this would fundamentally change how random search works, so random search with replacement=False could require a separate implementation of the optimization class.

SimonBlanke avatar Dec 28 '24 16:12 SimonBlanke

Hi Simon, if this enhancement is still needed, I would like to work on it. Can you please assign this issue to me? Thanks

Mujeeb4 avatar Apr 25 '25 12:04 Mujeeb4

Hello @Mujeeb4,

yes this is still needed. I'll assign it to you :-)

This feature sounds simple at first, but it could be tricky. The current implementation is here: https://github.com/SimonBlanke/Gradient-Free-Optimizers/blob/master/src/gradient_free_optimizers/optimizers/core_optimizer/utils.py#L28 This corresponds to replacement=True: the random search can simply select the same position again later in the run.
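For illustration, a minimal, hedged sketch of what sampling with replacement amounts to (illustrative names only, not the library's actual move_random code): every call draws a position independently, so repeats are possible.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_position_with_replacement(dim_sizes):
    # Draw one index per dimension, independently of all previous calls,
    # so the same position can be returned again over the course of a run.
    return np.array([rng.integers(0, size) for size in dim_sizes])

# Example: a discrete search space with 10 x 5 x 20 = 1000 positions
print(random_position_with_replacement([10, 5, 20]))
```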

replacement=False would require a way to remember the visited positions. Off the top of my head, I can think of two ways to do this:

  • calculate a list of all possible positions in the search space, then select a random one and remove it (see the sketch after this list).
  • use the current implementation of move_random, but store the visited positions somewhere. If move_random returns an already visited position, keep drawing until move_random returns a new one.
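A hedged sketch of the first idea (hypothetical helper names, not part of the library): enumerate every position once, shuffle the pool, and pop from it. The memory cost grows with the product of the dimension sizes, which is the drawback noted below.

```python
import itertools
import random

def build_position_pool(dim_sizes):
    # Materialize every position of the discrete search space exactly once.
    positions = list(itertools.product(*(range(size) for size in dim_sizes)))
    random.shuffle(positions)
    return positions

pool = build_position_pool([10, 5, 20])  # 1000 tuples held in memory
next_position = pool.pop()               # sampled without replacement
```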

Both solutions would have disadvantages:

  • The first one can blow up your memory if the search space gets really large (a known problem for SMBO). You could use sampling to solve this, but that would defeat the purpose of the replacement parameter.
  • The second one could get stuck in its while loop (redrawing already visited positions) for a long time if most of the search space has already been visited.

I would probably go for solution 2, since it is similar to how we already get a new position when the proposed position is inside a restricted zone (constrained optimization).
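A hedged sketch of solution 2, assuming a move_random-like callable that returns a numpy position array (the names are illustrative, not the library's actual API). The redraw pattern mirrors how positions inside a restricted zone are rejected and redrawn in constrained optimization.

```python
import numpy as np

rng = np.random.default_rng(0)
visited = set()

def move_random(dim_sizes):
    # Stand-in for the library's move_random: one random index per dimension.
    return np.array([rng.integers(0, size) for size in dim_sizes])

def move_random_without_replacement(dim_sizes):
    # Redraw until an unvisited position appears; as noted above, this loop
    # can run for a long time once most of the search space has been visited.
    while True:
        pos = move_random(dim_sizes)
        key = tuple(pos.tolist())
        if key not in visited:
            visited.add(key)
            return pos
```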

But I would suggest that you explore possible solutions yourself first. Maybe you'll find a new trick for this :-)

SimonBlanke avatar Apr 25 '25 13:04 SimonBlanke

Thank you for the context and suggestions!

I'll take a detailed look at the current implementation and explore potential solutions. If I discover a better approach, or find that one of your proposed methods is the most suitable, I'll proceed accordingly and keep you updated.

Looking forward to working on this—excited for the challenge!

Mujeeb4 avatar Apr 25 '25 13:04 Mujeeb4

Hi @SimonBlanke, I have reviewed the implementation and explored possible approaches in detail. I will proceed with the following solution:

  • Introduce a replacement parameter in the optimizer.
  • Keep the current move_random function unchanged for simplicity and consistency.
  • When replacement=False, maintain a set of visited positions and resample until a new position is found, with a retry limit to avoid infinite loops.
  • Include logging and safety checks to ensure robustness and user clarity.

This approach keeps memory overhead low, preserves backward compatibility, and integrates cleanly with the current structure. I'll start implementing it and keep you posted on the progress!
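A rough sketch of the retry-limited variant described in the plan above (illustrative names and a hypothetical max_retries parameter, not the final implementation): after a fixed number of misses it accepts a repeat and warns, instead of looping forever.

```python
import warnings

import numpy as np

rng = np.random.default_rng(0)
visited = set()

def move_random(dim_sizes):
    return np.array([rng.integers(0, size) for size in dim_sizes])

def move_random_no_replacement(dim_sizes, max_retries=100):
    for _ in range(max_retries):
        pos = move_random(dim_sizes)
        key = tuple(pos.tolist())
        if key not in visited:
            visited.add(key)
            return pos
    # Safety fallback: accept a repeated position and warn, rather than
    # stalling the optimization run in an endless resampling loop.
    warnings.warn("No unvisited position found within the retry limit.")
    return pos
```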

Mujeeb4 avatar Apr 26 '25 12:04 Mujeeb4

@Mujeeb4
If you like, you can try again to implement this feature. Beforehand, it is important to familiarize yourself with the structure of the optimization classes. Please let me know how you want to proceed.

SimonBlanke avatar May 05 '25 08:05 SimonBlanke

Hi @SimonBlanke, thank you for the opportunity; I'd be glad to proceed. I've been occupied with some university commitments lately, which limited my availability.

Regarding the previous implementation, I initially thought the feature was expected to be implemented within that specific file, which is why I placed it there. I now understand the need to work within the overall structure, and I'll make sure to familiarize myself with the optimization classes before submitting another PR. Thanks

Mujeeb4 avatar May 05 '25 09:05 Mujeeb4