
ADAM bug when calculating the gradient in batches


Environment

  • Qiskit Algorithms version: 1.0.1
  • Python version: 3.11
  • Operating system: Linux

What is happening?

There is a small error that prevents ADAM from calculating gradient_num_diff in batches via max_evals_grouped. ADAM passes only "fun, self._eps" as arguments when calling gradient_num_diff, so max_evals_grouped is left at its default of None, which gradient_num_diff then replaces with 1. Therefore, regardless of any call to set_max_evals_grouped, ADAM always runs with max_evals_grouped=1. The call site in ADAM's minimize method:

[screenshot: the gradient_num_diff call in ADAM's minimize method]
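For reference, a paraphrase of that call site (abbreviated from the linked adam_amsgrad.py, so the exact surrounding code may differ):

```python
# In ADAM.minimize: the numerical-gradient fallback is built without
# forwarding self._max_evals_grouped, so batching is never enabled.
if jac is None:
    jac = Optimizer.wrap_function(Optimizer.gradient_num_diff, (fun, self._eps))
```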

The gradient_num_diff method, inherited from the Optimizer base class, where the missing argument falls back to its default:

[screenshot: the max_evals_grouped default handling in Optimizer.gradient_num_diff]
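The relevant part of the definition (paraphrased from the linked optimizer.py, with the rest of the body omitted):

```python
@staticmethod
def gradient_num_diff(x_center, f, epsilon, max_evals_grouped=None):
    # When max_evals_grouped is not supplied it falls back to 1,
    # i.e. one function evaluation at a time.
    if max_evals_grouped is None:
        max_evals_grouped = 1
    ...
```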

Here are the corresponding files:

  • https://qiskit-community.github.io/qiskit-algorithms/_modules/qiskit_algorithms/optimizers/optimizer.html#Optimizer
  • https://qiskit-community.github.io/qiskit-algorithms/_modules/qiskit_algorithms/optimizers/adam_amsgrad.html#ADAM

How can we reproduce the issue?

Create an ADAM optimizer and call set_max_evals_grouped on it with any limit. Then call minimize: you will not notice any change in runtime/CPU usage, because the gradient is still evaluated one point at a time. A minimal script showing the symptom is below.
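The quadratic objective here is a hypothetical stand-in for an expensive cost function; only the call pattern matters:

```python
import numpy as np
from qiskit_algorithms.optimizers import ADAM

def objective(x):
    # Hypothetical cheap stand-in for an expensive cost function.
    return float(np.sum(np.asarray(x) ** 2))

adam = ADAM(maxiter=10)
adam.set_max_evals_grouped(25)  # request batched gradient evaluation
result = adam.minimize(fun=objective, x0=np.zeros(4))
print(result.x)
# Because self._max_evals_grouped is never forwarded, the numerical
# gradient is still computed one evaluation at a time, and runtime/CPU
# usage does not change no matter what limit is set above.
```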

What should happen?

With a limit set via set_max_evals_grouped, ADAM should evaluate the numerical gradient in batches of up to that many points. I could quickly fix the error by adding the missing argument, as sketched below.
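A sketch of that one-line fix (paraphrased; the exact wrapper call in adam_amsgrad.py may differ slightly):

```python
# Forward the configured batching limit so gradient_num_diff can
# group function evaluations.
if jac is None:
    jac = Optimizer.wrap_function(
        Optimizer.gradient_num_diff,
        (fun, self._eps, self._max_evals_grouped),
    )
```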

Any suggestions?

No response

proeseler · May 16 '24