
NotImplementedError: Eager execution currently not supported for SGLD optimizer.

Open shashankg7 opened this issue 5 years ago • 8 comments

I am trying to run the SGLD example in eager mode, but I am getting the following error:

NotImplementedError                       Traceback (most recent call last)
<ipython-input-20-f65345166c63> in <module>()
      4     preconditioner_decay_rate=0.99,
      5     burnin=1500,
----> 6     data_size=num_samples)

/usr/local/lib/python3.6/dist-packages/tensorflow_probability/python/optimizer/sgld.py in __init__(self, learning_rate, preconditioner_decay_rate, data_size, burnin, diagonal_bias, name, parallel_iterations)
    161     ]):
    162       if tf.executing_eagerly():
--> 163         raise NotImplementedError('Eager execution currently not supported for '
    164                                   ' SGLD optimizer.')
    165 

NotImplementedError: Eager execution currently not supported for  SGLD optimizer.


Seems like eager mode is not currently supported for SGLD. Any idea when it will be supported?

shashankg7 avatar Oct 11 '19 02:10 shashankg7

For now, you could call it within a @tf.function-decorated callable.
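
For example, something along these lines (an untested sketch; the toy quadratic loss, data_size=1, and the step count are illustrative placeholders, not part of TFP's API):

import tensorflow as tf
import tensorflow_probability as tfp

w = tf.Variable(5.)

@tf.function
def run_sgld(num_steps=100):
  # Inside tf.function, tf.executing_eagerly() is False, so the check in
  # SGLD's __init__ does not raise.
  opt = tfp.optimizer.StochasticGradientLangevinDynamics(
      learning_rate=1e-2, data_size=1)
  for _ in range(num_steps):  # Python loop, unrolled at trace time.
    with tf.GradientTape() as tape:
      loss = tf.square(w - 3.)
    grads = tape.gradient(loss, [w])
    opt.apply_gradients(zip(grads, [w]))
  return loss

final_loss = run_sgld()  # Optimizer variables are created on the first (and only) trace.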

Brian Patton | Software Engineer | [email protected]


brianwa84 avatar Oct 11 '19 12:10 brianwa84

I tried that too, but it gives another error: ValueError: tf.function-decorated function tried to create variables on non-first call.

shashankg7 avatar Oct 17 '19 03:10 shashankg7

@shashankg7, is the example that you are referring to Stochastic Gradient Langevin Dynamics, or something else? And were you able to resolve the issue by wrapping the SGLD optimizer in a tf.function?

Regardless, this example should be updated for TF 2.0; will tag this as an enhancement opportunity. Thank you for reporting it!

dynamicwebpaige avatar Nov 09 '19 17:11 dynamicwebpaige

Is this fixed? I'm still struggling to use SGLD with tf.function.

import tensorflow as tf
import tensorflow_probability as tfp

# model, loss_fn, and train_dataset are defined elsewhere.
@tf.function
def train():
  optimizer = tfp.optimizer.StochasticGradientLangevinDynamics(
      learning_rate=1e-3,
      )

  for epoch in range(15):
    for x, y in train_dataset:
      with tf.GradientTape() as tape:
        y_hat = model(x)
        loss = loss_fn(y, y_hat)
      gradients = tape.gradient(loss, model.trainable_variables)
      optimizer.apply_gradients(zip(gradients, model.trainable_variables))

I got the following error: ValueError: tf.function-decorated function tried to create variables on non-first call.

Ideally, I would like to use SGLD with the Keras .fit() API, but that doesn't work either, giving the error: NotImplementedError: Eager execution currently not supported for SGLD optimizer.

iShohei220 avatar Jun 12 '20 05:06 iShohei220

https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/optimizer/sgld.py#L273-L280 I read the implementation of SGLD and understand why eager mode is not supported. However, based on the original pSGLD paper, the calculation of preconditioner_grads (Gamma in the paper) can be omitted because its effect is sufficiently small (see the right half of page 4 of https://arxiv.org/pdf/1512.07666.pdf). Many other implementations omit the calculation of preconditioner_grads (e.g., https://pysgmcmc.readthedocs.io/en/pytorch/_modules/pysgmcmc/optimizers/sgld.html). So I propose omitting it, in the hope that eager mode can then be supported for the SGLD optimizer.
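
For reference, a rough sketch of the pSGLD update with the Gamma(theta) correction term dropped (this is not TFP's API; the function name, hyperparameter names, and defaults are illustrative):

import tensorflow as tf

def psgld_step(theta, grad, v, epsilon=1e-3, alpha=0.99, lambda_=1e-5):
  """One pSGLD step without the Gamma(theta) correction term.

  theta and v are tf.Variables of the same shape; grad is the stochastic
  gradient of the (scaled) negative log posterior at theta.
  """
  # RMSProp-style second-moment estimate used to build the preconditioner.
  v.assign(alpha * v + (1. - alpha) * tf.square(grad))
  g = 1. / (lambda_ + tf.sqrt(v))  # diagonal preconditioner G(theta)
  # Injected noise with covariance epsilon * G(theta).
  noise = tf.random.normal(tf.shape(theta)) * tf.sqrt(epsilon * g)
  # theta <- theta - (epsilon / 2) * G * grad + noise; Gamma(theta) is omitted.
  theta.assign_sub(0.5 * epsilon * g * grad - noise)
  return theta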

iShohei220 avatar Jun 30 '20 03:06 iShohei220

Following for reach!

Jordy-VL avatar Apr 21 '21 13:04 Jordy-VL

Any update on this?

jeffmak avatar Apr 09 '23 12:04 jeffmak

any update?

jimmykimmy68 avatar May 08 '23 13:05 jimmykimmy68