Using the same seed kwarg returns different values between GlorotUniform and GlorotUniformV2
System information.
- Have I written custom code (as opposed to using a stock example script provided in Keras): Yes
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 18.04.5 LTS
- TensorFlow installed from (source or binary): Pip (binary)
- TensorFlow version (use command below): v2.6.0-0-g919f693420e 2.6.0
- Python version: 3.7.12
- Bazel version (if compiling from source):
- GCC/Compiler version (if compiling from source):
- CUDA/cuDNN version: NA
- GPU model and memory: NA
- Exact command to reproduce: Colab notebook
Describe the problem. The random values generated by GlorotUniform differ between the V1 and V2 APIs even when the same operation seed is used. As a consequence, it is not possible to reproduce in V2 the exact results obtained with the V1 version.
Describe the current behavior.
Setting the operation seed returns different tensors from GlorotUniform depending on whether it is imported from tf.compat.v1 or from tensorflow.keras.initializers.
Describe the expected behavior. Both tensors should be equal.
- Do you want to contribute a PR? (yes/no): No
- If yes, please read this page for instructions
- Briefly describe your candidate solution(if contributing): NA
Standalone code to reproduce the issue. Colab link
Source code / logs. NA
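The linked Colab is not reproduced here, so below is a minimal sketch of the reported behavior, assuming TF 2.6-era APIs (tf.compat.v1.keras.initializers.glorot_uniform for the V1 path and tf.keras.initializers.GlorotUniform for the V2 path); per the report, the two draws differ despite the identical seed.

```python
import numpy as np
import tensorflow as tf

shape = (3, 3)

# V1-style initializer (legacy stateful RNG path)
v1_init = tf.compat.v1.keras.initializers.glorot_uniform(seed=42)
v1_values = np.asarray(v1_init(shape))

# V2 initializer (generator/stateless RNG path)
v2_init = tf.keras.initializers.GlorotUniform(seed=42)
v2_values = np.asarray(v2_init(shape))

# Despite the identical seed, the two tensors differ on the
# reporter's TF 2.6 setup.
print(np.allclose(v1_values, v2_values))
```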
P.S.: Submitting this bug in the Keras repo due to comments in the TF repo, specifically: comment in issue 52294 in TF
@sachinprasadhs Was able to reproduce the issue on Colab using TF v2.6.0. Please find the gist here for reference. Thank you!
Is this the old https://stackoverflow.com/questions/61960609/why-is-rng-different-for-tensorflow-2-and-1 ?
This SO article (which I included in the gist) tells us that including the operation seed causes the backend to call a different RNG function.
The problem is that GlorotUniform generates completely different values between the TF1 and TF2 versions, even when using the same global and operation seeds. Even if the backend is different, the result should be the same once I set both the global and operation seeds in each case.
If I'm migrating from TF1 to TF2, this behavior is unexpected and might lead me to think that my migration went wrong.
Yes, in TF2 random_uniform is conditional: https://github.com/keras-team/keras/blob/5817ec5745a844a924065c19f4a9ee1cc4a5c66e/keras/backend.py#L1825-L1833
You can reproduce the same behavior in TF2 with keras.backend.disable_generator_for_rng()
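For illustration, here is a self-contained paraphrase of the conditional in the linked keras/backend.py lines, not the exact source: the flag name _USE_GENERATOR is hypothetical, but it stands in for the internal switch that routes random_uniform to a tf.random.Generator (the V2 path) or to the legacy stateful op (the V1 path).

```python
import tensorflow as tf

# Hypothetical flag standing in for Keras' internal generator switch.
_USE_GENERATOR = True

def random_uniform(shape, minval=0.0, maxval=1.0, seed=None):
    """Sketch of the two RNG paths behind keras.backend.random_uniform."""
    if _USE_GENERATOR and seed is not None:
        # V2 path: Philox-based tf.random.Generator seeded per call.
        return tf.random.Generator.from_seed(seed).uniform(
            shape, minval=minval, maxval=maxval)
    # V1 path: legacy stateful op, which combines the global and op seeds.
    return tf.random.uniform(shape, minval=minval, maxval=maxval, seed=seed)
```

The two branches draw from different algorithms, which is why the same seed yields different values depending on which path is taken.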
@qlzh727 This symbol is not exported right? So isn't available in tensorflow.keras but only in keras right?
Hi @fernandobperezm, this is expected behavior for the initializers themselves. So it's not a bug in our view, especially since the random seed is not part of the official public API. However, your use case is an important one and we'll look into providing a workaround. @tomerk, any thoughts on this? Is there a known workaround, or can we provide one for migrating users?
As I've mentioned, keras.backend.disable_generator_for_rng is not part of the public API exposed by from tensorflow import keras, but it is still reachable via import keras.
However, if we search the source code, it is not used anywhere.
disable_generator_for_rng is not intended for public API usage; it is only a flag that hides a feature still under development. We will turn the feature on when it is ready, and for now the flag exists for temporary/debugging purposes.
import keras doesn't give you the public API; it directly accesses the Keras package's Python code. Users should always prefer from tensorflow import keras.
Yes, but currently I think it is the only workaround we have, even if it is not officially supported.
Quick note: We're going to check the viability of including this in the migration correctness validation tooling that is already intended to aid migrating random number generation between tf1 and tf2: https://www.tensorflow.org/guide/migrate/validate_correctness
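The linked guide describes a DeterministicRandomTestTool for exactly this kind of check. A sketch of its usage is below, following the guide's example pattern; the class name, mode argument, and scope() API are taken from that page, so treat the exact details as an assumption and verify against the linked documentation.

```python
import tensorflow as tf

# In 'num_random_ops' mode, each scope replays the same per-op seed
# sequence, so stateful random ops match across the two executions.
random_tool = tf.compat.v1.keras.utils.DeterministicRandomTestTool(
    mode='num_random_ops')
with random_tool.scope():
    a = tf.random.uniform(shape=(3, 1))
    b = tf.random.uniform(shape=(3, 3))

# Re-running under a fresh tool with the same settings reproduces
# the same draws, which is what migration validation relies on.
random_tool = tf.compat.v1.keras.utils.DeterministicRandomTestTool(
    mode='num_random_ops')
with random_tool.scope():
    c = tf.random.uniform(shape=(3, 1))
    d = tf.random.uniform(shape=(3, 3))
```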
Hi all, these weeks I've been super busy and haven't had time to look at the updates on this issue.
Yes in TF2
random_uniform is conditional: https://github.com/keras-team/keras/blob/5817ec5745a844a924065c19f4a9ee1cc4a5c66e/keras/backend.py#L1825-L1833
You can reproduce the same behavior in TF2 with
keras.backend.disable_generator_for_rng()
Thanks! I'll have a look at that. On the other hand, this made me think about a possible problem to solve: is there any way to make tf.random.Generator.from_seed(...).uniform(...) return the same random numbers as tf.random.uniform(...)? Are the underlying algorithms different?
Thanks for considering my use case. It seems a little odd that random seeds are not part of the official public API of Keras when TensorFlow has them in its public API (I know Keras is different from TF, but from the docs' perspective they're the same). Is there any way I can help provide a workaround?
How do you plan to include this in the migration tutorial? I may try the same steps as long as they do not involve tf-nightly (sadly, I cannot use unreleased TF versions in my pipeline).
tf.random.Generator uses stateless ops, which are very different from tf.random.uniform, so it is not possible to produce the same result with the same seed.
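The difference above can be seen directly; this is a minimal sketch, assuming TF 2.x, comparing a draw from the legacy stateful op with one from a Philox-based generator seeded with the same value.

```python
import tensorflow as tf

tf.random.set_seed(42)
# Legacy stateful kernel: combines the global seed with the op seed.
stateful = tf.random.uniform((2, 2), seed=42)

# Generator path: counter-based (Philox) stateless kernels under the hood.
stateless = tf.random.Generator.from_seed(42).uniform((2, 2))

# Different algorithms, so the draws differ even though every seed
# involved is 42.
print(bool(tf.reduce_all(tf.equal(stateful, stateless))))  # prints False
```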
I will let @tomerk comment further, since we are working on a migration tool to help users get fixed-seed behavior.