framework-reproducibility
                        garder14/byol-tensorflow2 (batch-norm & softmax/cross-entropy)
Running TF 2.4.1 with seeds and envs set, I'm getting different results on each run for this repo:
https://github.com/garder14/byol-tensorflow2
I currently suspect it's the gradient tape. Not sure how to handle that. Would downgrading TF version help?
Thoughts welcome.
Sorry for the delay in responding, Phil; I was on vacation.
I have not run this code or gotten into debugging it. Just from looking at it, I can see a couple of likely sources of nondeterminism:

- `tf.keras.layers.BatchNormalization` is instantiated in five places in `models.py`. This layer uses fused batch-norm functionality, which is nondeterministic when used for fine-tuning. I don't know under exactly what circumstances that is exposed by the Keras layer and, since I wasn't aware of this exposure until now, I have yet to document it.
- `tf.nn.sparse_softmax_cross_entropy_with_logits` is used in `linearevaluation.py` on the output of the ClassificationHead. This op will introduce nondeterminism, and there is a work-around for it.
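The usual work-around for the cross-entropy op is to decompose the fused kernel into a log-softmax followed by a gather of the true-class log-probability, which avoids the nondeterministic back-prop path. A sketch of that math, illustrated with NumPy (the function name is mine; in TF one would build the same thing from `tf.nn.log_softmax`):

```python
import numpy as np

def sparse_softmax_xent(logits, labels):
    """Decomposed sparse softmax cross-entropy: log-softmax + label gather.

    Mirrors the work-around of replacing a fused
    sparse_softmax_cross_entropy_with_logits call with an explicit
    log-softmax and a per-example lookup of the true class.
    """
    # Numerically stable log-softmax: subtract the row max first.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # Cross-entropy is minus the log-probability of the true class.
    return -log_probs[np.arange(len(labels)), labels]

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.0]])
labels = np.array([0, 1])
print(sparse_softmax_xent(logits, labels))
```

The decomposed version gives the same loss values as the fused op; only the kernel used for the gradient changes.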
Answering your specific questions/comments:
- "Would downgrading TF version help?": No, and downgrading is very unlikely to ever help. We're trying hard to avoid regressions regarding determinism.
- "I currently suspect it's the gradient tape": The gradient tape just means the nondeterminism appears somewhere in the backprop. Both of the above-mentioned sources would inject noise into the backprop.
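For reference, the baseline determinism setup I'd expect for TF 2.4 looks something like the following configuration sketch (the seed value is a placeholder; note the env var must be set before TensorFlow executes any ops):

```python
import os
# Enable deterministic GPU kernels where available (TF 2.1+).
os.environ["TF_DETERMINISTIC_OPS"] = "1"

import random
import numpy as np
import tensorflow as tf

SEED = 42  # placeholder seed
random.seed(SEED)       # Python-level RNG
np.random.seed(SEED)    # NumPy RNG (e.g. data shuffling)
tf.random.set_seed(SEED)  # TF graph- and op-level seeds
```

This removes run-to-run variation from random initialization and shuffling, but it does not fix ops whose kernels are themselves nondeterministic, which is why the two sources above still matter.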