tf_unet
Why are there two weight initializers?
In layers.py, lines 24 to 29, these two functions are exactly the same. Why do they both exist?

```python
def weight_variable(shape, stddev=0.1):
    initial = tf.truncated_normal(shape, stddev=stddev)
    return tf.Variable(initial)

def weight_variable_devonc(shape, stddev=0.1):
    return tf.Variable(tf.truncated_normal(shape, stddev=stddev))
```
Yes, you're right: the two functions are identical (one just skips the intermediate variable) and could be refactored into a single initializer.
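For reference, the initialization both functions perform can be sketched in plain NumPy. `tf.truncated_normal` draws from a normal distribution and re-draws any sample more than two standard deviations from the mean; the sketch below mimics that behaviour so the deduplicated function is easy to reason about (the rejection loop is an illustration of the documented truncation rule, not the library's actual implementation):

```python
import numpy as np

def weight_variable(shape, stddev=0.1):
    """Single initializer replacing both duplicates: samples a
    truncated normal, re-drawing values beyond 2 * stddev, which is
    the truncation rule tf.truncated_normal documents."""
    rng = np.random.default_rng()
    vals = rng.normal(0.0, stddev, size=shape)
    # re-draw any sample farther than 2 standard deviations from the mean
    mask = np.abs(vals) > 2 * stddev
    while mask.any():
        vals[mask] = rng.normal(0.0, stddev, size=int(mask.sum()))
        mask = np.abs(vals) > 2 * stddev
    return vals

# Shape matches a typical conv kernel in the U-Net: 3x3, 1 in-channel, 16 filters.
w = weight_variable((3, 3, 1, 16))
print(w.shape)                        # (3, 3, 1, 16)
print(bool(np.abs(w).max() <= 0.2))  # True: all values within 2 * stddev
```

With a single function like this, `weight_variable_devonc` becomes a plain alias (or disappears entirely), and the deconvolution layers call the same initializer as the convolution layers.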