
Why are there two weight initializers?

Open wagenrace opened this issue 8 years ago • 1 comments

Within `layers`, lines 24 to 29, these two functions are exactly the same. Why do they both exist?

```python
def weight_variable(shape, stddev=0.1):
    initial = tf.truncated_normal(shape, stddev=stddev)
    return tf.Variable(initial)

def weight_variable_devonc(shape, stddev=0.1):
    return tf.Variable(tf.truncated_normal(shape, stddev=stddev))
```

wagenrace avatar Jun 14 '17 12:06 wagenrace

Yes, you're right, this could be refactored.

jakeret avatar Jun 15 '17 19:06 jakeret
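
The refactor jakeret agrees with above could be as simple as keeping one function and aliasing the other name to it, so existing call sites keep working. A minimal sketch of the pattern (using a plain-Python stand-in for `tf.truncated_normal` rather than the real TensorFlow API, which is an assumption made here purely to keep the example self-contained):

```python
import random


def truncated_normal(shape, stddev):
    """Plain-Python stand-in for tf.truncated_normal: draws Gaussian
    samples and resamples any value falling beyond 2 standard deviations."""
    def sample():
        while True:
            v = random.gauss(0.0, stddev)
            if abs(v) <= 2 * stddev:
                return v
    n = 1
    for dim in shape:
        n *= dim
    return [sample() for _ in range(n)]


def weight_variable(shape, stddev=0.1):
    # Single initializer; in the real code this would wrap the
    # tensor in tf.Variable(...) as both originals do.
    return truncated_normal(shape, stddev)


# Instead of a second, byte-for-byte identical body, the deconvolution
# variant just aliases the same function:
weight_variable_devonc = weight_variable
```

The alias preserves the public name `weight_variable_devonc`, so the deconvolution layers that call it need no changes while the duplicated body disappears.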