AttributeError: 'NoneType' object has no attribute 'name'
When I run python 06_modern_convnet.py, I get this error. I don't understand what happened; I was just testing the batch normalization layer. Please help me. Thanks!
The error appears in:
File "/home/jaychao/code/tensorflow_tutorials/python/libs/batch_norm.py", line 58, in batch_norm
lambda: (ema_mean, ema_var))
I think the reason is a TensorFlow version difference!
Got the same thing! Does anyone know how to port it over? I tried the official migration tool on the libs folder with no luck.
I am also facing the same error. Please help to resolve the issue.
Hi, I also encountered this problem. I solved it by moving line 53,
ema_apply_op = ema.apply([batch_mean, batch_var])
so that it comes right after line 42,
ema = tf.train.ExponentialMovingAverage(decay=0.9)
The code then becomes:
import tensorflow as tf

def batch_norm(x, phase_train, scope='bn', affine=True):
    """Batch normalization on convolutional maps.

    From: https://stackoverflow.com/questions/33949786/how-could-i-use-batch-normalization-in-tensorflow
    Only modified to infer shape from input tensor x.

    Parameters
    ----------
    x
        Tensor, 4D BHWD input maps
    phase_train
        boolean tf.Variable, true indicates training phase
    scope
        string, variable scope
    affine
        whether to affine-transform outputs

    Return
    ------
    normed
        batch-normalized maps
    """
    with tf.variable_scope(scope):
        shape = x.get_shape().as_list()

        beta = tf.Variable(tf.constant(0.0, shape=[shape[-1]]),
                           name='beta', trainable=True)
        gamma = tf.Variable(tf.constant(1.0, shape=[shape[-1]]),
                            name='gamma', trainable=affine)

        batch_mean, batch_var = tf.nn.moments(x, [0, 1, 2], name='moments')
        ema = tf.train.ExponentialMovingAverage(decay=0.9)
        # Register the moments with the EMA before calling average(),
        # otherwise average() returns None.
        ema_apply_op = ema.apply([batch_mean, batch_var])
        ema_mean, ema_var = ema.average(batch_mean), ema.average(batch_var)

        def mean_var_with_update():
            """Update the moving averages, then return the batch moments."""
            with tf.control_dependencies([ema_apply_op]):
                return tf.identity(batch_mean), tf.identity(batch_var)

        mean, var = tf.cond(phase_train,
                            mean_var_with_update,
                            lambda: (ema_mean, ema_var))

        normed = tf.nn.batch_norm_with_global_normalization(
            x, mean, var, beta, gamma, 1e-3, affine)
        return normed
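The reordering matters because ema.average() only returns a value for things already registered with ema.apply(); in the original order, ema_mean and ema_var come back as None, and None.name is exactly the reported AttributeError. With the fixed batch_norm above defined (or imported), here is a minimal usage sketch, assuming TensorFlow 1.x graph mode; the shapes and feed values are illustrative, not taken from the tutorial:

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 8, 8, 3])
phase_train = tf.placeholder(tf.bool, name='phase_train')
normed = batch_norm(x, phase_train, scope='bn_demo')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(4, 8, 8, 3).astype(np.float32)
    # True runs the branch that also updates the moving averages;
    # False normalizes with the stored averages instead of the batch moments.
    print(sess.run(normed, {x: batch, phase_train: True}).shape)
    print(sess.run(normed, {x: batch, phase_train: False}).shape)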
Comment out these two lines:
from libs.batch_norm import batch_norm
from libs.activations import lrelu
and you can run the code in "06_modern_convnet.ipynb".
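If the notebook still calls lrelu and batch_norm once those imports are gone, local stand-ins are needed, e.g. one of the fixed batch_norm versions from this thread plus a leaky ReLU. A minimal lrelu sketch (the 0.2 leak and the tf.maximum formulation are assumptions, not necessarily the tutorial's exact implementation):

import tensorflow as tf

def lrelu(x, leak=0.2, name='lrelu'):
    """Leaky ReLU: identity for positive inputs, scales negatives by `leak`."""
    return tf.maximum(x, leak * x, name=name)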
I also encountered this problem, and I rewrote the batch_norm() function in libs/batch_norm.py:
"""Batch Normalization for TensorFlow. Parag K. Mital, Jan 2016."""
import tensorflow as tf
def batch_norm(x, phase_train, scope='bn', affine=True):
    """Batch normalization on convolutional maps.

    From: https://stackoverflow.com/questions/33949786/how-could-i-use-batch-normalization-in-tensorflow
    Only modified to infer shape from input tensor x.

    Parameters
    ----------
    x
        Tensor, 4D BHWD input maps
    phase_train
        boolean tf.Variable, true indicates training phase
    scope
        string, variable scope
    affine
        whether to affine-transform outputs

    Return
    ------
    normed
        batch-normalized maps
    """
    with tf.variable_scope(scope):
        shape = x.get_shape().as_list()

        beta = tf.Variable(tf.constant(0.0, shape=[shape[-1]]),
                           name='beta', trainable=True)
        gamma = tf.Variable(tf.constant(1.0, shape=[shape[-1]]),
                            name='gamma', trainable=affine)

        batch_mean, batch_var = tf.nn.moments(x, [0, 1, 2], name='moments')

        def mean_var_with_update():
            """Create and update the moving averages inside the training
            branch, then return them."""
            ema = tf.train.ExponentialMovingAverage(decay=0.9)
            ema_apply_op = ema.apply([batch_mean, batch_var])
            with tf.control_dependencies([ema_apply_op]):
                ema_mean, ema_var = ema.average(batch_mean), ema.average(batch_var)
                return tf.identity(ema_mean), tf.identity(ema_var)

        def mean_var():
            """Return the plain batch moments (non-training branch)."""
            return tf.identity(batch_mean), tf.identity(batch_var)

        mean, var = tf.cond(phase_train,
                            mean_var_with_update,
                            mean_var)

        normed = tf.nn.batch_norm_with_global_normalization(
            x, mean, var, beta, gamma, 1e-3, affine)
        return normed
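Note that in this version the non-training branch returns the plain batch moments, so the moving averages are updated during training but not actually read at inference time. Both rewrites in this thread sidestep the same underlying behaviour: ExponentialMovingAverage.average() returns None for anything not yet passed to apply(), and None has no .name attribute, which is the AttributeError in the traceback. A minimal sketch of that behaviour, assuming TensorFlow 1.x:

import tensorflow as tf

v = tf.Variable(0.0, name='v')
ema = tf.train.ExponentialMovingAverage(decay=0.9)

# average() only knows about values already registered via apply();
# for anything else it returns None instead of a shadow variable.
print(ema.average(v))        # None -> None.name is the reported AttributeError

ema.apply([v])               # register v with the moving average
print(ema.average(v).name)   # now a real shadow variable, so .name works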