
AttributeError: 'NoneType' object has no attribute 'name'

Open zhangqianhui opened this issue 9 years ago • 7 comments

When I run python 06_modern_convnet.py, I get this error. I don't understand what is happening — I was just testing the batch normalization layer. Please help me. Thanks!

zhangqianhui avatar Nov 27 '16 02:11 zhangqianhui

Errors appear in

 File "/home/jaychao/code/tensorflow_tutorials/python/libs/batch_norm.py", line 58, in batch_norm
    lambda: (ema_mean, ema_var))

zhangqianhui avatar Nov 27 '16 03:11 zhangqianhui

I think the reason is a difference in TensorFlow versions!

zhangqianhui avatar Nov 27 '16 10:11 zhangqianhui

Got the same thing! Does anyone know how to port it over? I tried the official migration tool on the libs folder with no luck.

streamliner18 avatar Mar 15 '17 18:03 streamliner18

I am also facing the same error. Please help to resolve the issue.

brijmohan avatar Mar 21 '17 15:03 brijmohan

Hi, I also encountered this problem. I solved it by moving line 53, ema_apply_op = ema.apply([batch_mean, batch_var]), so that it comes right after line 42, ema = tf.train.ExponentialMovingAverage(decay=0.9). The code becomes:

def batch_norm(x, phase_train, scope='bn', affine=True):
    """
    Batch normalization on convolutional maps.

    from: https://stackoverflow.com/questions/33949786/how-could-i-
    use-batch-normalization-in-tensorflow

    Only modified to infer shape from input tensor x.

    Parameters
    ----------
    x
        Tensor, 4D BHWD input maps
    phase_train
        boolean tf.Variable, true indicates training phase
    scope
        string, variable scope
    affine
        whether to affine-transform outputs

    Return
    ------
    normed
        batch-normalized maps
    """
    with tf.variable_scope(scope):
        shape = x.get_shape().as_list()

        beta = tf.Variable(tf.constant(0.0, shape=[shape[-1]]),
                           name='beta', trainable=True)
        gamma = tf.Variable(tf.constant(1.0, shape=[shape[-1]]),
                            name='gamma', trainable=affine)
        batch_mean, batch_var = tf.nn.moments(x, [0, 1, 2], name='moments')
        ema = tf.train.ExponentialMovingAverage(decay=0.9)
        ema_apply_op = ema.apply([batch_mean, batch_var])
        ema_mean, ema_var = ema.average(batch_mean), ema.average(batch_var)


        def mean_var_with_update():
            """Summary

            Returns
            -------
            name : TYPE
                Description
            """
            with tf.control_dependencies([ema_apply_op]):
                return tf.identity(batch_mean), tf.identity(batch_var)
        mean, var = tf.cond(phase_train, mean_var_with_update,
                            lambda: (ema_mean, ema_var))
        normed = tf.nn.batch_norm_with_global_normalization(
            x, mean, var, beta, gamma, 1e-3, affine)
    return normed
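The ordering matters because in newer TensorFlow versions ema.average(v) returns None for a variable that was never registered via ema.apply([...]), which is exactly what produces the AttributeError: 'NoneType' object has no attribute 'name'. Here is a toy sketch of that contract in pure Python (a hypothetical ToyEMA class, not the real tf.train.ExponentialMovingAverage API) to illustrate why apply must run before average:

```python
class ToyEMA:
    """Minimal stand-in for an exponential-moving-average tracker.

    Mirrors the relevant contract of tf.train.ExponentialMovingAverage:
    average() returns None for values never registered through apply().
    """

    def __init__(self, decay=0.9):
        self.decay = decay
        self.shadow = {}  # name -> running average

    def apply(self, named_values):
        # Register values and update their shadow averages.
        for name, value in named_values:
            if name not in self.shadow:
                self.shadow[name] = value  # first call initializes
            else:
                self.shadow[name] = (self.decay * self.shadow[name]
                                     + (1 - self.decay) * value)

    def average(self, name):
        # None if apply() never saw this name -- the crash scenario.
        return self.shadow.get(name)


ema = ToyEMA(decay=0.9)

# Asking for the average *before* apply() yields None -- the same
# situation that made the tutorial's graph construction fail.
print(ema.average("batch_mean"))  # None

# After apply(), the average is defined.
ema.apply([("batch_mean", 0.5)])
print(ema.average("batch_mean"))  # 0.5
```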

styleNie avatar Mar 23 '17 10:03 styleNie

Comment out these two lines:
from libs.batch_norm import batch_norm
from libs.activations import lrelu

and you will be able to run the code in "06_modern_convnet.ipynb".

nickyi1990 avatar Dec 19 '17 06:12 nickyi1990

I also encountered this problem, and I rewrote the function named batch_norm() in libs/batch_norm.py:

"""Batch Normalization for TensorFlow. Parag K. Mital, Jan 2016."""

import tensorflow as tf


def batch_norm(x, phase_train, scope='bn', affine=True):
    """
    Batch normalization on convolutional maps.

    from: https://stackoverflow.com/questions/33949786/how-could-i-
    use-batch-normalization-in-tensorflow

    Only modified to infer shape from input tensor x.

    Parameters
    ----------
    x
        Tensor, 4D BHWD input maps
    phase_train
        boolean tf.Variable, true indicates training phase
    scope
        string, variable scope
    affine
        whether to affine-transform outputs

    Return
    ------
    normed
        batch-normalized maps
    """
    with tf.variable_scope(scope):
        shape = x.get_shape().as_list()

        beta = tf.Variable(tf.constant(0.0, shape=[shape[-1]]),
                           name='beta', trainable=True)
        gamma = tf.Variable(tf.constant(1.0, shape=[shape[-1]]),
                            name='gamma', trainable=affine)

        batch_mean, batch_var = tf.nn.moments(x, [0, 1, 2], name='moments')

        def mean_var_with_update():
            # Build the EMA update inside the training branch so that
            # apply() is always called before average().
            ema = tf.train.ExponentialMovingAverage(decay=0.9)
            ema_apply_op = ema.apply([batch_mean, batch_var])
            with tf.control_dependencies([ema_apply_op]):
                ema_mean = ema.average(batch_mean)
                ema_var = ema.average(batch_var)
                return tf.identity(ema_mean), tf.identity(ema_var)

        def mean_var():
            return tf.identity(batch_mean), tf.identity(batch_var)

        mean, var = tf.cond(phase_train, mean_var_with_update, mean_var)

        normed = tf.nn.batch_norm_with_global_normalization(
            x, mean, var, beta, gamma, 1e-3, affine)
    return normed
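The key point in this rewrite is that tf.cond takes two zero-argument callables, only selects between their results based on the predicate, and requires both branches to return values of the same structure. A toy sketch of that calling convention in pure Python (a hypothetical toy_cond helper, not the real tf.cond, which traces both branches into the graph):

```python
def toy_cond(pred, true_fn, false_fn):
    """Minimal sketch of tf.cond's calling convention: both branches
    are zero-argument callables, and the predicate picks which
    branch's result is used."""
    if not (callable(true_fn) and callable(false_fn)):
        raise TypeError("expected callables, not precomputed values")
    return true_fn() if pred else false_fn()


batch_mean, batch_var = 0.1, 2.0   # statistics of the current batch
ema_mean, ema_var = 0.09, 1.9      # moving averages kept for inference

# Training: use batch statistics; inference: use the moving averages.
print(toy_cond(True, lambda: (batch_mean, batch_var),
               lambda: (ema_mean, ema_var)))   # (0.1, 2.0)
print(toy_cond(False, lambda: (batch_mean, batch_var),
               lambda: (ema_mean, ema_var)))   # (0.09, 1.9)
```

Passing a precomputed tuple instead of a callable is one way older snippets break on newer TensorFlow versions, which is why both branches above are wrapped in lambdas.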

OucAndrewLee avatar Jan 23 '18 11:01 OucAndrewLee