finetune_alexnet_with_tensorflow
ValueError: The initial value's shape (()) is not compatible with the explicitly supplied `shape` argument ([11, 11, 3, 96]).
I ran finetune.py under both TensorFlow 1.5 and TensorFlow 2.1. After resolving several other issues, I traced this shape-incompatibility error to alexnet.py. Please help fix the issue at your convenience. I appreciate your help in advance.
My understanding so far is that there is a conflict between the shape argument and the conv() call arguments. tf.compat.v1.variable_scope() defines variables within its with context and influences the related variables. For instance, shape=[filter_height, filter_width, input_channels//groups, num_filters] evaluates to [11, 11, 3, 96] for conv1, whereas the conv1 call passes the arguments 11, 11, 4, 4, 96.
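To illustrate the mapping, here is a minimal sketch (the names follow the snippets below; note that input_channels is inferred from the input tensor's last dimension rather than passed to conv(), which is why 3 never appears in the call):

```python
# Mapping from the conv() call arguments to the expected weight shape.
# conv(self.X, 11, 11, 4, 4, 96, ...) unpacks as:
filter_height, filter_width = 11, 11  # kernel size
stride_y, stride_x = 4, 4             # strides; not part of the weight shape
num_filters = 96                      # output channels
input_channels = 3                    # inferred from x, not passed to conv()
groups = 1                            # default

weight_shape = [filter_height, filter_width,
                input_channels // groups, num_filters]
print(weight_shape)  # [11, 11, 3, 96]
```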
1. Error Message
ValueError: The initial value's shape (()) is not compatible with the explicitly supplied shape
argument ([11, 11, 3, 96]).
2. Attempted Changes
I tried to make the following changes.
1). Change the order of either the shape or the Conv1 arguments, for instance, shape=[11,11,96,3] or conv(self.X, 11, 11, 4, 4, 96, name='conv1', padding='VALID')
2). Change the name of shape
kernel_shape= [filter_height, filter_width, input_channels//groups, num_filters]
3). Delete the shape= keyword and keep only the list:
[filter_height, filter_width, input_channels//groups, num_filters]
However, the following variations of the error still persisted.
ValueError: The initial value's shape (()) is not compatible with the explicitly supplied shape
argument ([11, 11, 3, 4]).
ValueError: Shapes must be equal rank, but are 4 and 0 for 'conv1/Variable/Assign' (op: 'Assign') with input shapes: [11,11,3,4], [].
ValueError: Shapes must be equal rank, but are 4 and 0 for 'conv1/Variable/Assign' (op: 'Assign') with input shapes: [11,11,3,96], [].
The critical issue is clearly the "shape" handling in the second snippet, but I have not yet figured out how to resolve it.
3. Snippets
1st snippet.
class AlexNet(object):
    .........
    def create(self):
        """Create the network graph."""
        # 1st Layer: Conv (w ReLu) -> Lrn -> Pool
        conv1 = conv(self.X, 11, 11, 4, 4, 96, name='conv1', padding='VALID')
        norm1 = lrn(conv1, 2, 2e-05, 0.75, name='norm1')
        pool1 = max_pool(norm1, 3, 3, 2, 2, name='pool1', padding='VALID')
2nd snippet:
def conv(x, filter_height, filter_width, stride_y, stride_x, num_filters, name,
         padding='SAME', groups=1):
    .........
    with tf.compat.v1.variable_scope(name) as scope:
        weights = tf.Variable('weights', shape=[filter_height,
                                                filter_width,
                                                input_channels//groups,
                                                num_filters])
        biases = tf.Variable('biases', shape=[num_filters])
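For reference, the likely root cause is that tf.Variable's first positional argument is the initial value, not a name: passing the string 'weights' there produces a scalar initial value (shape ()), which conflicts with the explicit shape=[11, 11, 3, 96]. tf.compat.v1.get_variable, by contrast, takes a name plus a shape. A minimal sketch of one possible fix (the bodies here are my assumptions, not the repository's exact code; eager execution is disabled because get_variable requires graph mode in TF 2.x):

```python
import tensorflow as tf

# Graph-mode sketch; TF 2.x eager mode does not support get_variable.
tf.compat.v1.disable_eager_execution()

def conv(x, filter_height, filter_width, stride_y, stride_x, num_filters,
         name, padding='SAME', groups=1):
    # input_channels is inferred from x's last dimension.
    input_channels = int(x.get_shape()[-1])
    with tf.compat.v1.variable_scope(name):
        # get_variable pairs a *name* with a shape; tf.Variable's first
        # positional argument is the initial value, which is why the
        # string 'weights' produced the scalar-vs-[11, 11, 3, 96] mismatch.
        weights = tf.compat.v1.get_variable(
            'weights',
            shape=[filter_height, filter_width,
                   input_channels // groups, num_filters])
        biases = tf.compat.v1.get_variable('biases', shape=[num_filters])
    out = tf.nn.conv2d(x, weights,
                       strides=[1, stride_y, stride_x, 1], padding=padding)
    return tf.nn.bias_add(out, biases)
```

With this change, conv(self.X, 11, 11, 4, 4, 96, name='conv1', padding='VALID') builds a [11, 11, 3, 96] weight tensor without the ValueError (grouped convolutions with groups > 1 would need additional splitting logic not shown here).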
4. Detailed error message
$ python finetune.py
Traceback (most recent call last):
File "finetune.py", line 91, in shape
argument ([11, 11, 3, 96]).