
How to customize a network's operations

Open ymcasky opened this issue 7 years ago • 0 comments

Dear all,

My model is:

```python
model = Sequential()

model.add(InputLayer(input_shape=(img_size_flat,)))

# 784 -> (28, 28, 1)
model.add(Reshape(img_shape_full))

model.add(Conv2D(kernel_size=5, strides=1, filters=16, padding='same',
                 activation='relu', name='layer_conv1'))
model.add(MaxPooling2D(pool_size=2, strides=2))

model.add(Conv2D(kernel_size=5, strides=1, filters=36, padding='same',
                 activation='relu', name='layer_conv2'))
model.add(MaxPooling2D(pool_size=2, strides=2))

model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))
```


I have one question: can I take the fc1 layer's output from Keras and do the remaining operations in TensorFlow? I don't know how to customize an operation in Keras, such as L2-normalizing fc2's weights, but TensorFlow can do it like this:

```python
import tensorflow as tf

def customize_fc2(fc1_input, Label, num_cls, name='customize_fc2'):
    # fc2 weight matrix: [fc1 width, num_cls]
    in_dim = fc1_input.get_shape().as_list()[1]
    w = tf.get_variable("customized/W", [in_dim, num_cls], dtype=tf.float32,
                        initializer=tf.contrib.layers.xavier_initializer())

    # L2-normalize each column of W before the matmul
    w = tf.nn.l2_normalize(w, dim=0)

    logits = tf.matmul(fc1_input, w)

    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=Label, logits=logits))

    return loss
```
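As a sanity check, the effect of `tf.nn.l2_normalize(w, dim=0)` in the function above can be sketched in plain NumPy: each column of the weight matrix is scaled to unit L2 norm before the matmul. The shapes below (128 fc1 units, 10 classes, batch of 4) are assumptions chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
fc1_out = rng.standard_normal((4, 128))   # batch of 4 fc1 activations (assumed shapes)
w = rng.standard_normal((128, 10))        # fc2 weight matrix

# Column-wise L2 normalization, the same operation as tf.nn.l2_normalize(w, dim=0):
# each class column is divided by its own L2 norm.
w_normed = w / np.linalg.norm(w, axis=0, keepdims=True)

logits = fc1_out @ w_normed               # shape (4, 10)

print(np.allclose(np.linalg.norm(w_normed, axis=0), 1.0))  # prints True
```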

Thanks for any reply!

ymcasky avatar Jan 31 '18 04:01 ymcasky