stanford-tensorflow-tutorials
This repository contains code examples for the Stanford course TensorFlow for Deep Learning Research.
Hi - I get the traceback below ... can you help with this one, please? Kind regards, Jesper. UnicodeDecodeError Traceback (most recent call last) in () 148 149...
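A UnicodeDecodeError while loading data usually means the file is being read with the platform's default encoding; the truncated traceback doesn't show the call site, but a common fix is to open the file with an explicit encoding. A hypothetical sketch (the file name and path are placeholders, not the repository's actual data files):

```python
# Hypothetical: read a data file with an explicit encoding instead of the
# platform default, a common fix for UnicodeDecodeError during loading.
with open('data/example.txt', encoding='utf-8', errors='ignore') as f:
    lines = f.read().splitlines()
```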
The training loss decreases and the training accuracy increases, but the test accuracy remains constant.
How do I connect a database to this model?
-"ecommendation"->"recommendation"
https://github.com/chiphuyen/stanford-tensorflow-tutorials/blob/51e53daaa2a32cfe7a1966f060b28dbbd081791c/examples/07_convnet_layers.py#L146 I don't understand how setting self.training = False turns off dropout. self.training is passed as an argument when the dropout layer is built. After the whole network is built,...
File "/Users/apple/tensorflow/stanford-tensorflow-tutorials/assignments/chatbot/model.py", line 51, in _inference if config.NUM_SAMPLES > 0 and config.NUM_SAMPLES < config.DEC_VOCAB: AttributeError: module 'config' has no attribute 'AttributeError: module 'config' has no attribute 'DEC_VOCAB'' I cannot file...
At line 66, `dropout = tf.layers.dropout(fc, self.keep_prob, training=self.training, name='dropout')`, self.training is a Python boolean, and it seems that updating self.training later won't influence dropout. Thus, since it is initially...
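One way to make the dropout mode switchable at run time in a TF 1.x graph is to feed the flag as a tensor rather than baking a Python boolean into the graph when the layer is built. A minimal sketch under that assumption (the tensor and layer names here are illustrative, not the repository's exact code):

```python
import numpy as np
import tensorflow as tf

# Scalar boolean tensor that defaults to False (inference mode) and can be
# overridden per sess.run call through feed_dict.
training = tf.placeholder_with_default(False, shape=(), name='training')

inputs = tf.placeholder(tf.float32, shape=[None, 1024], name='inputs')
fc = tf.layers.dense(inputs, 256, activation=tf.nn.relu, name='fc')

# Because `training` is a tensor, the same graph runs with or without
# dropout depending on what is fed; a plain Python bool would be frozen in.
dropout = tf.layers.dropout(fc, rate=0.25, training=training, name='dropout')
logits = tf.layers.dense(dropout, 10, name='logits')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(8, 1024).astype(np.float32)
    train_out = sess.run(logits, {inputs: batch, training: True})  # dropout active
    eval_out = sess.run(logits, {inputs: batch})                   # dropout off (default)
```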
train(huber_loss) and train(tf.losses.huber_loss) have different results. It seems like
```python
def huber_loss(y, y_predicted, m=1.0):
    """Huber loss."""
    t = y - y_predicted
    return t ** 2 if tf.abs(t)
```
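In TF 1.x graph mode a Python `if` cannot branch on a symbolic tensor, which is the usual reason a hand-written Huber loss behaves differently from tf.losses.huber_loss. A minimal graph-compatible sketch using tf.where; the function name and default delta here are illustrative, not the repository's exact code:

```python
import tensorflow as tf

def huber_loss_graph(labels, predictions, delta=1.0):
    """Element-wise Huber loss built from tf.where instead of a Python `if`."""
    residual = tf.abs(labels - predictions)
    quadratic = 0.5 * tf.square(residual)               # used when |residual| <= delta
    linear = delta * residual - 0.5 * tf.square(delta)  # used otherwise
    return tf.where(residual <= delta, quadratic, linear)
```

Note also that tf.losses.huber_loss reduces the per-element losses to a scalar, so an element-wise implementation needs the same reduction (e.g. tf.reduce_mean) before the numbers can be compared directly.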
Hello, I tried running `tensorboard --logdir graphs/style_stranfer` after running style_transfer_sol.py. I couldn't visualize the graph and am getting "No dashboards are active for the current data set". I...
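TensorBoard only shows dashboards when it finds event files under the directory passed to --logdir, so that path has to match where the graph was actually written. A minimal TF 1.x sketch with an illustrative logdir (not necessarily the directory style_transfer_sol.py writes to):

```python
import tensorflow as tf

a = tf.constant(2.0, name='a')
b = tf.constant(3.0, name='b')
c = tf.add(a, b, name='sum')

with tf.Session() as sess:
    # The event file lands under ./graphs/demo; point TensorBoard at the
    # same directory (or a parent of it).
    writer = tf.summary.FileWriter('graphs/demo', sess.graph)
    print(sess.run(c))
    writer.close()
```

Then run `tensorboard --logdir graphs/demo` from the same working directory and open the Graphs tab.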
length = tf.reduce_sum(tf.reduce_max(tf.sign(seq), 2), 1) I believe this length-calculation line in 11_char_rnn.py is wrong. Since seq in the line above is the one-hot version of the sequence, even...
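Whether that line gives the intended length depends on how padding timesteps are encoded: if padding rows are all zeros the sum recovers the unpadded length, but if padding is the one-hot of a pad token then every timestep contributes 1. A small sketch with made-up data (not from the repository) illustrating both cases:

```python
import numpy as np
import tensorflow as tf

def seq_length(seq):
    """Length formula from the issue; seq has shape [batch, time, vocab]."""
    return tf.reduce_sum(tf.reduce_max(tf.sign(seq), 2), 1)

# A 2-step sequence padded to 4 steps.
zero_padded = np.array([[[1, 0, 0],      # step 1: one-hot token
                         [0, 1, 0],      # step 2: one-hot token
                         [0, 0, 0],      # padding as all-zero rows
                         [0, 0, 0]]], dtype=np.float32)

pad_token = np.array([[[1, 0, 0],
                       [0, 1, 0],
                       [1, 0, 0],        # padding as one-hot of token 0
                       [1, 0, 0]]], dtype=np.float32)

with tf.Session() as sess:
    print(sess.run(seq_length(zero_padded)))  # [2.] -- padding rows are all zero
    print(sess.run(seq_length(pad_token)))    # [4.] -- every step counts as non-empty
```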