stanford-tensorflow-tutorials
This repository contains code examples for the Stanford course TensorFlow for Deep Learning Research.
I have trained the model for quite a long time (until iteration `1000000`), but the loss seems to be stuck around `2.2` and the bot's responses are also not satisfactory ``` HUMAN ++++...
Hi, I was going through the assignments and realised that the Huber loss equation is multiplied by two here: https://github.com/chiphuyen/stanford-tensorflow-tutorials/blob/51e53daaa2a32cfe7a1966f060b28dbbd081791c/examples/04_linreg_eager.py#L43 Any reason for that?
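For comparison, here is a minimal sketch of the standard piecewise Huber loss in plain Python; the course's variant apparently drops the 1/2 factor (equivalently, multiplies the whole expression by two). The function name and default `delta` are illustrative, not taken from the course code.

```python
def huber_loss(label, prediction, delta=1.0):
    # Standard Huber loss: quadratic near zero, linear in the tails.
    residual = abs(label - prediction)
    if residual <= delta:
        return 0.5 * residual ** 2
    # Linear branch, offset so the two pieces meet at |residual| == delta.
    return delta * residual - 0.5 * delta ** 2
```

Doubling this expression changes the scale of the gradient but not the location of the minimum, which is likely why the results in the course still converge.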
chat result: ============================================= ============================================= HUMAN ++++ hi BOT ++++ . HUMAN ++++ good moning BOT ++++ . . . HUMAN ++++ do you have any thing to share with mei...
The more I train the bot, the worse the responses get. Check it out: HUMAN ++++ Hi BOT ++++ dennings dennings dennings dennings dennings shed grocer grocer...
When I just use the Huber loss in file "03_linreg_placeholder" at line 36, I get different results from the notes: w=-4.2249994, b=78.485054.
Hi, thanks for your work. I was wondering which of the improvements from the assignment https://docs.google.com/document/d/1GJfn2B6EI8JueDiBwzTAdD34d6pC99BSt6vldOmUCPQ/edit are implemented in this version. Specifically, is the model only taking as input the...
```
Traceback (most recent call last):
  File "chatbot.py", line 254, in <module>
    main()
  File "chatbot.py", line 249, in main
    train()
  File "chatbot.py", line 127, in train
    test_buckets, data_buckets, train_buckets_scale = _get_buckets()
  File...
```
I used a Jupyter notebook to download the MNIST dataset with the following code: mnist_folder = 'data/mnist'; utils.download_mnist(mnist_folder). This causes a 404 Not Found error. In utils.py, in download_mnist(path), change: url = 'http://yann.lecun.com/exdb/mnist' to url...
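The exact replacement URL in the report above is truncated, so this is only a sketch of a patched downloader. It assumes the Google-hosted MNIST mirror (an assumption on my part, since the original yann.lecun.com URL now returns 404); the helper and function names are illustrative rather than the repo's actual `utils.py` code.

```python
import os
import urllib.request

def mnist_urls(base='https://storage.googleapis.com/cvdf-datasets/mnist/'):
    # Build the download URLs for the four MNIST archive files.
    # `base` is an assumed mirror, not confirmed by the issue text.
    files = ['train-images-idx3-ubyte.gz',
             'train-labels-idx1-ubyte.gz',
             't10k-images-idx3-ubyte.gz',
             't10k-labels-idx1-ubyte.gz']
    return {name: base + name for name in files}

def download_mnist(path):
    # Download each archive into `path`, skipping files already present.
    os.makedirs(path, exist_ok=True)
    for name, url in mnist_urls().items():
        local = os.path.join(path, name)
        if not os.path.exists(local):
            urllib.request.urlretrieve(url, local)
```

Whatever mirror is used, only the base URL needs to change; the four archive filenames are the same as on the original server.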
_, loss = ########## TO DO ############ total_loss += loss Here the variable name loss overwrites the loss op, so in the next epoch there will be an error...
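The shadowing bug described above can be sketched without TensorFlow: rebinding the name `loss` to the evaluated number destroys the reference to the original op, so the second epoch fails. Here a plain callable stands in for the TF loss tensor (the names `make_loss_op` and `loss_value` are illustrative).

```python
def make_loss_op():
    # Stand-in for a TensorFlow loss tensor/op.
    return lambda: 2.2

# Buggy pattern: `loss` is rebound to a float after the first evaluation.
loss = make_loss_op()
loss = loss()          # now loss == 2.2 and is no longer callable
failed = False
try:
    loss()             # second "epoch": TypeError, a float is not callable
except TypeError:
    failed = True

# Fixed pattern: keep the op and its evaluated value under distinct names.
loss_op = make_loss_op()
total_loss = 0.0
for _ in range(3):
    loss_value = loss_op()   # the op stays intact across epochs
    total_loss += loss_value
```

The same renaming fixes the notebook: write `_, loss_value = sess.run([optimizer, loss], ...)` so the `loss` op is never overwritten.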