TensorFlow-Summarization
AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'DynamicAttentionWrapper' AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'DynamicAttentionWrapperState' I was getting these errors, which I resolved by removing the "Dynamic" prefix from both DynamicAttentionWrapper and DynamicAttentionWrapperState...
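This matches the rename in TensorFlow 1.2+, where `DynamicAttentionWrapper` became `tf.contrib.seq2seq.AttentionWrapper` and the attention size moved to the `attention_layer_size` keyword. A minimal sketch of the renamed call; the cell, attention mechanism, and shapes below are stand-ins, not the repo's actual configuration:

```python
import tensorflow as tf

# Stand-in values so the snippet builds on its own (not the repo's real config).
state_size = 128
encoder_outputs = tf.zeros([4, 10, state_size * 2])  # [batch, time, depth]

decoder_cell = tf.contrib.rnn.GRUCell(state_size * 2)
attention = tf.contrib.seq2seq.LuongAttention(state_size * 2, encoder_outputs)

# TF >= 1.2: DynamicAttentionWrapper was renamed to AttentionWrapper, and the
# former third positional argument is now the keyword attention_layer_size.
decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    decoder_cell, attention, attention_layer_size=state_size * 2)
```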
I have used your pretrained model to generate summaries for Gigaword, DUC2003, and DUC2004. I would like to know what the pretrained model was trained on and how...
I tried to load the checkpoint files for further training. In train.py, `try: global_step = tf.contrib.framework.load_variable("model", "model.ckpt-300000") except Exception as err: global_step = 0` gets called, which goes into checkpoint_utils.py >...
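If the goal is just to recover the saved step counter before resuming training, two common TF 1.x patterns are sketched below. The directory name `model_dir` and the variable name `"global_step"` are assumptions for illustration, not necessarily what this repo's checkpoints actually contain:

```python
import tensorflow as tf

model_dir = "model"  # placeholder checkpoint directory

# Pattern 1 (assumes the checkpoint stores a variable literally named
# "global_step"): load_variable takes a *variable name* as its second
# argument, not a checkpoint filename.
try:
    global_step = int(tf.contrib.framework.load_variable(model_dir, "global_step"))
except Exception:
    global_step = 0

# Pattern 2: parse the step out of the latest checkpoint path,
# e.g. "model/model.ckpt-300000" -> 300000.
ckpt = tf.train.get_checkpoint_state(model_dir)
if ckpt and ckpt.model_checkpoint_path:
    global_step = int(ckpt.model_checkpoint_path.split("-")[-1])
```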
While testing on the Gigaword dataset ... I ran into this error: Traceback (most recent call last): File "src/summarization.py", line 241, in tf.app.run() File "C:\Users\spars\Anaconda3\envs\tensor_model\lib\site-packages\tensorflow\python\platform\app.py", line 40, in run _run(main=main,...
I tried to run your code with a higher TF version (1.5), but something goes wrong with these lines: decoder_cell = tf.contrib.seq2seq.DynamicAttentionWrapper( decoder_cell, attention, state_size * 2) wrapper_state = tf.contrib.seq2seq.DynamicAttentionWrapperState(...
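In TF 1.2 through 1.5 both classes were renamed (see the rename above), and the usual way to build the initial decoder state is no longer to construct a wrapper state by hand but to clone the wrapper's zero state. A hedged sketch with stand-in shapes; `encoder_state`, `batch_size`, and the cell choice are placeholders, not the repo's real values:

```python
import tensorflow as tf

# Stand-in shapes so the sketch is self-contained (not the repo's real config).
batch_size, state_size = 4, 128
encoder_outputs = tf.zeros([batch_size, 10, state_size * 2])
encoder_state = tf.zeros([batch_size, state_size * 2])

attention = tf.contrib.seq2seq.LuongAttention(state_size * 2, encoder_outputs)
decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.contrib.rnn.GRUCell(state_size * 2), attention,
    attention_layer_size=state_size * 2)

# TF 1.2+ has no DynamicAttentionWrapperState; instead, clone the wrapper's
# zero state and plug in the encoder's final state.
wrapper_state = decoder_cell.zero_state(batch_size, tf.float32).clone(
    cell_state=encoder_state)
```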
Following those instructions, I got a file-not-found error. The attachment is a screenshot of the error. Please help.
I know it's a little late to ask about it at this point, but can anyone post the time it took to run batches of the dataset using `tensorflow` and...
I don't understand how to debug this error. 2018-06-10 23:25:50.340196: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX Traceback (most...
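Note that the cpu_feature_guard line is only an informational message (the prebuilt binary simply wasn't compiled with SSE4.x/AVX support); the actual failure is whatever follows in the truncated traceback. If that log noise makes the real error harder to spot, it can be silenced, for example:

```python
import os

# Suppress TensorFlow's C++ INFO/WARNING logs (such as the cpu_feature_guard
# message) so the real Python traceback stands out. This does not fix the
# underlying error. Must be set before importing tensorflow.
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"

import tensorflow as tf
```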