Results: 9 issues by Jingwei Zhang

Hey,
* System configuration: `Ubuntu16.04`, `anaconda3`, `python3.6`
* Output of `pip install vizdoom`:
```
Requirement already satisfied: vizdoom in ~/anaconda3/envs/doom-env/lib/python3.6/site-packages (1.1.6)
Requirement already satisfied: numpy in ~/anaconda3/envs/doom-env/lib/python3.6/site-packages (from vizdoom) (1.15.1)
```
...

Hi, in line `137` of `feedforward_neural_doodle.py`, I don't quite understand why the input masks should be divided by 10, as in `inputs_batch:copy(masks_batch_/10)`? Thanks in advance! And thanks for the great...
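For context, one plausible reading (an assumption, not confirmed by the repo) is that the masks store small integer class labels and the division rescales them into a unit-scale range before they are fed to the network. A toy Python sketch with made-up values:

```python
# Hypothetical example: suppose masks_batch holds integer class labels 0-9.
masks_batch = [0, 3, 9, 5]

# The line in question copies masks_batch / 10 into the network input,
# which would rescale the labels into [0.0, 0.9].
inputs_batch = [m / 10 for m in masks_batch]
```

If that reading is right, the constant `10` would simply be the number of label classes, keeping input magnitudes comparable to other normalized inputs.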

Very helpful repo! One question: in the `forward` function in `critic.py`, there might be an error. In line `37`, the `Decoder` always takes in the same initial `dec_input` for...
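To illustrate the distinction the question raises, here is a toy Python sketch (the names and the `step` function are hypothetical, not from `critic.py`) contrasting a decoder loop that reuses the same initial input at every step with the usual autoregressive loop that feeds the previous output back in:

```python
def step(dec_input, state):
    # Toy "decoder cell": the output depends on both input and state.
    out = dec_input + state
    return out, out  # (output, new state)

def decode_fixed_input(dec_input, state, n_steps):
    # Variant in question: the same initial dec_input at every step.
    outs = []
    for _ in range(n_steps):
        out, state = step(dec_input, state)
        outs.append(out)
    return outs

def decode_fed_back(dec_input, state, n_steps):
    # Usual autoregressive variant: previous output becomes the next input.
    outs = []
    for _ in range(n_steps):
        out, state = step(dec_input, state)
        outs.append(out)
        dec_input = out  # feed the output back in
    return outs
```

The two loops diverge after the first step, which is why always passing the initial `dec_input` would change the decoder's behavior.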

Hi, Super cool and helpful repo! I'm having some trouble getting it to run properly; could you provide the versions of `python`, `pytorch`, and the other packages needed to correctly run this...

Hey, I've been using the Lua version of this package and it's awesome, really great work! Now I'm trying out the Python version, but I'm hitting this error: ``` >>>...

Hi, maybe I'm missing something obvious here, but could you tell me why the `std` is divided again by `sqrt(len)`, as this should already be taken care of in...
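For reference, a small self-contained Python example (illustrative numbers, not from the repo) of the single division by `sqrt(len)` that turns a standard deviation into a standard error of the mean; dividing a second time would shrink the value by an extra factor of `sqrt(len)`:

```python
import math

# Illustrative sample; the question is whether the 1/sqrt(n) factor
# gets applied twice somewhere in the code.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(xs)
mean = sum(xs) / n
std = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)  # population std

# Standard error of the mean: exactly one division by sqrt(n) is expected.
sem = std / math.sqrt(n)
```

If the quantity upstream is already a standard error rather than a standard deviation, a further `/ sqrt(len)` would indeed be double-counting.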

Hi, thanks for making the code public! I have a question regarding the function `_get_parallel_step_context`: here, https://github.com/wouterkool/attention-learn-to-route/blob/c66da2cfdc9ae500150bfc34d597a33631d2ceb3/nets/attention_model.py#L378, `num_steps` would always be `1`, since `current_node` reads the `prev_a` of the...

Hey, in the function `get_allocation_weight()` in `memory.py`, `cumprod` is used, but as far as I know this operation does not have autograd support yet and is still in a PR: https://github.com/pytorch/pytorch/pull/1439, so...
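As a workaround sketch (hypothetical, not taken from `memory.py`): a cumulative product can be built from individual multiplications, each of which frameworks like PyTorch can differentiate, at the cost of a Python-level loop. Shown here in plain Python for clarity:

```python
def cumprod(xs):
    # Build the cumulative product step by step; in an autograd framework,
    # each multiply is individually differentiable, so the whole chain is too.
    out, running = [], 1.0
    for x in xs:
        running = running * x
        out.append(running)
    return out
```

This is slower than a fused `cumprod` kernel, but it sidesteps the missing-autograd issue until the linked PR lands.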

Hi, when I try to do inference with the provided `model-119496`, I get the following errors: basically, all `BatchNorm` layers are missing `moving_mean` & `variance` from the checkpoint:
```
util.py:106] Missing var...
```
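For context, the missing variables are the exponential moving averages that a `BatchNorm` layer accumulates during training and substitutes for batch statistics at inference time; if they were never saved into the checkpoint, restoring for inference fails. A plain-Python sketch of that running update (the function name and `momentum` value are illustrative):

```python
def update_moving_stats(moving_mean, moving_var, batch_mean, batch_var, momentum=0.9):
    # BatchNorm-style exponential moving average of per-batch statistics;
    # these are the values inference uses in place of batch statistics.
    new_mean = momentum * moving_mean + (1 - momentum) * batch_mean
    new_var = momentum * moving_var + (1 - momentum) * batch_var
    return new_mean, new_var
```

A common cause of this symptom is a saver that only collects trainable variables, which excludes the moving statistics since they are updated outside of gradient descent.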