
A PyTorch implementation of the Attentive Neural Process

5 Attentive-Neural-Process issues

Thank you for sharing the code. According to the paper (Appendix A, 2nd paragraph), dropout is not used for attention. In line 205, the residual and result are concatenated, but...
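As a minimal sketch of the point raised in this issue: disabling dropout inside attention simply means passing a zero dropout probability to the attention module. The example below uses `torch.nn.MultiheadAttention` for illustration; the repository defines its own attention layers, so this is an assumption about intent, not the project's actual code.

```python
import torch
import torch.nn as nn

# Per the paper (Appendix A), attention uses no dropout. In PyTorch's
# built-in multi-head attention this corresponds to dropout=0.0
# (which is also the default).
attn = nn.MultiheadAttention(embed_dim=8, num_heads=2, dropout=0.0)

# Self-attention over a toy sequence: (seq_len, batch, embed_dim).
q = torch.randn(5, 1, 8)
out, weights = attn(q, q, q)

assert out.shape == q.shape  # attention preserves the input shape
```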

Hi everyone, the following code loads "checkpoint_50.pth.tar", but I couldn't find that file in the project; please help me obtain it: `state_dict = t.load('./checkpoint/checkpoint_50.pth.tar')`
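A checkpoint file like this is normally produced by running training, not shipped with the repository. A minimal sketch of creating and loading such a file is below; the tiny `nn.Linear` model and the `'model'`/`'optimizer'` key names are placeholders, since the repository's actual checkpoint structure may differ.

```python
import os
import torch
import torch.nn as nn

# Hypothetical stand-in for the repository's network.
model = nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters())

# Training code typically saves a checkpoint dict like this periodically.
os.makedirs('./checkpoint', exist_ok=True)
torch.save({'model': model.state_dict(),
            'optimizer': optimizer.state_dict()},
           './checkpoint/checkpoint_50.pth.tar')

# Mirrors the line quoted in the issue (the repo imports torch as t).
state_dict = torch.load('./checkpoint/checkpoint_50.pth.tar')
model.load_state_dict(state_dict['model'])
```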

Hi, thanks for your implementation! I am a little confused about `result = t.cat([residual, result], dim=-1)` in line 205, which you noted as very important. Why do you need to...
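To illustrate the shape consequence of the quoted line: concatenating a residual along the feature dimension differs from the additive residual used in the original Transformer. This sketch uses placeholder tensors, not the repository's actual variables.

```python
import torch

batch, dim = 2, 8
residual = torch.randn(batch, dim)  # attention input (query pathway)
result = torch.randn(batch, dim)    # attention output

# Concatenation keeps both signals side by side but doubles the feature
# size, so the next layer must expect 2*dim inputs ...
combined = torch.cat([residual, result], dim=-1)
assert combined.shape == (batch, 2 * dim)

# ... whereas an additive residual connection preserves the dimension.
added = residual + result
assert added.shape == (batch, dim)
```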

Hi, I am very interested in your code. I would like to know whether it can be used for language translation, and if so, how.

In network.py, BCELoss uses the default settings, which (in both PyTorch 1.1 and PyTorch 0.4) perform mean reduction. However, the KL divergence function (also in network.py) seems to be using...
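The scale mismatch this issue describes can be shown directly: a mean-reduced BCE term is smaller than a summed term by a factor of the element count, so mixing a mean-reduced reconstruction loss with a summed KL term weights the two inconsistently. The tensors below are illustrative, not the repository's actual loss code.

```python
import torch
import torch.nn as nn

pred = torch.sigmoid(torch.randn(4, 3))  # probabilities in (0, 1)
target = torch.rand(4, 3)

# Default BCELoss averages over all elements (reduction='mean') ...
bce_mean = nn.BCELoss()(pred, target)

# ... while a summed loss grows with the number of elements. Using the
# same reduction for both the BCE and KL terms keeps their scales
# comparable.
bce_sum = nn.BCELoss(reduction='sum')(pred, target)

# The two differ exactly by the element count.
assert torch.allclose(bce_sum, bce_mean * pred.numel())
```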