
This link is not available; could you share it again? mini-ImageNet can be obtained at: https://drive.google.com/u/0/uc?id=0B3Irx3uQNoBMQ1FlNXJsZUdYWEE

tangdouer opened this issue 2 years ago • 5 comments


tangdouer · May 12 '22 03:05

I am able to access the link and download the data. Can you share a screenshot of the issue?

IdanAchituve · May 12 '22 06:05

  1. After clicking the link, the following message is displayed:

  2. "An error has occurred. Sorry, you do not have permission to access this page."

  3. I also have a second problem. When training with my own training set, the loss is negative. What is the reason for this, and what should I do to solve it? Thank you very much for your reply.

```
[59 77] Training loss 0.16276:  60%|█████████▍ | 60/100 [1:15:47<2:28:40, 223.02s/it]
2022-05-12 17:11:58,609 - root - INFO - No need for training. Class: [15]
2022-05-12 17:11:58,609 - root - INFO - Training GP on classes: [15, 21]
2022-05-12 17:11:58,679 - root - INFO - output scale: 7.380000114440918
2022-05-12 17:11:58,679 - root - INFO - length scale: 0.8899999856948853
2022-05-12 17:11:58,680 - root - INFO - Loss: 7.93106, Avg. Loss: 0.012392285466194152
2022-05-12 17:11:58,693 - root - INFO - No need for training. Class: [21]
2022-05-12 17:11:58,694 - root - INFO - Training GP on classes: [10, 15, 21]
2022-05-12 17:11:58,740 - root - INFO - output scale: 6.639999866485596
2022-05-12 17:11:58,740 - root - INFO - length scale: 0.8600000143051147
2022-05-12 17:11:58,741 - root - INFO - Loss: -8.18726, Avg. Loss: -0.008528395493825277
2022-05-12 17:11:58,751 - root - INFO - No need for training. Class: [10]
2022-05-12 17:11:58,751 - root - INFO - Training GP on classes: [10, 12, 15, 21]
2022-05-12 17:11:58,796 - root - INFO - output scale: 5.170000076293945
2022-05-12 17:11:58,797 - root - INFO - length scale: 0.8600000143051147
2022-05-12 17:11:58,797 - root - INFO - Loss: 2.80030, Avg. Loss: 0.002187734842300415
2022-05-12 17:11:58,804 - root - INFO - No need for training. Class: [12]
2022-05-12 17:11:58,804 - root - INFO - Training GP on classes: [8, 10, 12, 15, 21, 30]
2022-05-12 17:11:58,852 - root - INFO - output scale: 5.599999904632568
2022-05-12 17:11:58,852 - root - INFO - length scale: 0.7900000214576721
2022-05-12 17:11:58,853 - root - INFO - Loss: -14.22661, Avg. Loss: -0.00740969181060791
2022-05-12 17:11:58,859 - root - INFO - No need for training. Class: [8]
2022-05-12 17:11:58,859 - root - INFO - Training GP on classes: [8, 30]
2022-05-12 17:11:58,896 - root - INFO - output scale: 5.25
2022-05-12 17:11:58,896 - root - INFO - length scale: 0.8799999952316284
2022-05-12 17:11:58,897 - root - INFO - Loss: 2.62175, Avg. Loss: 0.004096491634845734
2022-05-12 17:11:58,903 - root - INFO - No need for training. Class: [30]
2022-05-12 17:11:58,904 - root - INFO - Training GP on classes: [0, 4, 8, 10, 11, 12, 13, 15, 18, 21, 23, 24, 25, 26, 27, 30]
2022-05-12 17:11:58,953 - root - INFO - output scale: 5.429999828338623
2022-05-12 17:11:58,954 - root - INFO - length scale: 0.7799999713897705
2022-05-12 17:11:58,954 - root - INFO - Loss: -120.28906, Avg. Loss: -0.02349395751953125
2022-05-12 17:11:58,964 - root - INFO - No need for training. Class: [18]
2022-05-12 17:11:58,965 - root - INFO - Training GP on classes: [18, 23]
2022-05-12 17:11:59,021 - root - INFO - output scale: 6.039999961853027
2022-05-12 17:11:59,022 - root - INFO - length scale: 0.7200000286102295
2022-05-12 17:11:59,022 - root - INFO - Loss: -16.84840, Avg. Loss: -0.02632562518119812
2022-05-12 17:11:59,034 - root - INFO - No need for training. Class: [23]
2022-05-12 17:11:59,034 - root - INFO - Training GP on classes: [0, 13, 18, 23, 24, 27]
2022-05-12 17:11:59,099 - root - INFO - output scale: 5.909999847412109
2022-05-12 17:11:59,100 - root - INFO - length scale: 0.699999988079071
2022-05-12 17:11:59,100 - root - INFO - Loss: -15.96019, Avg. Loss: -0.00831260085105896
2022-05-12 17:11:59,108 - root - INFO - No need for training. Class: [13]
2022-05-12 17:11:59,108 - root - INFO - Training GP on classes: [13, 24]
2022-05-12 17:11:59,163 - root - INFO - output scale: 6.570000171661377
2022-05-12 17:11:59,163 - root - INFO - length scale: 0.6000000238418579
2022-05-12 17:11:59,164 - root - INFO - Loss: 4.70918, Avg. Loss: 0.007358099520206452
2022-05-12 17:11:59,175 - root - INFO - No need for training. Class: [24]
2022-05-12 17:11:59,175 - root - INFO - Training GP on classes: [0, 13, 24, 27]
2022-05-12 17:11:59,241 - root - INFO - output scale: 6.230000019073486
2022-05-12 17:11:59,242 - root - INFO - length scale: 0.5600000023841858
2022-05-12 17:11:59,242 - root - INFO - Loss: 6.37207, Avg. Loss: 0.004978178441524506
2022-05-12 17:11:59,254 - root - INFO - No need for training. Class: [0]
2022-05-12 17:11:59,254 - root - INFO - Training GP on classes: [0, 27]
2022-05-12 17:11:59,307 - root - INFO - output scale: 6.130000114440918
2022-05-12 17:11:59,307 - root - INFO - length scale: 0.5299999713897705
2022-05-12 17:11:59,307 - root - INFO - Loss: -1.11641, Avg. Loss: -0.0017443910241127015
2022-05-12 17:11:59,317 - root - INFO - No need for training. Class: [27]
2022-05-12 17:11:59,318 - root - INFO - Training GP on classes: [0, 4, 11, 13, 18, 23, 24, 25, 26, 27]
2022-05-12 17:11:59,366 - root - INFO - output scale: 5.869999885559082
2022-05-12 17:11:59,366 - root - INFO - length scale: 0.7599999904632568
2022-05-12 17:11:59,366 - root - INFO - Loss: -57.68216, Avg. Loss: -0.01802567481994629
2022-05-12 17:11:59,374 - root - INFO - No need for training. Class: [25]
2022-05-12 17:11:59,374 - root - INFO - Training GP on classes: [25, 26]
2022-05-12 17:11:59,434 - root - INFO - output scale: 4.550000190734863
2022-05-12 17:11:59,435 - root - INFO - length scale: 0.12999999523162842
2022-05-12 17:11:59,435 - root - INFO - Loss: 13.26464, Avg. Loss: 0.020726004242897035
2022-05-12 17:11:59,441 - root - INFO - No need for training. Class: [26]
2022-05-12 17:11:59,441 - root - INFO - Training GP on classes: [4, 25, 26]
2022-05-12 17:11:59,505 - root - INFO - output scale: 9.739999771118164
2022-05-12 17:11:59,506 - root - INFO - length scale: 0.49000000953674316
2022-05-12 17:11:59,506 - root - INFO - Loss: -92.21822, Avg. Loss: -0.09606064160664876
2022-05-12 17:11:59,517 - root - INFO - No need for training. Class: [4]
2022-05-12 17:11:59,517 - root - INFO - Training GP on classes: [4, 11, 25, 26]
2022-05-12 17:11:59,590 - root - INFO - output scale: 6.260000228881836
2022-05-12 17:11:59,591 - root - INFO - length scale: 0.6600000262260437
2022-05-12 17:11:59,591 - root - INFO - Loss: -51.39500, Avg. Loss: -0.040152347087860106
2022-05-12 17:11:59,606 - root - INFO - No need for training. Class: [11]
2022-05-12 17:11:59,606 - root - INFO - Training GP on classes: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30]
2022-05-12 17:11:59,728 - root - INFO - output scale: 5.96999979019165
2022-05-12 17:11:59,729 - root - INFO - length scale: 0.6399999856948853
2022-05-12 17:11:59,729 - root - INFO - Loss: -251.33502, Avg. Loss: -0.02533619173111454
2022-05-12 17:11:59,771 - root - INFO - No need for training. Class: [14]
2022-05-12 17:11:59,771 - root - INFO - Training GP on classes: [14, 19]
2022-05-12 17:11:59,843 - root - INFO - output scale: 5.760000228881836
2022-05-12 17:11:59,844 - root - INFO - length scale: 0.49000000953674316
2022-05-12 17:11:59,844 - root - INFO - Loss: 3.44124, Avg. Loss: 0.005376935005187988
2022-05-12 17:11:59,856 - root - INFO - No need for training. Class: [19]
2022-05-12 17:11:59,856 - root - INFO - Training GP on classes: [6, 14, 19]
2022-05-12 17:11:59,930 - root - INFO - output scale: 6.119999885559082
2022-05-12 17:11:59,931 - root - INFO - length scale: 0.7400000095367432
2022-05-12 17:11:59,931 - root - INFO - Loss: -40.92530, Avg. Loss: -0.042630521456400554
2022-05-12 17:11:59,945 - root - INFO - No need for training. Class: [6]
2022-05-12 17:11:59,945 - root - INFO - Training GP on classes: [6, 14, 19, 22]
2022-05-12 17:12:00,021 - root - INFO - output scale: 6.019999980926514
2022-05-12 17:12:00,022 - root - INFO - length scale: 0.8299999833106995
2022-05-12 17:12:00,022 - root - INFO - Loss: -40.71082, Avg. Loss: -0.03180532455444336
2022-05-12 17:12:00,036 - root - INFO - No need for training. Class: [22]
2022-05-12 17:12:00,036 - root - INFO - Training GP on classes: [3, 6, 7, 9, 14, 16, 19, 22, 29]
2022-05-12 17:12:00,116 - root - INFO - output scale: 6.170000076293945
2022-05-12 17:12:00,117 - root - INFO - length scale: 0.6899999976158142
2022-05-12 17:12:00,117 - root - INFO - Loss: -95.92239, Avg. Loss: -0.03330638408660889
2022-05-12 17:12:00,127 - root - INFO - No need for training. Class: [9]
2022-05-12 17:12:00,128 - root - INFO - Training GP on classes: [9, 29]
2022-05-12 17:12:00,187 - root - INFO - output scale: 6.960000038146973
2022-05-12 17:12:00,188 - root - INFO - length scale: 0.8299999833106995
2022-05-12 17:12:00,188 - root - INFO - Loss: -26.11369, Avg. Loss: -0.04080264568328858
2022-05-12 17:12:00,202 - root - INFO - No need for training. Class: [29]
2022-05-12 17:12:00,203 - root - INFO - Training GP on classes: [3, 7, 9, 16, 29]
2022-05-12 17:12:00,302 - root - INFO - output scale: 6.360000133514404
2022-05-12 17:12:00,303 - root - INFO - length scale: 0.7400000095367432
2022-05-12 17:12:00,303 - root - INFO - Loss: -53.23288, Avg. Loss: -0.03327054977416992
2022-05-12 17:12:00,314 - root - INFO - No need for training. Class: [7]
2022-05-12 17:12:00,315 - root - INFO - Training GP on classes: [7, 16]
2022-05-12 17:12:00,456 - root - INFO - output scale: 7.110000133514404
2022-05-12 17:12:00,457 - root - INFO - length scale: 0.4699999988079071
2022-05-12 17:12:00,457 - root - INFO - Loss: -8.87638, Avg. Loss: -0.013869342207908631
2022-05-12 17:12:00,465 - root - INFO - No need for training. Class: [16]
2022-05-12 17:12:00,466 - root - INFO - Training GP on classes: [3, 7, 16]
2022-05-12 17:12:00,512 - root - INFO - output scale: 6.139999866485596
2022-05-12 17:12:00,512 - root - INFO - length scale: 0.5699999928474426
2022-05-12 17:12:00,512 - root - INFO - Loss: -28.06865, Avg. Loss: -0.02923818031946818
2022-05-12 17:12:00,523 - root - INFO - No need for training. Class: [3]
2022-05-12 17:12:00,523 - root - INFO - Training GP on classes: [1, 2, 3, 5, 6, 7, 9, 14, 16, 17, 19, 20, 22, 28, 29]
2022-05-12 17:12:00,567 - root - INFO - output scale: 5.860000133514404
2022-05-12 17:12:00,568 - root - INFO - length scale: 0.8100000023841858
2022-05-12 17:12:00,569 - root - INFO - Loss: -125.10228, Avg. Loss: -0.02606297492980957
2022-05-12 17:12:00,586 - root - INFO - No need for training. Class: [5]
2022-05-12 17:12:00,586 - root - INFO - Training GP on classes: [5, 28]
2022-05-12 17:12:00,641 - root - INFO - output scale: 5.190000057220459
2022-05-12 17:12:00,644 - root - INFO - length scale: 0.33000001311302185
2022-05-12 17:12:00,644 - root - INFO - Loss: -2.99954, Avg. Loss: -0.004686780273914337
2022-05-12 17:12:00,655 - root - INFO - No need for training. Class: [28]
2022-05-12 17:12:00,655 - root - INFO - Training GP on classes: [5, 20, 28]
2022-05-12 17:12:00,713 - root - INFO - output scale: 5.960000038146973
2022-05-12 17:12:00,713 - root - INFO - length scale: 0.699999988079071
2022-05-12 17:12:00,714 - root - INFO - Loss: -16.12902, Avg. Loss: -0.016801059246063232
2022-05-12 17:12:00,722 - root - INFO - No need for training. Class: [20]
2022-05-12 17:12:00,723 - root - INFO - Training GP on classes: [1, 2, 5, 17, 20, 28]
2022-05-12 17:12:00,776 - root - INFO - output scale: 6.099999904632568
2022-05-12 17:12:00,776 - root - INFO - length scale: 0.8899999856948853
2022-05-12 17:12:00,777 - root - INFO - Loss: -33.17054, Avg. Loss: -0.017276320854822794
2022-05-12 17:12:00,786 - root - INFO - No need for training. Class: [17]
2022-05-12 17:12:00,787 - root - INFO - Training GP on classes: [1, 2, 17]
2022-05-12 17:12:00,839 - root - INFO - output scale: 6.309999942779541
2022-05-12 17:12:00,839 - root - INFO - length scale: 0.8799999952316284
2022-05-12 17:12:00,840 - root - INFO - Loss: -19.97122, Avg. Loss: -0.02080335219701131
2022-05-12 17:12:00,848 - root - INFO - No need for training. Class: [1]
2022-05-12 17:12:00,848 - root - INFO - Training GP on classes: [1, 2]
2022-05-12 17:12:00,896 - root - INFO - output scale: 6.510000228881836
2022-05-12 17:12:00,897 - root - INFO - length scale: 0.6899999976158142
2022-05-12 17:12:00,897 - root - INFO - Loss: 12.88463, Avg. Loss: 0.020132231712341308
2022-05-12 17:12:00,908 - root - INFO - No need for training. Class: [2]
```

tangdouer · May 12 '22 09:05

Hi, thanks for noticing. I am sorry, but I can't help with issues regarding the link since I am not affiliated with that dataset in any way. If you find another download source for it, the index files should still be relevant and the code should work as expected.

Regarding the negative loss: that is OK, there is nothing to solve. As training progresses the likelihood increases, and as a result the negative log likelihood (the loss) is expected to become negative.
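To make this concrete, here is a minimal NumPy sketch (illustrative only, not GP-Tree code) showing that a Gaussian negative log likelihood turns negative once the density at the data exceeds 1, which is exactly what happens as the model fit tightens:

```python
import numpy as np

def gaussian_nll(x, mu, sigma):
    """Negative log likelihood of x under N(mu, sigma^2)."""
    return 0.5 * np.log(2 * np.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

# A wide Gaussian assigns density below 1, so the loss is positive.
print(gaussian_nll(0.0, 0.0, 1.0))  # ≈ 0.919

# A tight fit assigns density above 1, so the loss goes negative.
print(gaussian_nll(0.0, 0.0, 0.1))  # ≈ -1.384
```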

IdanAchituve · May 12 '22 12:05

Hi, thank you very much for your reply. I have a third question and need to bother you again. For the CUB dataset, there are a total of 1 base session and 9 few-shot sessions.

The test accuracy for the 9 few-shot sessions can be obtained from the `log_metrics` function: `def log_metrics(epoch, session, cumm_loss, test_loss, test_accuracies):`

Is the test accuracy for the base session obtained by the `base_logging` function: `def base_logging(epoch, cumm_loss, val_loss, val_accuracies, test_loss, test_accuracies):`? And is the test accuracy at the last epoch the test result for the base session?

Thanks again.

tangdouer · May 12 '22 13:05

For CUB there are 1 base session and 10 few-shot sessions. Indeed, the test accuracy for the base session is the one from the last epoch of base training, and it is logged to the screen via the base_logging function.
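To illustrate the bookkeeping this implies (with made-up accuracy numbers, not the repo's actual code): the base-session result is simply the last-epoch test accuracy from base training, and each few-shot session then contributes its own logged accuracy:

```python
# Hypothetical numbers for illustration only.
base_epoch_test_acc = [0.55, 0.63, 0.71]  # test accuracy per base-training epoch
few_shot_test_acc = [0.66, 0.62, 0.59]    # test accuracy per few-shot session

# The base-session accuracy is the one at the last epoch of base training;
# prepending it gives the full per-session accuracy sequence.
session_accuracies = [base_epoch_test_acc[-1]] + few_shot_test_acc
print(session_accuracies)  # [0.71, 0.66, 0.62, 0.59]
```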

IdanAchituve · May 12 '22 14:05