
Issues for run_step2.py

Status: Open. shenjing023 opened this issue 3 years ago · 6 comments

When training reaches step 2, the outputs in the Eval stage are all 0.

12/15/2021 19:36:24 - INFO - __main__ -   ***** Running training *****
Epoch:   0%|                                                                                                                                                        | 0/1 [00:00<?, ?it/s]12/15/2021 19:36:32 - INFO - __main__ -   Total Loss is 0.707695484161377 .
12/15/2021 19:36:33 - INFO - __main__ -   Total Loss is 0.17088936269283295 .
12/15/2021 19:36:35 - INFO - __main__ -   Total Loss is 0.11365362256765366 .
12/15/2021 19:36:36 - INFO - __main__ -   Total Loss is 0.09475410729646683 .
12/15/2021 19:36:38 - INFO - __main__ -   Total Loss is 0.10162457078695297 .
12/15/2021 19:36:39 - INFO - __main__ -   Total Loss is 0.0938352420926094 .
12/15/2021 19:36:40 - INFO - __main__ -   Total Loss is 0.10106469690799713 .
12/15/2021 19:36:42 - INFO - __main__ -   Total Loss is 0.1062822937965393 .
12/15/2021 19:36:43 - INFO - __main__ -   Total Loss is 0.10448531806468964 .
12/15/2021 19:36:45 - INFO - __main__ -   Total Loss is 0.09545283764600754 .
12/15/2021 19:36:46 - INFO - __main__ -   Total Loss is 0.08869557082653046 .
12/15/2021 19:36:48 - INFO - __main__ -   Total Loss is 0.09898055344820023 .
12/15/2021 19:36:49 - INFO - __main__ -   Total Loss is 0.096546471118927 .
12/15/2021 19:36:51 - INFO - __main__ -   Total Loss is 0.09676332026720047 .
12/15/2021 19:36:52 - INFO - __main__ -   Total Loss is 0.09095398336648941 .
12/15/2021 19:36:54 - INFO - __main__ -   Total Loss is 0.095858633518219 .
Quad num: 0
tp: 0.0. fp: 0.0. fn: 251.0.
12/15/2021 19:36:54 - INFO - __main__ -   ***** Eval results *****
12/15/2021 19:36:54 - INFO - __main__ -     micro-F1 = 0
12/15/2021 19:36:54 - INFO - __main__ -     precision = 0
12/15/2021 19:36:54 - INFO - __main__ -     recall = 0.0
Quad num: 0
tp: 0.0. fp: 0.0. fn: 895.0.
tp: 0.0. fp: 0.0. fn: 490.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 142.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 98.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 102.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 715.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 399.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 123.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 85.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 95.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 623.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** sentiment results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 497.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 144.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 101.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 103.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 725.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category sentiment results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 580.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 153.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 139.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 98.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 804.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** aspect results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 596.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 160.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 144.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 103.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 827.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category aspect results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 589.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 158.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 141.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 101.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 818.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** sentiment aspect results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 600.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 161.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 145.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 103.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 832.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category sentiment aspect results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 580.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 150.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 113.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 102.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 811.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 595.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 154.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 116.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 103.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 827.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 580.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 150.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 114.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 103.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 812.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** sentiment opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 595.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 154.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 117.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 104.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 828.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category sentiment opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 659.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 167.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 147.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 104.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 895.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** aspect opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 659.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 167.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 147.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 104.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 895.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category aspect opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 659.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 167.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 147.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 104.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 895.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** sentiment aspect opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
tp: 0.0. fp: 0.0. fn: 659.0.
0 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 167.0.
1 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 147.0.
2 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 104.0.
3 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
tp: 0.0. fp: 0.0. fn: 895.0.
4 :  {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
12/15/2021 19:37:00 - INFO - __main__ -   ***** category sentiment aspect opinion results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.00%
12/15/2021 19:37:00 - INFO - __main__ -   -----------------------------------
12/15/2021 19:37:00 - INFO - __main__ -   ***** Test results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.0
Epoch: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:28<00:00, 28.83s/it]
12/15/2021 19:37:00 - INFO - __main__ -   ***** Test results *****
12/15/2021 19:37:00 - INFO - __main__ -     micro-F1 = 0
12/15/2021 19:37:00 - INFO - __main__ -     precision = 0
12/15/2021 19:37:00 - INFO - __main__ -     recall = 0.0
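
For reference: every block in the log reports tp = 0, so precision, recall, and micro-F1 all collapse to 0 no matter how large fn is ("Quad num: 0" shows the model predicts no quads at all after a single epoch). A minimal sketch of how such metrics are typically computed from tp/fp/fn with guarded division (an illustration, not necessarily the repository's exact code):

```python
def micro_scores(tp: float, fp: float, fn: float) -> dict:
    # Guarded division: an undefined ratio is reported as the integer 0,
    # which matches the mixed "0" / "0.0" values in the log above.
    precision = tp / (tp + fp) if tp + fp > 0 else 0
    recall = tp / (tp + fn) if tp + fn > 0 else 0
    denom = precision + recall
    f1 = 2 * precision * recall / denom if denom > 0 else 0
    return {"precision": precision, "recall": recall, "micro-F1": f1}

# With tp = 0 (as in the log), every metric is 0 regardless of fn:
print(micro_scores(0.0, 0.0, 251.0))
# -> {'precision': 0, 'recall': 0.0, 'micro-F1': 0}
```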

shenjing023 · Dec 15, 2021

I ran into the same problem. Have you solved it?

It was fine after I increased the number of epochs.

reidli · Dec 25, 2021

+1

wxybdth · Feb 17, 2022

(Quoting the original post: "When training reaches step 2, the outputs in the Eval stage are all 0", followed by the full training and eval log reproduced above.)

Could the problem be that the paths are set incorrectly when generating the first-stage pairs? https://github.com/NUSTM/ACOS/blob/09fa3eea7ea11a915796673c7571be80b6c20a28/Extract-Classify-ACOS/tokenized_data/get_1st_pairs.py#L10-L16
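
As a quick check, one could verify that the first-stage pair files exist and are non-empty at the configured locations before running step 2. The sketch below uses placeholder file names; substitute the paths actually defined on lines 10-16 of get_1st_pairs.py:

```python
# Sanity check before step 2: do the first-stage pair files exist and have content?
# The file names below are placeholders, not the repository's real names.
import os

candidate_paths = [
    "tokenized_data/<your_dataset>_pair_1st.tsv",   # placeholder name
    "tokenized_data/<your_dataset>_quad_test.tsv",  # placeholder name
]

for p in candidate_paths:
    ok = os.path.exists(p) and os.path.getsize(p) > 0
    print(("OK" if ok else "MISSING/EMPTY") + ": " + p)
```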

blhoy · Feb 18, 2022

Did you try increasing the number of epochs? I tried it and it worked!

Also, one additional question: what number of epochs is used in run_step1.py and run_step2.py? I can't reproduce the results from the paper.

fikadata · Mar 1, 2022

Did you try increasing the number of epochs? I tried it and it worked!

Also, one additional question: what number of epochs is used in run_step1.py and run_step2.py? I can't reproduce the results from the paper.

The number of training epochs is 30.
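
For anyone re-running the script, a minimal sketch of launching step 2 with 30 epochs; the flag name is assumed from typical BERT fine-tuning scripts, so confirm it (and the other required arguments, omitted here) against the argparse definitions in run_step2.py:

```python
# Re-run step 2 with 30 training epochs instead of 1.
# "--num_train_epochs" is an assumed flag name; check run_step2.py for the real one.
import subprocess

subprocess.run(
    ["python", "run_step2.py", "--num_train_epochs", "30"],  # other required args omitted
    check=True,
)
```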

blhoy · Apr 7, 2022

Hello, I ran into problems while trying to reproduce the paper. At line 257 of run_step2.py, para_optimizer = list(model.named_parameters()) raises the following error: AttributeError: 'NoneType' object has no attribute 'named_parameters'. I also downloaded the pretrained BERT model, but got an error saying: Model name '<path to the downloaded pretrained model>' was not found in model name list. We assumed '<path to the downloaded pretrained model>/pytorch_model.bin' was a path or url but couldn't find any file associated to this path or url. Could you tell me how to solve this?

windtricker · Apr 14, 2022
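
Note on the last report: the "was not found in model name list" message indicates the pretrained weights could not be loaded, and in the old pytorch-pretrained-bert API from_pretrained then typically returns None, which is why model.named_parameters() on line 257 later fails with AttributeError. A hedged sanity check on the local BERT directory (the expected file names are assumed from the usual pytorch-pretrained-bert layout; adjust them for your library version):

```python
# Check that the local BERT directory contains the files from_pretrained expects.
# bert_dir is a placeholder; point it at the directory you downloaded.
import os

bert_dir = "/path/to/your/bert-base-uncased"  # placeholder path
expected = ["pytorch_model.bin", "bert_config.json", "vocab.txt"]

for name in expected:
    path = os.path.join(bert_dir, name)
    print(("OK      " if os.path.isfile(path) else "MISSING ") + path)
```

If any of these files are missing, passing a directory that contains all of them (or a published model name) to from_pretrained usually resolves both errors.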