soft-decision-tree
Error when learning rate is big
python3 main.py --max-depth 4 --lr 1
...
Train Epoch: 3 [17280/60000 (29%)] Loss: 0.522095, Accuracy: 54/64 (84.0000%)
Traceback (most recent call last):
File "main.py", line 79, in
This is so strange and I can't fix it.
Do you really need such a high lr? Does this bug also occur when the lr is relatively small? lr=1 seems unlikely to be useful.
Ok, I don't need that high an lr, but the issue can also happen with a smaller lr when training runs for more epochs.
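The traceback above is cut off, so the exact failure isn't visible, but a large learning rate (or a long run at a smaller one) commonly drives the loss to NaN/Inf, which then crashes later code. A minimal sketch of a guarded training step, assuming a standard PyTorch loop; the helper name, arguments, and clipping threshold are hypothetical and not taken from main.py:

```python
import torch


def guarded_train_step(model, optimizer, loss_fn, data, target, max_grad_norm=1.0):
    """One training step with basic numerical-stability guards (hypothetical helper)."""
    optimizer.zero_grad()
    output = model(data)
    loss = loss_fn(output, target)

    # Skip the update if the loss has already blown up (NaN/Inf),
    # a common failure mode with an overly large learning rate.
    if not torch.isfinite(loss):
        return None

    loss.backward()
    # Clip gradients so a single oversized step cannot destroy the parameters.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)
    optimizer.step()
    return loss.item()
```

This doesn't fix whatever line 79 is raising, but it would show whether the crash coincides with a non-finite loss, which would point to a learning-rate/stability problem rather than a logic bug.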