pycox
No loss when training with a small batch size
When I train the DeepSurv model with a small batch size (64), no loss is reported and training always stops early.
However, if I train with a larger batch size (1024), training almost always proceeds properly.
May I ask what the possible reason might be? I am not familiar with the progress bar.
That is very strange. I would always think a loss should be produced. Can you post a full example that reproduce the issue?
Thanks for the quick response. I have identified the issue: when the batch size is small, a batch can contain zero positive events, so dividing by the number of positive events makes the loss explode (NaN). I have added a small number to the denominator to avoid this in my local branch.
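A minimal sketch of what such a fix could look like, assuming the loss is a pycox-style Cox negative partial log-likelihood that averages over observed events (the function name `cox_ph_loss_safe` and the `eps` value are illustrative, not the actual patch):

```python
import torch

def cox_ph_loss_safe(log_h: torch.Tensor, events: torch.Tensor,
                     eps: float = 1e-7) -> torch.Tensor:
    """Negative Cox partial log-likelihood averaged over observed events.

    Assumes `log_h` and `events` are sorted by descending duration, so a
    cumulative sum over the batch gives the risk set for each individual.
    Adding `eps` to the denominator keeps the loss finite when a small
    batch happens to contain no events (events.sum() == 0).
    """
    log_h = log_h.view(-1)
    events = events.view(-1).float()
    # Numerically stable log of the cumulative sum of hazards
    # (log-sum-exp over each risk set).
    gamma = log_h.max()
    log_cumsum_h = log_h.sub(gamma).exp().cumsum(0).log().add(gamma)
    # Sum the per-event terms, then divide by (#events + eps)
    # instead of the bare event count.
    return -log_h.sub(log_cumsum_h).mul(events).sum().div(events.sum() + eps)
```

With `eps` in the denominator, an all-censored batch yields a loss of 0 instead of a NaN that would poison training and trigger early stopping.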
@yikuanli can you share your solution here?
Hi @yikuanli, I would also like to know.