Issues of foreverhell (3 results)
I have seen that OHEM is supported, and I found the OHEM option in MMDetection via `train_cfg=dict(rcnn=dict(sampler=dict(type='OHEMSampler')))`, but when I use it in MMClassification, I encounter 'FileNotFoundError: [Errno 2] No...
help wanted
How can I guarantee the two are the same? When I train a custom LLM with DPO, the loss does not converge. Could the difference between the two be the reason?
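For reference, one way to sanity-check a non-converging DPO run is to compare the computed loss against the standard DPO objective, -log σ(β · ((log πθ(y_w|x) − log π_ref(y_w|x)) − (log πθ(y_l|x) − log π_ref(y_l|x)))). A minimal sketch (function and argument names are illustrative, not from any specific library):

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """Standard DPO loss for one (chosen, rejected) pair.

    Each argument is a summed log-probability of the full response
    under the policy or the frozen reference model.
    """
    # Log-ratios of policy vs. reference for each response
    chosen_logratio = policy_chosen_logp - ref_chosen_logp
    rejected_logratio = policy_rejected_logp - ref_rejected_logp
    # Implicit reward margin scaled by beta
    logits = beta * (chosen_logratio - rejected_logratio)
    # -log(sigmoid(logits)) == softplus(-logits), written stably
    return math.log1p(math.exp(-logits))
```

If the policy and reference log-probabilities are computed differently (e.g. different tokenization or masking between the two models), the margin `logits` is biased and the loss can stall, which is one common cause of the symptom described above.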