Issues by foreverhell (3 results)

I have seen that OHEM is supported, and I found the OHEM config in MMDetection: `train_cfg=dict(rcnn=dict(sampler=dict(type='OHEMSampler')))`. But when I use it in MMClassification, I encounter 'FileNotFoundError: [Errno 2] No...

help wanted
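For reference, the OHEM sampler setting quoted in the question can be sketched as a plain Python dict following MMDetection's config convention (the extra fields `num` and `pos_fraction` are typical sampler parameters, shown here as assumptions; whether MMClassification's registry accepts this key layout at all is exactly what the question is about):

```python
# MMDetection-style train_cfg enabling OHEM for the R-CNN head.
# Plain dicts here; in a real run these are parsed by the framework's
# registry, so the same keys may not exist in MMClassification.
train_cfg = dict(
    rcnn=dict(
        sampler=dict(
            type='OHEMSampler',   # online hard example mining sampler
            num=512,              # assumed total number of sampled RoIs
            pos_fraction=0.25,    # assumed fraction of positive samples
        )
    )
)
```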

![dpo](https://github.com/user-attachments/assets/7b4f4988-f403-4129-bfb4-7180a91eb94e) How can I guarantee that the two are the same? When I train a custom LLM with DPO, the loss does not converge. Could the reason be that the two are different?
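For context, the standard DPO loss compares the policy's and a frozen reference model's log-probabilities on chosen versus rejected responses; if those two models are not initialized identically, the implicit reward is skewed from step one, which can keep the loss from converging. A minimal per-pair sketch (variable names are assumptions, not the questioner's code):

```python
import math

def dpo_loss(pi_logp_chosen, pi_logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """DPO loss for one preference pair.

    Inputs are the summed log-probabilities of the chosen/rejected
    responses under the trained policy (pi_*) and the frozen
    reference model (ref_*).
    """
    logits = beta * ((pi_logp_chosen - ref_logp_chosen)
                     - (pi_logp_rejected - ref_logp_rejected))
    # -log(sigmoid(logits)), written out explicitly
    return math.log(1.0 + math.exp(-logits))
```

At initialization, when policy and reference agree on every input, `logits` is 0 and the loss starts at log 2 ≈ 0.693; a starting value far from that is a quick sanity check that the two models differ.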