Medical-SAM2

Questions about *checkpoints* and *pretrained weights*

Open yaotingwangofficial opened this issue 11 months ago • 3 comments

Dear Authors,

Many thanks for your efforts on this great work and releasing the code!

I have some questions (which I guess others may also be interested in):

Q1. How to load the pretrained weights correctly?

(1) I got the pretrained weights from https://huggingface.co/jiayuanz3/MedSAM2_pretrain/tree/main.
(2) I modified your train_2d.py script for evaluation. Specifically, I retained only the validation part and removed the training sections.
(3) I loaded the pretrained weights with this code:

import torch

# net, args, nice_test_loader, epoch and writer are set up earlier in train_2d.py
ckpt_path = './checkpoints/MedSAM2_pretrain.pth'
checkpoint = torch.load(ckpt_path)
net.load_state_dict(checkpoint['model'])

net.eval()
tol, (eiou, edice) = function.validation_sam(args, nice_test_loader, epoch, net, writer)
  • However, the results on REFUGE are: Total score: 1.5096757411956787, IOU: 0.0159607185616769, DICE: 0.026624170053027436.

(4) I also tried using the args.pretrain flag instead of calling net.load_state_dict as above:

# with -pretrain MedSAM2_pretrain.pth
Total score: 0.6400713324546814, IOU: 0.07327658690868426, DICE: 0.10690500849569073

# without -pretrain
Total score: 0.6306238174438477, IOU: 0.09716492249995469, DICE: 0.1419979241103477
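
One way to sanity-check whether the -pretrain route applies any weights at all is to diff the model's parameters before and after loading. This is a generic PyTorch sketch, not specific to this repo:

import copy
import torch

# snapshot the freshly initialised parameters (net as built in train_2d.py)
before = copy.deepcopy(net.state_dict())

# ... run the loading path under test here (-pretrain or load_state_dict) ...

after = net.state_dict()
changed = [k for k in before if not torch.equal(before[k], after[k])]
print(f'{len(changed)} of {len(before)} tensors changed after loading')

If nothing changes, the weights never reach the model.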

I assume my results are incorrect. Could you offer any guidance?
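
In case it helps others: near-zero IOU/DICE after loading usually means the checkpoint keys did not match the model. Loading with strict=False is a quick way to surface the mismatches (plain PyTorch, independent of the repo's own loading code):

import torch

checkpoint = torch.load('./checkpoints/MedSAM2_pretrain.pth', map_location='cpu')

# strict=False reports mismatched keys instead of raising
missing, unexpected = net.load_state_dict(checkpoint['model'], strict=False)
print('missing keys:', len(missing))
print('unexpected keys:', len(unexpected))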

Q2. Different SAM2 foundation sizes

  • Are the released pretrained weights MedSAM2_pretrain.pth suitable for different SAM2 foundation sizes? I noticed that you only included tiny and small in the code. Would it be feasible to directly replace them with the base or large variants?

  • Actually, I tried load_state_dict with the small size but got incompatible parameter dimensions (a shape check for this is sketched below).
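
One way to locate the incompatible tensors is to compare shapes directly. The sketch below assumes, as above, that the checkpoint stores its weights under the 'model' key:

import torch

ckpt_sd = torch.load('./checkpoints/MedSAM2_pretrain.pth', map_location='cpu')['model']
model_sd = net.state_dict()  # net built with the base or large config

# print every tensor whose shape differs between checkpoint and model
for k, v in ckpt_sd.items():
    if k in model_sd and model_sd[k].shape != v.shape:
        print(k, tuple(v.shape), '!=', tuple(model_sd[k].shape))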

yaotingwangofficial · Jan 30 '25

Check #9.

yaotingwangofficial · Feb 13 '25


Hello, have you solved this problem? I had the same problem. Looking forward to your reply, thank you.

liuna0420 · Mar 03 '25