SAM-Adapter-PyTorch
                        Adapting Meta AI's Segment Anything to Downstream Tasks with Adapters and Prompts
I can't find the adapter code. Could you please tell me which file includes it?
Hi SAM-Adapter developers, thanks for this great repo. I see that in `configs/*.yaml`, the `split_key` for `val_dataset` is `test` instead of `val`, which means the validation set is actually not used at...
I want to use the pretrained model to predict on new images. Could you provide a "predict" file?
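Until an official predict script lands, a minimal inference loop might look like the sketch below. The model class, checkpoint path, and input handling here are placeholders (assumptions), not the repo's actual API; adapt them to however SAM-Adapter builds its model from the YAML config.

```python
import torch
import torch.nn as nn

class DummySegmenter(nn.Module):
    """Stand-in for the trained SAM-Adapter model (assumption, not the real class)."""
    def __init__(self):
        super().__init__()
        self.head = nn.Conv2d(3, 1, kernel_size=1)

    def forward(self, x):
        return torch.sigmoid(self.head(x))

@torch.no_grad()
def predict(model, image):
    """image: float tensor of shape (3, H, W), values in [0, 1]."""
    model.eval()
    logits = model(image.unsqueeze(0))        # add batch dim -> (1, 1, H, W)
    mask = (logits.squeeze(0) > 0.5).float()  # threshold to a binary mask
    return mask

model = DummySegmenter()
# In practice, load the trained weights, e.g.:
# model.load_state_dict(torch.load("checkpoint.pth", map_location="cpu"))
image = torch.rand(3, 64, 64)  # stand-in for a loaded and normalized image
mask = predict(model, image)
print(mask.shape)
```

The real repo builds its model from the YAML config, so the construction step will differ; the `eval()` / `no_grad()` / threshold pattern is the generic part.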
Could you please tell me which model I should download for the step "Download the pre-trained SAM (Segment Anything) and put it in ./pretrained."
My YAML file: ... I train with `torchrun mytrain.py --config configs/demo.yaml` but get mismatch errors.
Hi, when running the SAM2-Adapter training code, I notice that the name of mask decoder in SAM2-Adapter is `self.mask_decoder`, but in the original SAM2 the module is named `self.sam_mask_decoder`. Therefore,...
Have you conducted training with masks as prompts in videos? I noticed that your paper mentions related training.
Hi, I ran the code with `torchrun train.py --config configs/demo.yaml`. For some reason, I got a ValueError at `B, Nt, E = q.shape`: `ValueError: too many values to unpack (expected...`
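That error message is the generic Python failure mode when a tensor has more dimensions than the unpacking expects: `B, Nt, E = q.shape` requires `q` to be exactly 3-D, so a 4-D query tensor (for example, one that still carries a separate heads dimension) reproduces it. A minimal demonstration, with made-up shapes:

```python
import torch

# 3-D query: unpacking (batch, tokens, embed_dim) succeeds.
q3 = torch.randn(2, 16, 32)
B, Nt, E = q3.shape
print(B, Nt, E)  # 2 16 32

# 4-D query (e.g. batch, heads, tokens, head_dim): same unpacking fails.
q4 = torch.randn(2, 8, 16, 32)
try:
    B, Nt, E = q4.shape
except ValueError as err:
    print(err)  # too many values to unpack (expected 3)
```

So a tensor reaching the attention call with an unexpected extra dimension is one plausible cause to check; the actual fix depends on where the shape diverges in the training pipeline.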
I've used the config files given in the SAM2-Adapter branch, but I get an error about the configs not matching the SAM2 checkpoint. Could you please provide the config file...