Loading Q-TinySAM
Hello!
I came across two issues while loading Q-TinySAM.
The first one is that running `demo_quant.py` looks for the normal checkpoint because the model type is `vit_t`. Is this the proper type for loading the quantized weights? Because of this, model loading doesn't look for the quantization layer and fails.
Demo error:
Traceback (most recent call last):
File "/content/TinySAM/./demo_quant.py", line 7, in <module>
from demo import show_mask, show_points, show_box
File "/content/TinySAM/demo.py", line 31, in <module>
sam = sam_model_registry[model_type](checkpoint="./weights/tinysam.pth")
File "/content/TinySAM/tinysam/build_sam.py", line 90, in build_sam_vit_t
with open(checkpoint, "rb") as f:
FileNotFoundError: [Errno 2] No such file or directory: './weights/tinysam.pth'
When I try to run inference myself without the demo, I get the following error:
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-7-04f0ec6a92d3> in <cell line: 19>()
17
18 cpt_path = "./tinysam/tinysam_w8a8.pth"
---> 19 quant_sam = torch.load(cpt_path)
20
21 device = "cuda" if torch.cuda.is_available() else "cpu"
/usr/local/lib/python3.10/dist-packages/torch/serialization.py in find_class(self, mod_name, name)
1413 pass
1414 mod_name = load_module_mapping.get(mod_name, mod_name)
-> 1415 return super().find_class(mod_name, name)
1416
1417 # Load the data (which may in turn use `persistent_load` to load tensors)
ModuleNotFoundError: No module named 'quantization_layer'
Hi,
The original code in `demo_quant.py` tries to import `show_mask`, `show_points`, and `show_box` from `demo.py`, which means `demo.py` is executed as well. That's why you encounter the above error: the checkpoint `tinysam.pth` is not found.
That was not elegant, so we have just updated `demo_quant.py` to include the implementations of `show_mask`, `show_points`, and `show_box` directly. Now `demo_quant.py` should work well and you can give it a try.
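For reference, these plotting helpers usually look roughly like the sketch below, based on the standard SAM demo utilities; the exact colors and marker sizes in the updated `demo_quant.py` may differ.

```python
import numpy as np
import matplotlib.pyplot as plt

def show_mask(mask, ax):
    # Overlay a semi-transparent blue mask on the given axes.
    color = np.array([30 / 255, 144 / 255, 255 / 255, 0.6])
    h, w = mask.shape[-2:]
    ax.imshow(mask.reshape(h, w, 1) * color.reshape(1, 1, -1))

def show_points(coords, labels, ax, marker_size=375):
    # Positive prompt points in green, negative points in red.
    pos = coords[labels == 1]
    neg = coords[labels == 0]
    ax.scatter(pos[:, 0], pos[:, 1], color="green", marker="*", s=marker_size)
    ax.scatter(neg[:, 0], neg[:, 1], color="red", marker="*", s=marker_size)

def show_box(box, ax):
    # Draw an (x0, y0, x1, y1) box outline.
    x0, y0 = box[0], box[1]
    w, h = box[2] - box[0], box[3] - box[1]
    ax.add_patch(plt.Rectangle((x0, y0), w, h, edgecolor="green", facecolor="none", lw=2))
```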
As for the `ModuleNotFoundError: No module named 'quantization_layer'` error, you can simply add `sys.path.append("./tinysam")` to your code so that the `quantization_layer` module inside `tinysam/` can be found when the checkpoint is unpickled.
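A minimal sketch of that fix, assuming the layout from the traceback above (`./tinysam/tinysam_w8a8.pth` and `tinysam/quantization_layer.py`):

```python
import sys
sys.path.append("./tinysam")  # lets pickle resolve the bare 'quantization_layer' module name

import torch

cpt_path = "./tinysam/tinysam_w8a8.pth"
# The W8A8 checkpoint is a full pickled model, not a plain state_dict.
# On newer PyTorch versions you may also need to pass weights_only=False.
quant_sam = torch.load(cpt_path, map_location="cpu")

device = "cuda" if torch.cuda.is_available() else "cpu"
quant_sam.to(device)
quant_sam.eval()
```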