How can I do multi-scale training and testing? Thanks!
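In mmdet-style codebases, multi-scale training is usually done by sampling an image scale per iteration in the `Resize` step, and multi-scale testing by wrapping the test transforms in `MultiScaleFlipAug`. The fragment below is a sketch under that assumption; the exact keys and scale values depend on your mmdet/OBBDetection version and are not taken from this repo's configs.

```python
# Sketch of an mmdet-style config (keys and values are assumptions,
# not this repo's actual config -- adapt to your version).

# Multi-scale TRAINING: sample a scale per image from the given range.
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True),
    dict(type='Resize',
         img_scale=[(1333, 640), (1333, 960)],  # range endpoints
         multiscale_mode='range',
         keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
]

# Multi-scale TESTING: run inference at several fixed scales.
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='MultiScaleFlipAug',
         img_scale=[(1333, 640), (1333, 800), (1333, 960)],
         flip=False,
         transforms=[
             dict(type='Resize', keep_ratio=True),
             dict(type='RandomFlip'),
             dict(type='Normalize', mean=[123.675, 116.28, 103.53],
                  std=[58.395, 57.12, 57.375], to_rgb=True),
             dict(type='Pad', size_divisor=32),
             dict(type='ImageToTensor', keys=['img']),
             dict(type='Collect', keys=['img']),
         ]),
]
```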
loading annotations into memory... Done (t=15.55s) creating index... index created!...
Hello, I'm running on a single RTX 3090 without changing any of the model's parameters. My reproduced result is:

This is your evaluation result for task 1 (VOC metrics):
mAP: 0.7184670471952441
ap of each class: plane:0.8924992637024642, baseball-diamond:0.8166412996024346, bridge:0.5392068130234289, ground-track-field:0.7210017470814063, small-vehicle:0.6668245582289634, large-vehicle:0.821257238017224, ship:0.8659460437061555, tennis-court:0.9088855421686749, basketball-court:0.8568508433788743, storage-tank:0.8430793822881508, soccer-ball-field:0.5813415347777581, roundabout:0.4294881278300593, harbor:0.6725001390245908,...
In BBoxHead, is the target fed to the smooth L1 loss of the form (x, y, w, h, theta)? Why isn't this theta restricted to the range [-pi, pi)? Thanks!
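One likely reason theta falls outside [-pi, pi) is that many rotated-box codebases normalize angles into a half-period range such as [-pi/2, pi/2) (a box and its 180°-rotated twin are the same rectangle), or skip normalization on regression targets entirely. Below is a small, generic angle-normalization helper, written from first principles rather than copied from this repo:

```python
import math

def norm_angle(theta: float, start: float = -math.pi,
               period: float = 2 * math.pi) -> float:
    """Map an angle into the half-open interval [start, start + period).

    With the defaults this gives [-pi, pi); pass start=-pi/2, period=pi
    for the [-pi/2, pi/2) convention common in rotated detectors.
    """
    return (theta - start) % period + start

# Examples:
a = norm_angle(4.0)                                   # into [-pi, pi)
b = norm_angle(2.0, start=-math.pi / 2, period=math.pi)  # into [-pi/2, pi/2)
```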
Hi, in Oriented R-CNN, which file contains the code that adjusts the parallelogram proposals generated by the RPN into oriented rectangles? And which file contains the code that converts the oriented-rectangle representation into (x, y, w, h, theta)? Thanks!!
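For intuition (this is a geometric sketch of the scheme described in the Oriented R-CNN paper, not the repo's actual code): the midpoint-offset output (x, y, w, h, Δα, Δβ) defines a parallelogram whose diagonals both pass through the box center; stretching the shorter diagonal to the length of the longer one yields equal, mutually bisecting diagonals, i.e. a rectangle, which can then be re-expressed as (x, y, w, h, theta):

```python
import math
import numpy as np

def parallelogram_to_obb(x, y, w, h, da, db):
    """Sketch: midpoint-offset parallelogram -> oriented rect (cx, cy, w, h, theta)."""
    c = np.array([x, y])
    # Parallelogram vertices on the enclosing horizontal box (paper's convention):
    T = np.array([x + da, y - h / 2])   # top-edge vertex
    R = np.array([x + w / 2, y + db])   # right-edge vertex
    B = np.array([x - da, y + h / 2])   # bottom-edge vertex
    L = np.array([x - w / 2, y - db])   # left-edge vertex
    d1 = max(np.linalg.norm(T - c), 1e-6)  # half-length of diagonal T-B
    d2 = max(np.linalg.norm(R - c), 1e-6)  # half-length of diagonal L-R
    m = max(d1, d2)
    # Stretch the shorter diagonal so both diagonals are equal: equal
    # diagonals that bisect each other define a rectangle.
    T, B = c + (T - c) * m / d1, c + (B - c) * m / d1
    R, L = c + (R - c) * m / d2, c + (L - c) * m / d2
    side1, side2 = R - T, B - R
    theta = math.atan2(side1[1], side1[0])
    return (float(c[0]), float(c[1]),
            float(np.linalg.norm(side1)), float(np.linalg.norm(side2)), theta)
```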
What do the parameters mean?
In /DOTA_OBBDetection/mmdet/models/roi_heads/roi_extractors/obb/obb_single_level_roi_extractor.py, in def forward(self, feats, rois, roi_scale_factor=None): — what does `feats` refer to? Below is the printed `rois`; what do the six numbers in each row mean? They look like the midpoint-offset representation (x, y, w, h, Δα, Δβ), but the values don't match up. Any help is appreciated.
# tensor([[ 0.0000e+00, 1.8100e+02, 9.1600e+02, 2.8410e+01, 1.0993e+01, -3.1059e+00],
#         [ 0.0000e+00, 7.3223e+02, 9.3639e+02, 2.1954e+01, 7.5611e+00, -1.0460e+00],
#         [ 0.0000e+00, 7.9756e+02, 3.0058e+02,...
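A plausible reading of those rows (an assumption based on the common rotated-ROI convention, not confirmed from this repo's code): each row is (batch_index, cx, cy, w, h, theta) — the image index within the batch followed by a rotated box — which would explain why interpreting columns 5–6 as (Δα, Δβ) midpoint offsets "doesn't match up":

```python
import numpy as np

# Assumed layout: (batch_index, cx, cy, w, h, theta) per row.
rois = np.array([
    [0.0000e+00, 1.8100e+02, 9.1600e+02, 2.8410e+01, 1.0993e+01, -3.1059e+00],
    [0.0000e+00, 7.3223e+02, 9.3639e+02, 2.1954e+01, 7.5611e+00, -1.0460e+00],
])
batch_idx, cx, cy, w, h, theta = rois.T  # unpack the six columns
```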
Excuse me, how should I visualize the heatmaps of the classification task and the localization task in object detection, respectively? Could you give me some ideas? Thanks!!!
### Model/Dataset/Scheduler description

How can I replace the backbone with a ViT? Thanks!

### Open source status

- [ ] The model implementation is available
- [ ] The model weights are available.

### Provide useful links for...
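A hypothetical config fragment for swapping in a ViT backbone. Everything here is an assumption: the `VisionTransformer` type must already be registered as a backbone in your mmdet fork (e.g. via a custom wrapper), and the kwargs, channel counts, and checkpoint path are illustrative only, not this repo's API.

```python
# Hypothetical mmdet-style config fragment -- type names and kwargs
# are assumptions; adapt to whatever backbone class you register.
model = dict(
    backbone=dict(
        type='VisionTransformer',        # must be registered in BACKBONES
        img_size=1024,
        patch_size=16,
        embed_dims=768,
        out_indices=(2, 5, 8, 11),       # which transformer blocks to tap
        init_cfg=dict(type='Pretrained', checkpoint='...')),
    neck=dict(
        type='FPN',
        in_channels=[768, 768, 768, 768],  # ViT emits constant-width features
        out_channels=256,
        num_outs=5),
)
```

Note that a plain ViT emits single-stride token maps, so the wrapper typically has to reshape tokens back to 2-D feature maps (and often up/down-sample them) before an FPN can consume them.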
How should I visualize the CAM of the classification task and the localization task in object detection, respectively? Could you share an implementation approach? Thanks!
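One common approach is Grad-CAM: hook a backbone feature map, backpropagate a scalar from each branch (a class score for classification, a summed box regression output for localization), and weight the feature channels by their averaged gradients. The toy network below is hypothetical, standing in for a real detector just to show the mechanics:

```python
import torch
import torch.nn as nn

# Hypothetical toy net standing in for a detector's backbone + two heads.
class ToyNet(nn.Module):
    def __init__(self, num_classes=5):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)   # "backbone" feature map
        self.cls_head = nn.Linear(8, num_classes)   # classification branch
        self.reg_head = nn.Linear(8, 4)             # localization branch

    def forward(self, x):
        feat = self.conv(x)
        pooled = feat.mean(dim=(2, 3))              # global average pool
        return feat, self.cls_head(pooled), self.reg_head(pooled)

def grad_cam(feat, score):
    """Grad-CAM: weight feature channels by the gradient of a scalar score."""
    grads = torch.autograd.grad(score, feat, retain_graph=True)[0]
    weights = grads.mean(dim=(2, 3), keepdim=True)   # GAP over gradients
    cam = torch.relu((weights * feat).sum(dim=1))    # weighted sum + ReLU
    return cam / (cam.max() + 1e-8)                  # normalize to [0, 1]

model = ToyNet()
feat, cls_logits, reg_out = model(torch.randn(1, 3, 16, 16))

# One CAM per task: backprop a class score vs. a box-regression output.
cls_cam = grad_cam(feat, cls_logits[0, 2])   # CAM for class index 2
loc_cam = grad_cam(feat, reg_out[0].sum())   # CAM for the regression branch
```

Overlaying each CAM (resized to the input resolution) on the image then shows which regions drive classification vs. localization.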
Hello, I'd like to run the SODA dataset with your code. I changed the DOTAv1.0 class list in BboxToolkit to the SODA-A classes, reinstalled BboxToolkit, and then split the SODA-A images. I fed the split images into the model as the dataset, but during both training and testing the model only counts gts for the first class. What could be going on?
Below is the test mAP output: (screenshot)
Below are the category ids in the pkl file; labels for all classes are clearly present: (screenshot)
Below is the test-set log file from the image-splitting step: (screenshot)
Could you please help me look into this? Thanks!
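A quick way to narrow this down is to count ground-truth labels per class directly from the loaded annotations before training. The snippet below is a diagnostic sketch only: the dict layout mimics a typical BboxToolkit/mmdet annotation list and the class names are assumed SODA-A names, so adapt both to what your loader actually returns.

```python
from collections import Counter

# Hypothetical annotation list (adapt keys to your loader's real output).
contents = [
    {'ann': {'labels': [0, 0, 3]}},
    {'ann': {'labels': [1, 5, 5]}},
]
# Assumed (partial) SODA-A class names -- replace with your actual list.
classes = ('airplane', 'helicopter', 'small-vehicle',
           'large-vehicle', 'ship', 'container')

gt_counts = Counter(lb for item in contents for lb in item['ann']['labels'])
for idx, name in enumerate(classes):
    print(f'{name}: {gt_counts.get(idx, 0)} gts')
# If only index 0 ever has a nonzero count here, the bug is in loading or
# splitting, not in evaluation.
```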