open-metric-learning
Some changes for OML 3.0
CHANGELOG
- Moved category-wise metrics calculation logic from `EmbeddingMetrics` to functional metrics.
- Removed `fnmr@fmr` metric from `EmbeddingMetrics` because we cannot guarantee the correctness of its behaviour when a postprocessor is present, and the metric is computationally heavy.
- Reworked handling of empty bboxes (use one `None` instead of 4 `None`s).
- `calc_retrieval_metrics_on_full`, `calc_gt_mask`, `calc_mask_to_ignore`, `apply_mask_to_ignore` finally moved to tests to serve as adapters between the old and the new ways of computing metrics.
- Pipelines: a bit of refactoring and improved type hints.
- Added `show` argument to `RetrievalResults.visualise()`.
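To make the "one `None` instead of 4 `None`s" convention concrete, here is a minimal sketch of normalising a bbox into that representation. The function and type names are illustrative only, not part of OML's actual API:

```python
from typing import Optional, Tuple

TBBox = Tuple[int, int, int, int]


def normalise_bbox(x1, y1, x2, y2) -> Optional[TBBox]:
    # An absent bbox is represented by a single None instead of a
    # tuple of four Nones; a present bbox must be fully defined.
    coords = (x1, y1, x2, y2)
    if all(c is None for c in coords):
        return None
    if any(c is None for c in coords):
        raise ValueError("bbox must be fully defined or fully absent")
    return coords
```

Downstream code can then check `bbox is None` instead of inspecting four separate fields.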
@AlekseySh
> Removed `fnmr@fmr` metric from `EmbeddingMetrics` because we cannot guarantee the correctness of its behaviour when a postprocessor is present, and the metric is computationally heavy.
Is it possible to keep this metric available as an option for users who want it? It is considered the most important metric in the field of biometrics (e.g., face/fingerprint recognition), where score thresholding is often employed.

Having said that, it would also be super helpful if this metric (or any other "the-lower-the-better" metric) could be specified as `metric_for_checkpointing`, e.g. `"OVERALL/fnmr@fmr/0.001"` with the mode set to `"min"` via YAML. The mode is hard-coded as `"max"` in the current implementation:
```python
from pathlib import Path

from pytorch_lightning.callbacks import ModelCheckpoint


def parse_ckpt_callback_from_config(cfg: TCfg) -> ModelCheckpoint:
    return ModelCheckpoint(
        dirpath=Path.cwd() / "checkpoints",
        monitor=cfg["metric_for_checkpointing"],
        mode="max",  # hard-coded: "the-lower-the-better" metrics are not supported
        save_top_k=1,
        verbose=True,
        filename="best",
    )
```
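One way to support "the-lower-the-better" metrics would be to read the mode from the config with `"max"` as the fallback. The helper and the `"mode_for_checkpointing"` key below are hypothetical, sketched only to illustrate the suggestion, not OML's actual config schema:

```python
from typing import Any, Dict


def resolve_ckpt_mode(cfg: Dict[str, Any]) -> str:
    # Read an optional mode from the YAML config, defaulting to the
    # current hard-coded behaviour ("max") for backward compatibility.
    mode = cfg.get("mode_for_checkpointing", "max")
    if mode not in ("min", "max"):
        raise ValueError(f"unsupported checkpointing mode: {mode!r}")
    return mode
```

The resolved value would then be passed to `ModelCheckpoint(mode=...)` instead of the literal `"max"`.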
@deepslug okay, I got your point
Changes from this PR have been moved to other PRs:
- https://github.com/OML-Team/open-metric-learning/pull/568
- https://github.com/OML-Team/open-metric-learning/pull/567
- https://github.com/OML-Team/open-metric-learning/pull/566