Pieapp Loss Score Range
Hello,
Thank you for your contribution. I am using PieAPP as a loss function together with L1 loss. I think the optimal point we are trying to reach is 0, since a score of 0 means there is no difference from the reference image. However, for distorted images the score can fall in both the score < 0 and score > 0 ranges, so we need to use it as abs(PieAPP()). What do you think?
Thanks.
Hi @beyzacevik Thanks for noticing. I'll open a PR to fix this.
Hi, I have a question about PieAPP. I use it for image quality assessment, but for one of my (reference image, predicted image) pairs it gives a negative score (-0.0728). Q1: What are the max and min values of PieAPP for two given images? When I try (reference image, reference image) it gives about -3. Q2: Is lower better? Q3: Should I use abs() too?
Thanks.
@shshojaei
Q1: Max and min values are not defined. It's a neural-net-based metric, so with some unusual inputs the range of the final activations can be quite large.
Q2: Generally yes, lower is better, but for values close to zero the metric isn't very stable / monotonic, so you can't confidently say that 1.5 is strictly better than 2.0.
Q3: Yes, use the absolute value to avoid negative scores.
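A minimal sketch of the abs() wrapping discussed above, assuming a PyTorch workflow. The `_SignedStub` metric below is a hypothetical stand-in so the example runs self-contained; in real code you would pass `piq.PieAPP()` (which downloads pretrained weights) in its place:

```python
import torch


class AbsoluteLoss(torch.nn.Module):
    """Wrap a signed metric so its optimum is 0 regardless of the sign
    the underlying metric produces for distorted inputs."""

    def __init__(self, metric: torch.nn.Module):
        super().__init__()
        self.metric = metric

    def forward(self, prediction: torch.Tensor, reference: torch.Tensor) -> torch.Tensor:
        # abs() keeps the loss non-negative, so minimizing drives the
        # score toward 0 from both sides.
        return torch.abs(self.metric(prediction, reference))


# Hypothetical stand-in for piq.PieAPP: a metric that can go negative.
class _SignedStub(torch.nn.Module):
    def forward(self, prediction: torch.Tensor, reference: torch.Tensor) -> torch.Tensor:
        return (prediction - reference).mean()


loss_fn = AbsoluteLoss(_SignedStub())
pred = torch.zeros(1, 3, 8, 8)
ref = torch.ones(1, 3, 8, 8)
loss = loss_fn(pred, ref)  # stub metric returns -1.0, abs() gives 1.0
```

In practice you would combine this with L1 as a weighted sum, e.g. `total = l1(pred, ref) + alpha * loss_fn(pred, ref)`, where `alpha` is a weighting you tune.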
Thank you for your response.