
about result

Open liujiang137 opened this issue 6 years ago • 7 comments

I want to know whether the result score is higher when the picture is clearer. What is the range of the score interval? (When I run the code, there are many issues related to xp, so I replaced xp with np. Does this affect the results?)

liujiang137 avatar Dec 20 '18 13:12 liujiang137

Happy New Year, @liujiang137!

I want to know whether the result score is higher when the picture is clearer. What is the range of the score interval?

According to the paper, result scores lie in the range [0, 100]. ~~A higher value indicates higher quality.~~ A lower score indicates better visual image quality. Please have a look at the paper for more details.

When I run the code, there are many issues related to xp, so I replaced xp with np. Does this affect the results?

What kind of errors/warnings are you experiencing?

wunderwuzzi1975 avatar Jan 07 '19 15:01 wunderwuzzi1975
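On the xp question: in Chainer-based code, xp usually names the array module that resolves to CuPy on a GPU and to NumPy on a CPU, so swapping xp for np normally just forces the CPU path and should not change the predicted scores, only the speed. A minimal sketch of that idiom, assuming this repository follows the common Chainer pattern (the actual evaluation script may differ):

```python
import numpy as np
from chainer import cuda

# Pick the array backend: CuPy when a CUDA-enabled CuPy is available,
# otherwise plain NumPy on the CPU.
xp = cuda.cupy if cuda.available else np

# Downstream code can then call xp.asarray(...), xp.mean(...), etc. and run
# on either backend; pinning xp = np only moves the computation to the CPU.
```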

"Resulting DMOS quality ratings lie in the range of [0, 100], where a lower score indicates better visual image quality." No?

tbf, it makes more sense based on the results that I'm having here.

violivei avatar Jan 21 '19 17:01 violivei

@violivei Your results are right. I got confused by the different scores used for the different databases. On page 6, right column, last paragraph it is stated that "to make errors and gradients comparable for different databases, the MOS values of TID2013 and CLIVE and the DMOS values of CSIQ have been linearly mapped to the same range as the DMOS values in LIVE". And the DMOS values in LIVE lie in the range of [0, 100], where a lower score indicates better visual image quality. I have edited my posting above to avoid any further confusion.

wunderwuzzi1975 avatar Jan 22 '19 14:01 wunderwuzzi1975
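For anyone who wants to reproduce that normalisation, below is a hedged sketch of such a linear mapping. The function name, the exact ranges, and the example source scale are illustrative only; the paper may use a different mapping direction or range.

```python
import numpy as np

def map_to_live_dmos_range(scores, src_min, src_max, flip=False,
                           dst_min=0.0, dst_max=100.0):
    """Linearly map subjective scores onto a LIVE-DMOS-like [0, 100] scale.

    Set flip=True when the source is a MOS (higher = better) and the
    target is a DMOS-like scale (lower = better).
    """
    scores = np.asarray(scores, dtype=np.float64)
    unit = (scores - src_min) / (src_max - src_min)  # normalise to [0, 1]
    if flip:
        unit = 1.0 - unit                            # invert MOS sense to DMOS sense
    return dst_min + unit * (dst_max - dst_min)

# Illustration only: TID2013 MOS values lie roughly on a 0-9 scale.
print(map_to_live_dmos_range([0.5, 4.5, 8.5], src_min=0, src_max=9, flip=True))
```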

@wunderwuzzi1975 [attached test images: 1.jpg and 2.jpg]

The results I get are:

Patchwise: the 1st result is for 1.jpg, the 2nd result is for 2.jpg [screenshot of patchwise scores]

Weighted: the 1st result is for 1.jpg, the 2nd result is for 2.jpg [screenshot of weighted scores]

chungyau97 avatar Apr 01 '19 05:04 chungyau97

@wunderwuzzi1975 Excuse me, when I use the pretrained model (the LIVE dataset pretrained model) to test some pictures, the model returns 0.0034 or -0.003 (numbers very close to zero). Could you tell me what the problem is? Thanks.

littleSpongebob avatar Jun 11 '19 14:06 littleSpongebob
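Scores that sit near zero can be a sign that the pretrained weights were never actually applied, leaving the network randomly initialised. A hedged sanity-check sketch is below; the model class name and the model file path are assumptions about this repository, so adapt them to the actual evaluation script.

```python
import numpy as np
from chainer import serializers

model = Model()  # the network class defined in the repo (assumed name)

def param_norm(m):
    # Sum of L2 norms over all parameters that are already initialised.
    return sum(float(np.linalg.norm(p.data)) for p in m.params() if p.data is not None)

before = param_norm(model)
serializers.load_hdf5('models/nr_live_patchwise.model', model)  # assumed path/format
after = param_norm(model)

# If the norm did not change, the weights were not loaded, and a randomly
# initialised network will typically produce scores close to zero.
print('weights changed:', before != after)
```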

also got negative results similar to @littleSpongebob

ladevan avatar Sep 18 '20 21:09 ladevan

@chungyau97, my results are the same as yours, but I don't understand: the quality of the 1st image should be better, so why is its score higher?

Ffmydy avatar Nov 16 '20 01:11 Ffmydy