
About class weight

Open huytuong010101 opened this issue 3 years ago • 6 comments

Hi @CauchyComplete again, have a nice day! In your paper and in your training code, I see that the tampered class gets five times more weight (0.5 / 2.5 in your code). I wonder whether this depends on the dataset (the numbers of tampered and authentic images)? If it does, how can I calculate this ratio? Thank you for your reply <3

huytuong010101 avatar Dec 13 '21 06:12 huytuong010101

Hi, I chose the class weights based on the numbers of authentic and tampered pixels across the datasets. To be specific, # auth pixels : # tamp pixels = tamp class weight : auth class weight. But you don't have to follow this protocol. It's up to you.
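The rule above can be sketched in a few lines of Python. Note this is a hypothetical helper for illustration, not code from the CAT-Net repo; the `weight_sum` scale factor is an assumption chosen so that a 5:1 authentic-to-tampered pixel ratio reproduces the 0.5 / 2.5 pair mentioned in the question.

```python
import numpy as np

def class_weights_from_masks(masks, weight_sum=3.0):
    """Derive (authentic, tampered) cross-entropy class weights from
    binary ground-truth masks (0 = authentic, 1 = tampered).

    Implements the inverse-frequency rule:
        # auth pixels : # tamp pixels = tamp weight : auth weight.
    `weight_sum` only fixes the overall scale of the two weights;
    3.0 yields (0.5, 2.5) when authentic pixels outnumber tampered
    pixels 5:1.
    """
    tamp = sum(int((m == 1).sum()) for m in masks)
    auth = sum(int((m == 0).sum()) for m in masks)
    total = auth + tamp
    # Each class weight is proportional to the OTHER class's pixel count.
    auth_weight = weight_sum * tamp / total
    tamp_weight = weight_sum * auth / total
    return auth_weight, tamp_weight
```

For example, a dataset whose masks contain 500 authentic and 100 tampered pixels (a 5:1 ratio) gives weights (0.5, 2.5); only the ratio matters for the loss, the absolute scale is a free choice.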

CauchyComplete avatar Dec 14 '21 05:12 CauchyComplete

> Hi, I chose the class weights by the number of authentic and tampered pixels throughout the datasets. To be specific, # auth pixels : # tamp pixels = tamp class weight : auth class weight. But you don't have to follow this protocol. It's up to you.

Thank you! Do you have plans to publish the code used to generate the custom dataset?

huytuong010101 avatar Dec 16 '21 03:12 huytuong010101

Okay, I would like to support your work of implementing CAT-Net in TensorFlow. I will upload the tampCOCO and JPEG RAISE datasets to Baidu Cloud tomorrow. I cannot read Chinese, but Baidu Cloud gives me 105 GB of storage, so I may use it; as far as I know, no English-language drive offers that much. If I fail to upload them, I'll send them to you personally via e-mail or something :)

CauchyComplete avatar Dec 16 '21 12:12 CauchyComplete

> Okay, I would like to support your work of implementing CAT-Net with Tensorflow. I will upload the tampCOCO and JPEG RAISE datasets tomorrow to Baidu cloud. I cannot read Chinese but Baidu cloud gave me 105GB of storage so I may use it. Other English-supporting drives do not offer that much storage as far as I know. If I fail to upload them, I'll personally send you via e-mail or something :)

I'm really happy to hear that. Thank you so much <3

huytuong010101 avatar Dec 17 '21 08:12 huytuong010101

@huytuong010101 I've just uploaded all the custom datasets used in the paper to Google Drive. The datasets exceeded 105 GB, so I couldn't upload them to Baiduyun. Fortunately, I found that my university account gives me unlimited Google Drive storage, so I uploaded them there. Refer to README.md (the front page of this repo). Also, once you finish the implementation, I kindly recommend releasing the code publicly.

CauchyComplete avatar Dec 17 '21 10:12 CauchyComplete

Yes, thank you so much, it's really helpful.

huytuong010101 avatar Dec 18 '21 10:12 huytuong010101