Fast-Certified-Robust-Training

Other distributions for IBP init

Open · nurlanov-zh opened this issue 2 years ago · 1 comment

Hi,

Have you tried other distributions for the IBP initialization? Does it make a difference? Since IBP training has a discontinuity at W_i = 0, would it make sense to initialize the weights with a Laplace distribution, W_i ~ Laplace(0, b)? Then |W_i| ~ Exponential(b^{-1}), so E|W_i| = b, and b = 2/n_i would also satisfy the same condition on E|W_i|.
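To make the proposal concrete, here is a minimal sketch of the suggested initialization. It assumes the goal is to match E|W_ij| = 2/fan_in, the expectation targeted by the Gaussian IBP initialization discussed in this repository; the function name `laplace_ibp_init` is hypothetical, not part of the codebase, and NumPy is used only to keep the sketch dependency-free.

```python
import numpy as np

def laplace_ibp_init(fan_out, fan_in, rng=None):
    """Hypothetical Laplace-based IBP initialization (sketch, not repo code).

    Draw W_ij ~ Laplace(0, b) with b = 2 / fan_in. Since |W| then follows
    Exponential(1/b), we get E|W_ij| = b = 2 / fan_in, matching the
    expectation the Gaussian IBP init targets, while the density of W_ij
    is peaked (non-smooth) at 0 rather than flat like a Gaussian.
    """
    rng = np.random.default_rng() if rng is None else rng
    b = 2.0 / fan_in
    return rng.laplace(loc=0.0, scale=b, size=(fan_out, fan_in))

# Quick sanity check: the empirical mean of |W| should be close to 2/fan_in.
W = laplace_ibp_init(2000, 1000, rng=np.random.default_rng(0))
print(np.abs(W).mean(), 2.0 / 1000)
```

Whether the different shape at W_i = 0 changes training dynamics is exactly the open question in this thread; the sketch only shows that the Laplace scale b = 2/n_i reproduces the same E|W_i|.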

Best regards, Zhakshylyk

nurlanov-zh avatar Nov 17 '22 10:11 nurlanov-zh

Hi Zhakshylyk,

I think we just assumed a normal distribution. It sounds interesting to consider other distributions. Could you please explain a bit why making the density of W_i non-smooth at 0 might help the training?

Thanks! Zhouxing

shizhouxing avatar Nov 17 '22 18:11 shizhouxing