Sameer Wagh

55 comments of Sameer Wagh

The weight initialization makes a big difference. Biases set to 0 are fine. For the weights, ideally you would use Kaiming He initialization, but you can use a more...
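
For reference, a rough sketch of what I mean (the function and variable names here are illustrative, not Falcon's actual API; this assumes weights are generated as plain floats before fixed-point encoding):

```cpp
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

// Kaiming He initialization for a fully connected layer:
// draw each weight from N(0, sqrt(2 / fan_in)), which works well with ReLU.
std::vector<float> heInitWeights(size_t fanIn, size_t fanOut) {
    std::random_device rd;
    std::mt19937 gen(rd());
    std::normal_distribution<float> dist(0.0f, std::sqrt(2.0f / fanIn));

    std::vector<float> weights(fanIn * fanOut);
    for (auto &w : weights)
        w = dist(gen);
    return weights;   // biases can simply be initialized to 0
}
```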

I think I know what's causing this. The 32-bit space is too small for the entire training. Try setting `myType` to `uint64_t` and increasing the fixed-point precision to...
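
Both of these are compile-time switches; the snippet below is a sketch assuming they live in the globals header under the names `myType` and `FLOAT_PRECISION` (double-check the exact names and value against your checkout):

```cpp
#include <cstdint>

// Sketch of the relevant compile-time constants (names/values assumed).
typedef uint64_t myType;            // was uint32_t: 64-bit ring gives much more headroom
const int FLOAT_PRECISION = 20;     // fixed-point fractional bits; trades range vs. accuracy

// A value x is stored as round(x * 2^FLOAT_PRECISION), so the representable
// magnitude is roughly 2^(ring_bits - FLOAT_PRECISION - 1) before wrap-around.
```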

Can you print all the weights and activations for the first forward and backward pass? The weights seem to have already overflowed. With 20 bits of fractional precision, any integer...
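
For intuition, with d fractional bits a value v is stored as v · 2^d, so in a 32-bit ring with 20 fractional bits anything much beyond ±2^11 already wraps around. A tiny self-contained check (plain C++, not Falcon code):

```cpp
#include <cstdint>
#include <iostream>

int main() {
    const int precision = 20;                 // fractional bits
    const int64_t scale = 1LL << precision;

    // A value v is stored as v * 2^precision in the ring.
    for (double v : {1000.0, 5000.0}) {
        int64_t encoded = static_cast<int64_t>(v * scale);
        bool fits32 = encoded <= INT32_MAX && encoded >= INT32_MIN;
        std::cout << v << " -> " << encoded
                  << (fits32 ? " (fits in 32 bits)\n" : " (overflows 32 bits)\n");
    }
}
```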

I would recommend you don't print the entire sets. Print only the first 10 input/output values (and other values) for each layer. Right now the weights seem fine, deltaWeight...
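
Something like the helper below is enough; it assumes the values have already been decoded to floats (names are hypothetical, adapt to however the layer exposes its buffers):

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// Print only the first few entries of a (possibly huge) vector, with a label.
void printHead(const std::string &label, const std::vector<float> &v, size_t n = 10) {
    std::cout << label << ": ";
    for (size_t i = 0; i < std::min(n, v.size()); ++i)
        std::cout << v[i] << " ";
    std::cout << "\n";
}
```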

Good chance the issue is caused by non-normalized inputs. Can you try after converting the inputs to floats in the 0-1 range (by default MNIST has 0-255 range values)?
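
The conversion itself is trivial; a minimal sketch (do this before the fixed-point encoding step):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Convert raw MNIST pixels (0-255) to floats in [0, 1].
std::vector<float> normalizePixels(const std::vector<uint8_t> &raw) {
    std::vector<float> out(raw.size());
    for (size_t i = 0; i < raw.size(); ++i)
        out[i] = raw[i] / 255.0f;
    return out;
}
```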

Right, can you now print the weights/input/output of each FCLayer? Until some work is done on automating this/building more software, unfortunately we're stuck with this "looks reasonable" debugging. To...

You're right that the algorithm leaks the power-of-2 interval of the denominator. However, the important point is that this leakage is quantified (notice how the functionalities in Figure 8,...
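
Concretely, for a positive denominator x the leaked quantity is just the exponent α with 2^α ≤ x < 2^(α+1), i.e. the bit length minus one. A toy computation of that interval (plain C++, not the protocol itself):

```cpp
#include <cstdint>
#include <iostream>

// The "power-of-2 interval" of x (x > 0) is the alpha with 2^alpha <= x < 2^(alpha+1).
int powerOfTwoInterval(uint64_t x) {
    int alpha = 0;
    while (x >>= 1) ++alpha;
    return alpha;
}

int main() {
    std::cout << powerOfTwoInterval(100) << "\n";   // 6, since 64 <= 100 < 128
}
```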

There seems to be some networking issue.
- Can you see if the networking rules on the machines allow connections between them? See if you can ping one machine from...

[32000-32100](https://github.com/snwagh/falcon-public/blob/master/src/connect.cpp#L104-L105) should be a good range. One other thing that might be the issue here: since these are servers running on each party, the ports are binding. So when you...
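
If the bind is failing because a previous run left the socket in TIME_WAIT, setting `SO_REUSEADDR` before `bind` usually helps. A standard sketch of that pattern (not a patch against connect.cpp, just the usual POSIX idiom):

```cpp
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

// Minimal listener that allows immediate re-binding of a recently used port.
int bindListener(int port) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    int opt = 1;
    // SO_REUSEADDR lets the server re-bind the port even if it is in TIME_WAIT.
    setsockopt(fd, SOL_SOCKET, SO_REUSEADDR, &opt, sizeof(opt));

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(port);

    if (bind(fd, reinterpret_cast<sockaddr *>(&addr), sizeof(addr)) < 0) {
        perror("bind");
        close(fd);
        return -1;
    }
    listen(fd, 8);
    return fd;
}
```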

You'll have to download and parse the MNIST dataset yourself. There is a helper script provided in the [old SecureNN codebase](https://github.com/snwagh/securenn-public/tree/master/mnist); you can use that to transform the raw MNIST...
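
If you'd rather do it in C++ directly, the raw IDX image file is just a 16-byte big-endian header (magic 2051, count, rows, cols) followed by the pixels. A hedged sketch, not the helper script itself:

```cpp
#include <cstdint>
#include <fstream>
#include <stdexcept>
#include <string>
#include <vector>

// Read a 32-bit big-endian integer from the IDX header.
static uint32_t readBE32(std::ifstream &in) {
    unsigned char b[4];
    in.read(reinterpret_cast<char *>(b), 4);
    return (uint32_t(b[0]) << 24) | (uint32_t(b[1]) << 16) | (uint32_t(b[2]) << 8) | b[3];
}

// Parse train-images-idx3-ubyte: magic, count, rows, cols, then raw pixels.
std::vector<uint8_t> loadMnistImages(const std::string &path) {
    std::ifstream in(path, std::ios::binary);
    if (!in) throw std::runtime_error("cannot open " + path);

    uint32_t magic = readBE32(in);
    if (magic != 2051) throw std::runtime_error("not an IDX image file");
    uint32_t count = readBE32(in), rows = readBE32(in), cols = readBE32(in);

    std::vector<uint8_t> pixels(size_t(count) * rows * cols);
    in.read(reinterpret_cast<char *>(pixels.data()), pixels.size());
    return pixels;   // feed into normalization / whatever format the loader expects
}
```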