probabilistic-federated-neural-matching
question about multilayer PFNM
When I set the local networks to be multilayer in experiment.py (for example, two hidden layers with 100 neurons each) and run the script, I find that only the size of the first hidden layer of the global network is correct; the other hidden layers still have 100 neurons! I have tried to find out why but can't figure it out. I would appreciate it if someone could help me solve this problem!
Hi, I'm not sure I understand the issue fully. Can you please provide the following to help me diagnose the issue:
- Command to reproduce the experiment, e.g. `python experiment.py --args....`
- Output log file, and
- What is the expected output and what output PFNM produces
Thanks!
Hello! First, thanks for your reply. When I run single-layer PFNM, the test accuracy is quite good, but when I run multilayer PFNM, the performance is very poor. Below are answers to your three questions.
1. With two hidden layers: `python experiment.py --logdir "logs/mnist_test" --dataset "mnist" --datadir "data/mnist/" --net_config "784, 100, 100, 10" --n_nets 10 --partition "homo" --experiment "u-ensemble,pdm,pdm_iterative" --lr 0.01 --epochs 10 --reg 1e-6 --communication_rounds 1 --lr_decay 0.99 --iter_epochs 5`
2. The log file is attached.
3. We can see that the test accuracy is only 0.8345, and the second hidden layer of the global network still has 100 neurons, which suggests that layer never actually got matched! In my experiments, performance degrades rapidly as the number of hidden layers increases (e.g., with net_config "784, 100, 100, 100, 100, 100, 100, 10", the test accuracy is only about 0.1...). experiment_log-0-1.log
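A quick way to see whether deeper layers are being matched is to inspect the per-layer shapes of the global model after matching. This is only an illustrative sketch (the helper name and the list-of-arrays layout are assumptions, not the repo's actual API): PFNM-style matching should generally let each matched hidden layer grow beyond the local size, so a deeper layer that stays at exactly 100 neurons is suspicious.

```python
import numpy as np

def hidden_layer_sizes(weights):
    """Return the output width of each weight matrix, i.e. neurons per layer.

    Assumes the model is represented as a list of 2-D numpy weight
    matrices, layer by layer (a common layout in matching code);
    this helper is hypothetical, not part of the PFNM repo.
    """
    return [w.shape[1] for w in weights]

# Toy example: local shapes for net_config "784, 100, 100, 10".
local_weights = [
    np.zeros((784, 100)),  # input -> hidden 1
    np.zeros((100, 100)),  # hidden 1 -> hidden 2
    np.zeros((100, 10)),   # hidden 2 -> output
]
print(hidden_layer_sizes(local_weights))  # [100, 100, 10]

# After matching, one would expect the first hidden layer to grow
# (to the matched global size), and the deeper hidden layers to grow
# as well; per the report above, only the first layer changes while
# the second stays at its local size of 100.
```

Printing this for the returned global weights after each matching step would pinpoint which layer stops being matched.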