Federated-Learning-PyTorch
How to realize communication and "federated"?
- I wonder why I can run "federated_main.py" on only one GPU (stand-alone deployment). Since I got acc.png and loss.png, I believe the script ran successfully, is that right? Do the code and experiments involve a communication phase? Can this be called federated learning?
- If so, which lines of code implement the communication?
- How can I get specific figures for the communication time and the volume of communicated data?
Looking forward to somebody's reply. Millions of thanks!
- I believe this project mainly illustrates the effectiveness of federated learning, so you can run the code locally and watch FL's performance under different settings. Of course, in practical applications FL does need to consider the communication between the server (if any) and the participants.
- In this project, the 'communication' is simplified to the function `average_weights`: the selected participants "send" their local models to the server and "receive" back the averaged model, all within a single process.
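For reference, the aggregation step can be sketched roughly as follows (a minimal sketch of a FedAvg-style `average_weights`; the exact code in the repo may differ in detail):

```python
import copy
import torch

def average_weights(local_weights):
    """Average a list of model state_dicts key by key (FedAvg step).

    This is the whole 'communication' round in the simulation: each
    selected client 'uploads' its state_dict, and the server returns
    the element-wise mean of all of them.
    """
    w_avg = copy.deepcopy(local_weights[0])
    for key in w_avg.keys():
        for i in range(1, len(local_weights)):
            w_avg[key] += local_weights[i][key]
        w_avg[key] = torch.div(w_avg[key], len(local_weights))
    return w_avg
```

In a real deployment this function call would be replaced by network transfers (serialize each state_dict, send it to the server, deserialize, aggregate, broadcast back), which is why running it on a single GPU still counts as a faithful simulation of the algorithm, just not of the network.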
- The communication overhead is hard to state in general because it depends on the aggregation protocol and the encryption method. But you can estimate the time and data volume yourself once you fix a specific protocol and encryption scheme. In fact, communication efficiency in FL is still an active research area (as of 2022).
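As a rough lower bound, you can estimate the per-round traffic from the model size alone, ignoring protocol and encryption overhead. The parameter count and client count below are illustrative assumptions, not figures from the repo:

```python
def bytes_per_round(num_params, num_clients, bytes_per_param=4):
    """Estimate raw FedAvg payload for one communication round.

    Each selected client uploads its model and downloads the averaged
    model, so the payload crosses the link twice per client. Headers,
    serialization, and encryption overhead are ignored.
    """
    return num_params * bytes_per_param * 2 * num_clients

# Illustrative example: a model with 100,000 float32 parameters
# and 10 clients selected per round.
traffic = bytes_per_round(100_000, 10)
print(f"{traffic / 1e6:.1f} MB per round")  # 8.0 MB per round
```

For communication time you would divide this figure by your link bandwidth, or wrap the real send/receive calls with `time.perf_counter()` in a networked deployment.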
Hope this helps :)
Is the federated learning in this project just a proof of feasibility? Is no encryption algorithm used when aggregating the intermediate model parameters?
Yes, no encryption is used; it is only a single-machine simulation.
OK, thanks a lot!