Add Flower Baseline: FedGen
Paper
Zhu, Z., Hong, J., & Zhou, J. (2021). Data-Free Knowledge Distillation for Heterogeneous Federated Learning.
Link
https://arxiv.org/abs/2105.10056
Motivation
The paper proposes a data-free knowledge-distillation approach to heterogeneous federated learning: the server learns a lightweight generator that ensembles user information without access to any data, then sends it to the users, who use the learned knowledge as an inductive bias to regulate local training.
The method achieves better generalization performance than state-of-the-art baselines.
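To make the core idea concrete, here is a minimal NumPy sketch of the FedGen-style regularizer: a lightweight conditional generator produces synthetic feature vectors for sampled labels, and the client adds a cross-entropy term on those generated features to its local loss. All names, dimensions, and the random generator/classifier weights are illustrative assumptions, not the paper's actual architecture or loss.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLASSES, LATENT_DIM, FEATURE_DIM = 3, 4, 5

# Hypothetical lightweight generator: maps (noise, one-hot label) -> feature
# vector. In the paper the server trains it to distill the ensemble of user
# models; here its weights are random purely for illustration.
W_gen = rng.normal(size=(LATENT_DIM + NUM_CLASSES, FEATURE_DIM))

def generate_features(labels: np.ndarray) -> np.ndarray:
    """Data-free sampling: draw noise, condition on labels, emit features."""
    noise = rng.normal(size=(len(labels), LATENT_DIM))
    onehot = np.eye(NUM_CLASSES)[labels]
    return np.concatenate([noise, onehot], axis=1) @ W_gen

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical client classifier head operating on the shared feature space.
W_clf = rng.normal(size=(FEATURE_DIM, NUM_CLASSES))

def fedgen_regularizer(labels: np.ndarray) -> float:
    """Cross-entropy of the client's predictions on generated features
    against the sampled labels -- the 'inductive bias' term added to the
    client's local objective (a sketch, not the paper's exact loss)."""
    feats = generate_features(labels)
    probs = softmax(feats @ W_clf)
    return float(-np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean())

labels = rng.integers(0, NUM_CLASSES, size=8)
reg = fedgen_regularizer(labels)
print(f"regularization term: {reg:.4f}")
```

In an actual baseline this term would be computed with the server-trained generator and added, with a weighting coefficient, to each client's standard training loss.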
Implementation
To implement this baseline, it is recommended to complete the following items in this order:
For first time contributors
- [x] Read the first contribution doc
- [x] Complete the Flower tutorial
- [x] Read the Flower Baselines docs to get an overview
Prepare - understand the scope
- [ ] Read the paper linked above
- [ ] Decide which experiments you'd like to reproduce. The more the better!
- [ ] Follow the steps outlined in Add a new Flower Baseline.
- [ ] You can use other baselines that the community has merged following those steps as a reference.
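For orientation, a merged baseline directory typically looks roughly like the sketch below. The file names are illustrative assumptions; follow the Add a new Flower Baseline guide for the authoritative layout.

```
baselines/fedgen/
├── README.md
├── pyproject.toml          # dependencies and tool configuration
└── fedgen/
    ├── __init__.py
    ├── main.py             # entry point that launches the experiment
    ├── client.py           # client-side local training
    ├── server.py           # server-side logic (e.g. the generator)
    ├── models.py           # model and generator definitions
    ├── dataset.py          # data loading and partitioning
    └── conf/               # configuration files
        └── base.yaml
```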
Verify your implementation
- [ ] Follow the steps indicated in the EXTENDED_README.md that was created in your baseline directory
- [ ] Ensure your code reproduces the results for the experiments you chose
- [ ] Ensure your README.md is ready to be followed by someone who is not familiar with your code. Are all step-by-step instructions clear?
- [ ] Ensure the formatting and typing tests for your baseline run without errors.
- [ ] Clone your repo into a new directory, follow the guide in your own README.md, and verify that everything runs.