
How to customize the federated learning algorithm?

Open Alan-JW opened this issue 2 years ago • 6 comments

Hi @chaoyanghe, if I want to customize the federated learning algorithm in the simulated version instead of using the existing algorithms like FedAvg, FedNova, etc., what should I do? Thanks for your answer!

Alan-JW commented Aug 24 '22 13:08

@Alan-JW Here is a blog introducing our customization APIs: https://medium.com/@FedML/fedml-releases-simple-and-flexible-apis-boosting-innovation-in-algorithm-and-system-optimization-b21c2f4b88c8

You can also find an example at: https://github.com/FedML-AI/FedML/tree/master/python/examples/cross_silo/mpi_customized_fedavg_mnist_lr_example

And many examples using customization APIs at: https://github.com/FedML-AI/FedML/tree/master/python/app

If a customized trainer and aggregator cannot meet your requirements, you can use the FedML Flow API. Here is an example: https://github.com/FedML-AI/FedML/blob/master/python/fedml/core/distributed/flow/test_fedml_flow.py
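To give a rough sense of what a customized aggregator overrides, here is a generic FedAvg-style aggregation step in plain Python. This is an illustrative sketch only, not FedML's actual trainer/aggregator signatures; see the linked blog post and examples for the real APIs. Parameters are represented as plain dicts of float lists rather than framework tensors.

```python
# Toy sketch of the sample-weighted averaging at the core of FedAvg,
# which a customized aggregator would typically reimplement.

def aggregate(client_updates):
    """Weighted average of client model parameters.

    client_updates: list of (num_samples, params) tuples, where params
    maps a parameter name to a flat list of floats.
    """
    total = sum(n for n, _ in client_updates)
    keys = client_updates[0][1].keys()
    averaged = {}
    for k in keys:
        dim = len(client_updates[0][1][k])
        averaged[k] = [
            sum(n / total * params[k][i] for n, params in client_updates)
            for i in range(dim)
        ]
    return averaged
```

A customized algorithm (e.g. a FedNova-style scheme) would mainly replace the weighting logic in this step, which is why subclassing the trainer/aggregator is usually enough.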

chaoyanghe commented Aug 24 '22 16:08

Thanks, I got it!

Alan-JW commented Aug 24 '22 16:08

@chaoyanghe How do you author new algorithms to be cross-compiled to different platforms, e.g. Android and iOS?

The blog post seems to indicate you can subclass FedMLExecutor and write arbitrary Python code to be executed on the nodes. Presumably this works in situations where each node is capable of running arbitrary Python code.

I assume this is not the case on mobile. The diagram in the blog post indicates the on-device training engine is PyTorch or TensorFlow. So is the idea to use Python code to construct a computation graph using either PyTorch or TensorFlow and then export it to a PyTorch/TensorFlow format which can be executed using the corresponding engine?

How does one author FedML programs that can be compiled into PyTorch/TensorFlow graphs?

How does MobileNN fit in here? Is MobileNN an alternative to using PyTorch/TensorFlow as an engine? Or is MobileNN the actual engine?

jlewi commented Aug 24 '22 18:08

@jlewi this is a great question. You can find our engine architecture here: https://github.com/FedML-AI/FedML/tree/master/android.

For Android, we've developed our engine on MNN and PyTorch Mobile. MobileNN is an adaptor that adapts our APIs to different mobile engines. Built on top of MobileNN, the Java SDK is the key part handling FL local training, the related communication protocol, etc.

Previously, we only provided the Android SDK. Given that many people have asked for details on this part, we plan to release all the source code soon.
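The adaptor idea described above can be sketched as follows. This is a hypothetical Python analogue of the pattern (the actual MobileNN layer is native code and its class names differ): one uniform training interface on top, multiple mobile engines plugged in underneath.

```python
# Hypothetical sketch of the adaptor pattern: the FL SDK programs
# against one interface, and each mobile engine provides a backend.
from abc import ABC, abstractmethod


class MobileEngine(ABC):
    """Uniform interface the FL SDK programs against."""

    @abstractmethod
    def train_one_epoch(self, model, data):
        ...


class MNNBackend(MobileEngine):
    def train_one_epoch(self, model, data):
        return f"MNN trained {model} on {len(data)} samples"


class PyTorchMobileBackend(MobileEngine):
    def train_one_epoch(self, model, data):
        return f"PyTorch Mobile trained {model} on {len(data)} samples"


def local_training_round(engine: MobileEngine, model, data):
    # The FL logic only sees the MobileEngine interface, so swapping
    # MNN for PyTorch Mobile requires no change to the training loop.
    return engine.train_one_epoch(model, data)
```

This is why the answer above distinguishes MobileNN from the engines: it is the adaptor layer, while MNN and PyTorch Mobile do the actual tensor computation.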

chaoyanghe commented Aug 25 '22 00:08

So is the idea to use Python code to construct a computation graph using either PyTorch or TensorFlow and then export it to a PyTorch/TensorFlow format which can be executed using the corresponding engine?

Yes. We allow users to define the model with Python code and distribute the computational graph to mobile devices. This way, people don't need to handle mobile programming (Java, NDK, C++, etc.).
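To make that workflow concrete, here is a toy, self-contained illustration of the "define in Python, export a graph, execute with a separate engine" idea. The real pipeline would serialize with something like TorchScript or a TensorFlow SavedModel rather than this hand-rolled JSON format; the point is only that the device-side engine needs the serialized graph, not the Python model code.

```python
import json

# Toy illustration: a model defined in Python is exported as a
# serialized computation graph, which a minimal "engine" (standing in
# for the on-device runtime) executes without any Python model code.


def export_linear_model(weights, bias):
    """Serialize y = w . x + b as a tiny JSON graph."""
    graph = {"op": "linear", "weights": weights, "bias": bias}
    return json.dumps(graph)


def engine_run(serialized_graph, x):
    """Device-side engine: interprets the graph, no model code needed."""
    g = json.loads(serialized_graph)
    assert g["op"] == "linear"
    return sum(w * xi for w, xi in zip(g["weights"], x)) + g["bias"]
```

In the real system the serialized artifact is produced once on the Python side and shipped to every device, which is what lets mobile clients stay free of Python.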

chaoyanghe commented Aug 25 '22 00:08

So is the idea to use Python code to construct a computation graph using either PyTorch or TensorFlow and then export it to a PyTorch/TensorFlow format which can be executed using the corresponding engine?

Yes. We allow users to define the model with Python code and distribute the computational graph to mobile devices. This way, people don't need to handle mobile programming (Java, NDK, C++, etc.).

How, exactly? What is the process of adapting a model in Python to Android/iOS like? Could you provide more details?

Thanks.

SichangHe commented Mar 20 '23 22:03