
About large models

Open xiang-xiang-zhu opened this issue 3 years ago • 1 comment

I want to know how you maintain the parameters of each large model (such as BERT) during federated learning, for example with the FedAvg algorithm. Before server-side aggregation, if you simulate federated learning locally, you need to keep many copies of the model parameters in memory.

xiang-xiang-zhu avatar Dec 08 '21 07:12 xiang-xiang-zhu
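
For context, a rough sketch of why this is a concern: holding one full copy of BERT-base parameters per locally simulated client adds up quickly. The client count and dtype below are illustrative assumptions, not FedNLP defaults:

```python
# Back-of-the-envelope memory estimate for holding several BERT-base-sized
# state dicts in memory at once (illustrative numbers only).
NUM_PARAMS = 110_000_000   # ~110M parameters in BERT-base
BYTES_PER_PARAM = 4        # float32
NUM_CLIENTS = 10           # hypothetical number of locally simulated clients

per_model_gb = NUM_PARAMS * BYTES_PER_PARAM / 1024**3
total_gb = per_model_gb * NUM_CLIENTS
print(f"~{per_model_gb:.2f} GB per model, ~{total_gb:.1f} GB for {NUM_CLIENTS} clients")
```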

From my understanding, they are held in the server's CPU memory, and aggregation is carried out on the model state dictionaries.

sauravpr avatar Dec 17 '21 07:12 sauravpr
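
For illustration, a minimal sketch of what "aggregating state dictionaries on the CPU" could look like for FedAvg. This is an assumption about the general approach, not FedNLP's actual implementation; the function and variable names are hypothetical:

```python
import copy
from typing import Dict, List

import torch


def fedavg_state_dicts(
    client_state_dicts: List[Dict[str, torch.Tensor]],
    client_weights: List[float],
) -> Dict[str, torch.Tensor]:
    """Weighted average of client state dicts, computed entirely on the CPU."""
    total = sum(client_weights)
    avg = copy.deepcopy(client_state_dicts[0])
    for key in avg:
        acc = torch.zeros_like(avg[key], dtype=torch.float32, device="cpu")
        for sd, w in zip(client_state_dicts, client_weights):
            # Keep everything on the CPU so aggregation needs no GPU memory.
            acc += (w / total) * sd[key].detach().cpu().float()
        # Cast back to the original dtype (handles integer buffers gracefully).
        avg[key] = acc.to(avg[key].dtype)
    return avg


# Hypothetical usage: each client trains locally and returns model.state_dict();
# the server averages the dicts and loads the result into the global model.
# global_model.load_state_dict(fedavg_state_dicts(collected_dicts, sample_counts))
```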