
about the implementation of FedAvg

Open haoyu0408 opened this issue 3 years ago • 3 comments

Why does FedAvg use a simple average without weights?

haoyu0408 avatar May 23 '21 02:05 haoyu0408

Why does FedAvg use a simple average without weights?

Hi, I have the same question as you. Have you solved it?

Grassyue avatar Jul 06 '21 07:07 Grassyue

I think maybe the number of samples (train + test) is the same for each client, so the weight for each client is the same and we can directly average them. This is my viewpoint.
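A quick sanity check of the reasoning above: if every client holds the same number of samples n_i, the FedAvg weights p_i = n_i / n all collapse to 1/k, so the weighted sum is exactly a plain average. The values below are illustrative, not taken from the repo.

```python
# If all clients have n_i samples each, p_i = n_i / n = 1/k for every client,
# so weighted averaging and simple averaging coincide.
k, n_i = 5, 100          # hypothetical: 5 clients, 100 samples each
n = k * n_i              # total number of samples
p = [n_i / n for _ in range(k)]

# Every weight equals 1/k, i.e. the simple-average coefficient.
assert all(abs(p_i - 1 / k) < 1e-12 for p_i in p)
```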

GuoJingtao-1997 avatar Jul 21 '21 08:07 GuoJingtao-1997

Why does FedAvg use a simple average without weights?

Hi, I have the same question as you. Have you solved it?

For FedAvg, loss = p1*L1 + ... + pk*Lk, where pi = ni/n and Li = li/ni. I think this implementation changes it to n*loss = L1 + ... + Lk, and for that loss function the simple average is valid. If you want loss = p1*L1 + ... + pk*Lk, you need to divide each local client's loss by ni and then take the weighted sum of their parameters.
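The weighted aggregation described above can be sketched as follows. This is a minimal illustration, not the repo's actual implementation; the function and variable names (`fedavg_simple`, `fedavg_weighted`, `client_sizes`) are hypothetical, and client parameters are represented as NumPy arrays.

```python
import numpy as np

def fedavg_simple(client_params):
    """Unweighted average: only matches FedAvg when every client
    holds the same number of samples (all p_i equal 1/k)."""
    return sum(client_params) / len(client_params)

def fedavg_weighted(client_params, client_sizes):
    """Weighted average with p_i = n_i / n, as in the FedAvg paper."""
    n = sum(client_sizes)
    return sum(w * (n_i / n) for w, n_i in zip(client_params, client_sizes))

# Hypothetical example: two clients with unequal data sizes.
w1 = np.array([1.0, 2.0])
w2 = np.array([3.0, 4.0])
simple = fedavg_simple([w1, w2])                 # [2.0, 3.0]
weighted = fedavg_weighted([w1, w2], [10, 30])   # 0.25*w1 + 0.75*w2 = [2.5, 3.5]
```

With equal sample counts the two functions return the same result; they diverge exactly when the client data sizes differ, which is the situation the question is about.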

haoyu0408 avatar Sep 10 '21 15:09 haoyu0408