
Code for Data Poisoning Attacks Against Federated Learning Systems

6 DataPoisoning_FL issues

With the mentioned settings, malicious participants are not visible here. How do I get the output shown in the paper? ![defense_results](https://user-images.githubusercontent.com/85681572/121517521-ea09ec00-ca08-11eb-9863-fe653b9beaa8.jpg) _Originally posted by @AriesQa in https://github.com/git-disl/DataPoisoning_FL/issues/1#issuecomment-858541844_
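As a starting point for visualizing malicious participants, here is a minimal sketch of the general idea behind the paper's defense: project per-client parameter updates to 2-D with PCA, where poisoning clients tend to separate from benign ones. The data below is entirely hypothetical stand-in data (not the repo's actual gradients), and PCA is implemented via NumPy's SVD rather than any function from this repository:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in data: 40 benign client updates and 10 malicious
# ones whose mean is shifted, mimicking a label-flipping signature.
benign = rng.normal(0.0, 1.0, size=(40, 100))
malicious = rng.normal(3.0, 1.0, size=(10, 100))
updates = np.vstack([benign, malicious])

# PCA via SVD: center the updates, then project onto the top-2
# right singular vectors (the first two principal components).
centered = updates - updates.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
projected = centered @ vt[:2].T  # shape: (50, 2)

# In a plot of projected[:, 0] vs projected[:, 1], the malicious
# cluster (last 10 rows) sits far from the benign one.
```

Scatter-plotting the two columns of `projected` (e.g. with matplotlib) should reproduce the kind of separation shown in the defense figures, provided the collected updates actually carry a poisoning signal.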

Hi there, I am testing the feasibility of the label-flipping attack, but how do I see the results corresponding to Table 2 in the paper? Are they in the log? or...

Hi, could you please help with this issue? I have changed the model save/export path to: ![f](https://user-images.githubusercontent.com/101468951/218827998-91fa59c9-e4dc-4525-a6ad-7e4ea3365903.jpg) But I still get this error: What is the...

After running the command, it gives this error:

```
Traceback (most recent call last):
  File "/content/DataPoisoning_FL/defense.py", line 69, in <module>
    model_files = sorted(os.listdir(MODELS_PATH))
FileNotFoundError: [Errno 2] No such file or directory: '/absolute/path/to/models/folder/1823_models'
```

Hello, thank you for your outstanding work on the FL defense mechanisms. I'm currently reproducing your implementation to gain a deeper understanding of how these defenses...