backdoor_federated_learning
A question about poison_dataset()
It seems to me that poison_dataset() doesn't actually poison the data: it just samples 64 images 200 times while excluding any images listed in "poison_images" and "poison_images_test". What does that have to do with data poisoning?
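If I'm reading it right, the behavior described above amounts to something like the following sketch. This is my own reconstruction for illustration, not the repo's actual code: the function name, parameters, and index values here are all assumptions.

```python
import random

# Hypothetical sketch of the sampling behavior in question: draw 200
# batches of 64 image indices, excluding the indices reserved as
# backdoor ("poison") images. No label flipping or pixel modification
# happens here -- the batches are simply clean samples.
def sample_clean_batches(dataset_size, poison_indices,
                         n_batches=200, batch_size=64, seed=0):
    rng = random.Random(seed)
    excluded = set(poison_indices)
    clean = [i for i in range(dataset_size) if i not in excluded]
    return [rng.sample(clean, batch_size) for _ in range(n_batches)]

# Example with assumed sizes: a 50,000-image dataset and three
# arbitrary indices standing in for the poison images.
batches = sample_clean_batches(50000, poison_indices=[330, 568, 3934])
```

As the sketch shows, nothing in this step injects a backdoor; it only guarantees that the reserved poison images never appear in the sampled batches, which is what prompted the question.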