data-free topic: list of data-free repositories
Data-Free-Adversarial-Distillation (95 stars, 18 forks)
    Code and pretrained models for the paper "Data-Free Adversarial Distillation".
DIODE (60 stars, 6 forks)
    Official PyTorch implementation of "Data-free Knowledge Distillation for Object Detection" (WACV 2021).
CMI (65 stars, 16 forks)
    [IJCAI 2021] Contrastive Model Inversion for Data-Free Knowledge Distillation.
ABD (21 stars, 1 fork)
    [ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers.
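The repositories above are variants of one common scheme: data-free knowledge distillation, where a generator synthesizes inputs that maximize the teacher-student discrepancy while the student updates to minimize it, so no real training data is needed. A minimal scalar sketch of that min-max loop follows; the linear teacher, single-parameter student, and learning rate are illustrative stand-ins and not code from any of these repositories.

```python
# Illustrative data-free distillation loop with scalar models (assumption:
# teacher(x) = 3x is a fixed black box we want the student to match).
def teacher(x):
    return 3.0 * x

class Student:
    """Student model with a single trainable weight w."""
    def __init__(self):
        self.w = 0.0
    def __call__(self, x):
        return self.w * x

class Generator:
    """Generator that emits one synthetic input z directly."""
    def __init__(self):
        self.z = 1.0
    def sample(self):
        return self.z

def distill_step(student, gen, lr=0.05):
    # Discrepancy on a synthetic input: D = (teacher(x) - student(x))^2.
    x = gen.sample()
    err = teacher(x) - student(x)
    # Generator ASCENDS the discrepancy (adversarial step):
    # with teacher slope 3, dD/dz = 2 * err * (3 - w).
    gen.z += lr * 2 * err * (3.0 - student.w)
    # Student DESCENDS the discrepancy on the fresh synthetic input:
    # dD/dw = -2 * err * x.
    x = gen.sample()
    err = teacher(x) - student(x)
    student.w += lr * 2 * err * x

s, g = Student(), Generator()
for _ in range(200):
    distill_step(s, g)
print(round(s.w, 2))  # → 3.0 (student recovers the teacher without real data)
```

The generator keeps pushing toward inputs where student and teacher disagree most, which is exactly what forces the student weight toward the teacher's slope.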