Peyton
## Motivation
1. Support the unroll method in the search phase of DARTS.
2. Support DDP training.
3. Add docstrings for the DARTS algorithm.

Results:
Docstring coverage:
TestCase coverage:
## Motivation
- [x] Move the `build_arch_param` function from `DiffMutableModule` (including `DiffMutableOP`, `DiffChoiceRoute` and `GumbelChoiceRoute`) to `DiffModuleMutator`.
- [x] Rename `cls.data` to `cls.structures`. For more information see: https://github.com/open-mmlab/mmclassification/pull/941

## Modification
-...
@JiahuiYu Thank you for your great work. I have some questions about training slimmable networks with mixed precision. How do I use mixed precision when training a slimmable network? How to...
Thank you for your great work. I have a question about the calculation of channel importance. In Once-for-All, importance is computed along the input dimension: https://github.com/mit-han-lab/once-for-all/blob/cfa0722d57e3a2391eb36b8cf613dd17ff7a32ae/ofa/imagenet_classification/elastic_nn/modules/dynamic_layers.py#L263 But...
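For context, a minimal sketch of the kind of channel-importance computation being asked about: score each *input* channel by the L1 norm of the conv weight over all other axes, then sort. This is an illustration of the idea, not the exact OFA code (the linked line sums over different axes for its own layer layout).

```python
import numpy as np

def input_channel_importance(weight: np.ndarray) -> np.ndarray:
    """L1 importance per input channel.

    `weight` has shape (out_channels, in_channels, kH, kW); summing the
    absolute values over every axis except axis 1 leaves one score per
    input channel.
    """
    return np.abs(weight).sum(axis=(0, 2, 3))

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4, 3, 3))       # hypothetical conv weight
imp = input_channel_importance(w)       # shape (4,)
order = np.argsort(-imp)                # most important input channels first
print(imp.shape, order)
```

Sorting channels this way lets a dynamic layer keep the top-k input channels when the width is shrunk.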
Thank you for your great work! I have some questions about the shortcut design of the FFN in AutoFormer. https://github.com/microsoft/Cream/blob/83a154beb6f85dd6141853b4b7c0738eeec628ba/AutoFormer/model/supernet_transformer.py#L245-L246 To match the dimension setting of the FFN, `sample_embed_dim` should be the...
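To make the dimension constraint concrete: with a residual connection y = x + FFN(x), the FFN must project back to the same embedding width it received, whatever hidden width the supernet samples. A minimal sketch (all names here are illustrative, not AutoFormer's actual code):

```python
import numpy as np

def ffn_with_shortcut(x, w1, w2):
    # Expand to the (sampled) hidden dim, apply ReLU, project back to
    # embed_dim; the shortcut `x +` only type-checks because w2's output
    # width equals x's last dimension.
    h = np.maximum(x @ w1, 0.0)
    return x + h @ w2

embed_dim, hidden_dim, tokens = 4, 16, 3   # hypothetical sampled sizes
rng = np.random.default_rng(0)
x  = rng.normal(size=(tokens, embed_dim))
w1 = rng.normal(size=(embed_dim, hidden_dim))
w2 = rng.normal(size=(hidden_dim, embed_dim))
y = ffn_with_shortcut(x, w1, w2)
print(y.shape)  # same shape as x, so the shortcut is well-defined
```

If `w2` projected to anything other than `embed_dim`, the `x + ...` addition would fail, which is why the output dimension is tied to the sampled embedding dimension.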
First of all, thank you for your contribution. I have several questions about how your project works. When testing, I pass in an image, then apply Gaussian blurring, and...
After running:
```
cd ranking_nats
python get_model_score_nats.py
```
I got:
```
kendall tau begin
BossNAS: KendalltauResult(correlation=-0.534180602248828, pvalue=0.0)
(-0.7180607093955225, 0.0)
SpearmanrResult(correlation=-0.7341493538551311, pvalue=0.0)
```
Another paper, "Evaluating the Search Phase of Neural Architecture Search", tested FairNAS on NAS-Bench-101 but got a Kendall Tau of -0.23. FairNAS uses 13 models to evaluate the rank and...
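For reference, both rank-correlation metrics being compared above can be reproduced with `scipy.stats`; a small sketch with hypothetical predicted scores and ground-truth accuracies for the same set of architectures:

```python
import numpy as np
from scipy import stats

# Hypothetical scores for five architectures (illustrative values only).
predicted    = np.array([0.1, 0.4, 0.35, 0.8, 0.7])
ground_truth = np.array([0.2, 0.3, 0.5,  0.9, 0.6])

# Kendall tau counts concordant vs. discordant pairs; Spearman rho is the
# Pearson correlation of the ranks. Both lie in [-1, 1].
tau, tau_p = stats.kendalltau(predicted, ground_truth)
rho, rho_p = stats.spearmanr(predicted, ground_truth)
print(f"Kendall tau  = {tau:.3f} (p = {tau_p:.3f})")
print(f"Spearman rho = {rho:.3f} (p = {rho_p:.3f})")
```

A negative tau, as reported for FairNAS on NAS-Bench-101, means the predictor ranks architectures in roughly the opposite order from their true performance.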
I wonder why the image size for CIFAR is 224. Is this just for convenience?
## Description
Modified based on the open-mmlab/mmrazor/dev-1.x and wutongshenqiu/mmrazor/bignas branches. This PR includes the code for BigNAS.

## Modification
1. Add dynamic ops, including DynamicPatchEmbed, DynamicRelativePosition, DynamicLayerNorm, DynamicMultiheadAttention.
2. Add corresponding mixins...