
This is a collection of our zero-cost NAS and efficient vision applications.

21 lightweight-neural-architecture-search issues, sorted by most recently updated.

Open MPI is not supported under Windows.

Thanks for this amazing repo. I'm currently working on training an efficient low-precision backbone and deploying it on an ARM Cortex-M7 MCU with limited resources (512 kB RAM, 2 MB Flash)...

As shown in the config file `config_nas.py`, are only the V100 and T40 supported here? If I want to use 2 × 2080Ti GPUs, how can I change that?

Thanks for sharing the code of such interesting work! I have a few questions from reproducing it: 1. The [links](https://github.com/alibaba/lightweight-neural-architecture-search/blob/main/configs/classification/README.md?plain=1#L50) to the pre-trained image classification models (RXX-like.pth) seem...

The paper says we can search on a CPU with a small memory footprint, but when I run `sh tools/dist_search.sh configs/classification/deepmad_29M_224.py`, 32 GB of memory is consumed within 10 seconds and the job has...

The constraint limits the `btn` in https://github.com/alibaba/lightweight-neural-architecture-search/blob/6bf4d6949ed690b8ef59bcb843e2d36d03ebecd1/tinynas/spaces/mutator/super_res_k1kx_mutator.py#L72 and https://github.com/alibaba/lightweight-neural-architecture-search/blob/6bf4d6949ed690b8ef59bcb843e2d36d03ebecd1/tinynas/spaces/mutator/super_res_k1kx_mutator.py#L82, but this https://github.com/tinyvision/DAMO-YOLO/master/damo/base_models/backbones/nas_backbones/tinynas_nano_middle.txt seems to break that limit. May I ask why? Thanks.

If I want to use the DeepMAD method to create a ResNet with a specific parameter count, or to create my own ResNet101, how should I call the script? Is this part open source?

Hello, thanks for your great work. I ran scripts/damo-yolo/example_k1kx_small.sh twice: the two best_structure.txt results differ from each other, and both differ from your damo-yolo-s structure. I wonder if this is normal?

Hey, in the code, the `get_deepmad_forward` function computes the DeepMAD entropy as log(sqrt(c*k^2/g)) (https://github.com/alibaba/lightweight-neural-architecture-search/blob/main/tinynas/models/blocks_cnn_2d/blocks_basic.py#L444), but in the paper, the definition of the CNN entropy has no square root: ![image](https://github.com/alibaba/lightweight-neural-architecture-search/assets/57970913/c219de51-99a2-4490-9d56-4b47cf6c82fd)
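The discrepancy raised in this issue comes down to a constant factor: since log(sqrt(x)) = 0.5 * log(x), the code's version is exactly half the paper's per-layer entropy. A minimal sketch of the two variants (the function names here are illustrative, not the repository's API; the actual `get_deepmad_forward` operates on block attributes rather than scalars):

```python
import math

def entropy_as_implemented(c, k, g):
    # per-layer term as reported for get_deepmad_forward: log(sqrt(c * k^2 / g))
    # c = output channels, k = kernel size, g = groups
    return math.log(math.sqrt(c * k * k / g))

def entropy_as_in_paper(c, k, g):
    # CNN entropy definition from the paper, without the square root: log(c * k^2 / g)
    return math.log(c * k * k / g)

# log(sqrt(x)) = 0.5 * log(x), so the implemented value is always
# exactly half the paper's value for the same (c, k, g).
```

If this entropy term enters the search score only through a uniform scale, a constant factor of 1/2 rescales every candidate identically and leaves the relative ranking of architectures unchanged; whether that holds for the full DeepMAD objective depends on how the term is combined with the other score components.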