
TensorFlow Code for paper "Efficient Neural Architecture Search via Parameter Sharing"

85 enas issues

I want to use the ENAS architecture to run inference on my own dataset, so I modified the original code to use placeholder input, e.g. ``` loss, lr, gn, tr_acc, _ = sess.run(run_ops,...
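A minimal TF1-style sketch of that placeholder setup, not the repo's actual code: `images_ph` and the stand-in `logits` below are illustrative names, and the dense layer only stands in for the fixed-arc child model you would actually build on top of the placeholder.

```python
import numpy as np
import tensorflow as tf

# Placeholder for your own images instead of the repo's built-in data pipeline.
images_ph = tf.placeholder(tf.float32, [None, 32, 32, 3], name="images")
# Stand-in for the child model; in practice, build the fixed-arc network here.
logits = tf.layers.dense(tf.layers.flatten(images_ph), 10)

with tf.Session() as sess:
  sess.run(tf.global_variables_initializer())
  batch = np.random.rand(8, 32, 32, 3).astype(np.float32)  # your own data here
  preds = sess.run(tf.argmax(logits, axis=1), feed_dict={images_ph: batch})
  print(preds)
```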

Here is my question: after training, what should the output of the NAS be? Is it the best architecture found during training, or the last architecture produced by the...

When I try to run cifar10_micro_search.sh, I get this error: File "/home/zhangjiabin/enas/src/cifar10/micro_child.py", line 602, in _enas_conv zero_init = tf.initializers.zeros(dtype=tf.float32) AttributeError: 'module' object has no attribute 'initializers' It seems to be caused...
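This error usually means the installed TensorFlow 1.x build predates the `tf.initializers` module. A possible workaround, assuming a TF 1.x setup, is the older equivalent initializer API:

```python
import tensorflow as tf

# Possible workaround for older TF 1.x builds that lack `tf.initializers`:
# replace the failing line in micro_child.py
#   zero_init = tf.initializers.zeros(dtype=tf.float32)
# with the older, equivalent API:
zero_init = tf.zeros_initializer(dtype=tf.float32)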

What does "whole_channel is True" mean? Looking forward to your reply! Thanks

Based on the output of micro_search(`num_cell=2`), for example: `[0 0 1 0 0 4 2 0]` `[0 3 0 1 1 4 0 1]` According to the explanation, each block...
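A hedged sketch of how such an arc can be decoded, assuming the convention that each block is encoded as four integers (first input index, op on it, second input index, op on it); the op ordering below is taken from the paper's micro search space and should be verified against micro_child.py before relying on the names.

```python
# Assumed op ordering -- check micro_child.py to confirm.
OPS = ["sep_conv_3x3", "sep_conv_5x5", "avg_pool_3x3", "max_pool_3x3", "identity"]

def decode_micro_arc(arc):
  """Split a flat arc into per-block ((input, op), (input, op)) tuples."""
  blocks = []
  for i in range(0, len(arc), 4):
    x_id, x_op, y_id, y_op = arc[i:i + 4]
    blocks.append(((x_id, OPS[x_op]), (y_id, OPS[y_op])))
  return blocks

print(decode_micro_arc([0, 0, 1, 0, 0, 4, 2, 0]))
# [((0, 'sep_conv_3x3'), (1, 'sep_conv_3x3')), ((0, 'identity'), (2, 'sep_conv_3x3'))]
```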

If we run the three experiments from the README:
```
# Exp. 1
./scripts/ptb_search.sh
./scripts/ptb_final.sh
# Exp. 2
./scripts/cifar10_macro_search.sh
./scripts/cifar10_macro_final.sh
# Exp. 3
./scripts/cifar10_micro_search.sh
./scripts/cifar10_micro_final.sh
```
what should we expect...

How is `child_fixed_arc` created for the final scripts? Is the controller trained in the search phase used to output `child_fixed_arc`?
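As far as I can tell, the final scripts hard-code an architecture string that was copied by hand from the search run's logs and pass it via the `--child_fixed_arc` flag; a tiny sketch of turning a logged arc into that format, with example values only:

```python
# Example values only; in practice, copy the best arc printed during the search run.
sampled_arc = [0, 3, 0, 0, 1, 0]
fixed_arc = " ".join(str(x) for x in sampled_arc)
print(fixed_arc)  # "0 3 0 0 1 0", the space-separated form passed to --child_fixed_arc
```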

Resolves https://github.com/melodyguan/enas/issues/4

Thank you for providing us with the code. I'm running cifar10_macro_search.sh. The code, data, and hyperparameters were taken as-is and not modified. Tail of cifar10_macro_search.sh (in the stdout file)...

I have read the ENAS paper, which says that weight sharing is used for efficiency. Can anyone tell me which part of the code implements the weight sharing?
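In the TF1 code, the sharing comes from the child models creating their weights with `tf.get_variable` inside fixed variable scopes, so every sampled architecture reuses the same underlying parameters. A minimal illustration of that mechanism (not the repo's actual code):

```python
import tensorflow as tf

def conv_branch(x, name):
  # Entering the same scope with AUTO_REUSE returns the same variable each time.
  with tf.variable_scope(name, reuse=tf.AUTO_REUSE):
    w = tf.get_variable("w", [3, 3, 3, 16])
    return tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME")

x = tf.placeholder(tf.float32, [None, 32, 32, 3])
arch_a = conv_branch(x, "layer_0")  # "architecture A" uses layer_0/w
arch_b = conv_branch(x, "layer_0")  # "architecture B" reuses the exact same layer_0/w
print(len(tf.trainable_variables()))  # 1 -- only one shared weight tensor exists
```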