enas
TensorFlow Code for paper "Efficient Neural Architecture Search via Parameter Sharing"
Hi, we are working on neural architecture search and are very interested in ENAS. We have downloaded your code and run it. Now we are trying to convert the code to a...
Hi, I found that the training time of each step gets slower during the training phase. It might be because some new operations are added to the graph after...
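A minimal sketch (plain TF1, not the ENAS code itself) of one way to confirm this kind of slowdown: build every op once, then finalize the graph before the training loop, so any op accidentally created inside the loop raises an error instead of silently slowing every step.

```python
import numpy as np
import tensorflow as tf

# Build all ops up front.
x = tf.placeholder(tf.float32, shape=[None, 32])
w = tf.Variable(tf.random_normal([32, 10]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w)))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.graph.finalize()  # after this, adding new ops (e.g. a tf.reduce_mean inside the loop) fails loudly
    for step in range(100):
        sess.run(train_op, feed_dict={x: np.random.rand(8, 32)})
```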
Hi, I am trying to figure out how skip connections work, because in the paper we can see that _If a layer receives skip connections from multiple layers before it, then...
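A hedged sketch (not lifted from the repo) of how outputs from several earlier layers are typically combined: concatenate them along the channel (depth) dimension, then apply a 1x1 convolution to bring the channel count back to `out_filters`. The helper name and the batch-norm/ReLU ordering are assumptions for illustration.

```python
import tensorflow as tf

def combine_skip_inputs(current, skip_inputs, out_filters):
    # current and each skip_inputs[i] have shape [N, H, W, C] with matching H, W.
    joined = tf.concat([current] + skip_inputs, axis=3)            # depth-wise concatenation
    joined = tf.layers.conv2d(joined, out_filters, kernel_size=1,  # 1x1 conv back to out_filters channels
                              padding="same", use_bias=False)
    joined = tf.layers.batch_normalization(joined, training=True)
    return tf.nn.relu(joined)
```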
I am confused about how many reduction (pooling?) layers a micro model should have. Figure 4 in the paper shows 1 reduction layer per convolutional block, whereas in the...
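A hedged sketch of one common placement scheme for reduction cells in a micro architecture: two reduction cells, at roughly one third and two thirds of the depth, with normal cells everywhere else. The exact offsets below are an assumption for illustration, not copied from micro_child.py.

```python
def reduction_positions(num_layers):
    # Place two reduction cells, one near 1/3 and one near 2/3 of the depth.
    pool_distance = num_layers // 3
    return [pool_distance, 2 * pool_distance + 1]

print(reduction_positions(6))  # -> [2, 5]
```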
Finally, I trained on my own datasets and saved .meta files. Now I have loaded the trained files and am trying to run inference on the test datasets, but I cannot find and load the last logits...
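A hedged sketch (generic TF1; the tensor names and checkpoint paths are assumptions, not taken from the repo) of restoring a graph from a .meta file, listing candidate ops to find the logits tensor, and running inference on it.

```python
import numpy as np
import tensorflow as tf

test_batch = np.zeros([8, 32, 32, 3], dtype=np.float32)  # stand-in for a real test minibatch

with tf.Session() as sess:
    saver = tf.train.import_meta_graph("outputs/model.ckpt-1000.meta")  # hypothetical path
    saver.restore(sess, tf.train.latest_checkpoint("outputs"))          # hypothetical directory
    graph = tf.get_default_graph()

    # If the logits tensor name is unknown, list candidate ops first.
    for op in graph.get_operations():
        if "logit" in op.name.lower():
            print(op.name)

    logits = graph.get_tensor_by_name("child/fc/MatMul:0")  # assumed name; use the printout above
    images = graph.get_tensor_by_name("images:0")           # assumed input placeholder name
    predictions = sess.run(tf.argmax(logits, axis=1), feed_dict={images: test_batch})
```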
It says: "We first sample several models from the trained policy π(m, θ). For each sampled model, we compute its reward on a single minibatch sampled from the validation set....
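A hedged sketch of the derivation step that sentence describes: sample several architectures from the trained controller, score each on a single validation minibatch, and keep the best one (which the paper then retrains from scratch). `controller_sample`, `valid_minibatch`, and `compute_accuracy` are hypothetical helpers standing in for the real controller and child ops in the repo.

```python
import numpy as np

def derive_best_architecture(controller_sample, valid_minibatch, compute_accuracy,
                             num_samples=10):
    best_arc, best_reward = None, -np.inf
    for _ in range(num_samples):
        arc = controller_sample()            # one architecture sampled from pi(m, theta)
        images, labels = valid_minibatch()   # a single minibatch from the validation set
        reward = compute_accuracy(arc, images, labels)
        if reward > best_reward:
            best_arc, best_reward = arc, reward
    # The single best architecture is then retrained from scratch.
    return best_arc, best_reward
```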
changes to allow running with python3
There are some parameters whose meanings I don't know, such as in micro_child.py: sync_replicas, num_aggregate, num_replicas. Are these parameters used for multiple GPUs?
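A hedged sketch (assumed, not copied from micro_child.py) of how flags like these are typically wired up in TF1: when sync_replicas is set, the optimizer is wrapped in tf.train.SyncReplicasOptimizer so gradients are aggregated across replicas before each update.

```python
import tensorflow as tf

def build_train_op(loss, lr, sync_replicas, num_aggregate, num_replicas):
    global_step = tf.train.get_or_create_global_step()
    opt = tf.train.MomentumOptimizer(lr, momentum=0.9, use_nesterov=True)
    if sync_replicas:
        opt = tf.train.SyncReplicasOptimizer(
            opt,
            replicas_to_aggregate=num_aggregate,  # gradients aggregated over this many contributions per update
            total_num_replicas=num_replicas)      # number of worker replicas
    return opt.minimize(loss, global_step=global_step)
```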
I am getting an error when trying to run ENAS with CIFAR-10. The error is: **Failed to get device properties, error code: 30**. Is that a problem with my GPU?...
Thanks to your kind answers, I'm trying to apply the source code to other images. I am trying to use 256 * 256 images with two classes by modifying the...
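A hedged sketch (not the repo's data_utils.py) of a minimal tf.data input pipeline for 256x256 RGB images with two classes; the file paths and label handling are placeholders, and the child/controller code would also need its image-size and num-classes settings changed to match.

```python
import tensorflow as tf

def make_dataset(filenames, labels, batch_size=32):
    def _parse(path, label):
        image = tf.image.decode_png(tf.read_file(path), channels=3)
        image = tf.image.resize_images(image, [256, 256])
        image = tf.cast(image, tf.float32) / 255.0
        return image, tf.cast(label, tf.int32)

    ds = tf.data.Dataset.from_tensor_slices((filenames, labels))
    ds = ds.shuffle(1000).map(_parse).batch(batch_size).repeat()
    return ds.make_one_shot_iterator().get_next()
```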