Phil Tomson

37 comments by Phil Tomson

debugging this a bit, it calls utils.detach() and ends up at:
```
-> return tuple(detach(v) for v in h)
  /home/phil/devel/ENAS-pytorch/utils.py(131)detach()
-> return tuple(detach(v) for v in h)
> /home/phil/.virtualenvs/p3.6/lib/python3.6/site-packages/torch/tensor.py(380)__iter__()
-> ...
```
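For what it's worth, my guess at the cause (an assumption from the trace, not something I've confirmed in the source): utils.detach() presumably bottoms out with a type check against Variable, which no longer matches after the Variable/Tensor merge in PyTorch 0.4, so it falls through to iterating the tensor itself until it hits a 0-d element whose `__iter__` raises. That last step is easy to reproduce:

```python
import torch

t = torch.tensor(1.0)  # a 0-d tensor, common after the Variable/Tensor merge
iter(t)                # raises TypeError: iteration over a 0-d tensor
```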

If I'm understanding the response to this issue correctly (https://github.com/carpedm20/ENAS-pytorch/issues/22), PyTorch 0.4.1 is not supported (it needs to be 0.3.1?)

Just an FYI for folks wanting to run this code under versions of PyTorch >= 0.4.0: in trainer.py you should not use the call to utils.detach(); instead you should...
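For reference, a minimal sketch of the kind of replacement I mean (my own code, not the repo's; `detach_hidden` is a made-up name, and it assumes the hidden state is a Tensor or a nested tuple/list of Tensors):

```python
import torch

def detach_hidden(h):
    """Detach hidden state from the autograd graph
    (PyTorch >= 0.4.0, where Variable and Tensor are merged)."""
    if isinstance(h, torch.Tensor):
        return h.detach()
    # Recurse into tuples/lists of hidden states (e.g. LSTM's (h, c)).
    return tuple(detach_hidden(v) for v in h)
```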

Just to make sure I'm understanding: when you say "first slope", are you referring to the part of the graph from 0 to 60.00K on the x axis? And the 2nd...

@nkcr were you able to get better results with the parameters from the TF code?

@dukebw: You say above, "This re-training is not implemented yet." That was from March 14; has the retraining been implemented since then?

Also, re the last reply: it's not clear to me from reading the paper whether picking the best model from the samples and then retraining it happens every epoch or...

I notice in that example, though, that they specify the non-type values in the wrapping:
```
types.add_type("NonTypeParam")
     .apply(WrapNonTypeParam());
```
So they supply the values 1, 2 and 64 to NonTypeParam....
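To illustrate what I mean, here's a rough sketch of that wrapping pattern in jlcxx (the snippet above seems to have lost its angle-bracket arguments in rendering; the exact type parameters and the body of `WrapNonTypeParam` below are my assumptions, reconstructed from the values 1, 2 and 64 mentioned above):

```cpp
#include <cstdint>
#include "jlcxx/jlcxx.hpp"

// Hypothetical template with a non-type parameter, as in the example discussed.
template<typename T, T Val>
struct NonTypeParam
{
  T value() const { return Val; }
};

// Functor applied to every concrete instantiation listed in apply<...>().
struct WrapNonTypeParam
{
  template<typename TypeWrapperT>
  void operator()(TypeWrapperT&& wrapped)
  {
    typedef typename TypeWrapperT::type WrappedT;
    wrapped.method("value", &WrappedT::value);
  }
};

JLCXX_MODULE define_julia_module(jlcxx::Module& types)
{
  using jlcxx::Parametric;
  using jlcxx::TypeVar;

  // The non-type values (1, 2 and 64) are baked in here, one instantiation
  // each; the wrapper cannot synthesize new values at runtime.
  types.add_type<Parametric<TypeVar<1>, TypeVar<2>>>("NonTypeParam")
       .apply<NonTypeParam<int, 1>,
              NonTypeParam<unsigned int, 2>,
              NonTypeParam<int64_t, 64>>(WrapNonTypeParam());
}
```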

Would that be adding the following to the Make.user?

> USE_SYSTEM_LLVM = 0
> USE_LLVM_SHLIB = 0
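i.e. something like this in a Make.user at the top of the Julia source tree (my understanding of the flags, which may be off):

```make
# Make.user
USE_SYSTEM_LLVM = 0   # build the LLVM bundled with Julia, not a system copy
USE_LLVM_SHLIB = 0    # link LLVM statically instead of as a shared libLLVM
```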

Tried rebuilding Julia with those options in Make.user and oddly enough I'm able to use CLArrays, even though prior to that I had tried `Pkg.test("OpenCL")` and it segfaulted. But...