Deniz Oktay
TODO on line 716 of moe/optimal_learning/python/python_version/expected_improvement.py
moe/tests/optimal_learning/python/python_version/expected_improvement_test.py: test_multistart_monte_carlo_expected_improvement_optimization only passes for a specific seed, likely because very few MC iterations are used.
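As a hedged illustration of why that test may be seed-sensitive (plain Python, not MOE's actual code): a Monte Carlo estimate of an expected-improvement-style quantity is noisy at low iteration counts, so a tight test tolerance can hold under one seed and fail under another; raising the iteration count shrinks the spread roughly as 1/sqrt(n).

```python
import random
import statistics

def mc_expected_improvement(best, mu, sigma, num_mc, rng):
    """Naive MC estimate of E[max(best - X, 0)] for X ~ N(mu, sigma).
    Hypothetical stand-in, not MOE's implementation."""
    total = 0.0
    for _ in range(num_mc):
        x = rng.gauss(mu, sigma)
        total += max(best - x, 0.0)
    return total / num_mc

rng = random.Random(0)
# Spread of repeated estimates shrinks with more MC iterations:
few = [mc_expected_improvement(0.0, 0.0, 1.0, 50, rng) for _ in range(20)]
many = [mc_expected_improvement(0.0, 0.0, 1.0, 5000, rng) for _ in range(20)]
assert statistics.stdev(many) < statistics.stdev(few)
```

With a fixed seed the low-iteration estimates happen to land inside the tolerance; under other seeds they may not, which matches the symptom described above.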
Relevant file: moe/optimal_learning/python/python_version/optimization.py, the optimize() method of LBFGSBOptimizer. If you look at multistart_hyperparameter_optimization, for example, it can pass back a status dict, and in the REST interface that status dict is written...
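A minimal sketch of that status-dict pattern (the function and key names here are assumptions for illustration, not MOE's actual API): the optimizer fills in a caller-supplied dict, and the REST layer serializes it into the response.

```python
# Hypothetical sketch of an optimize() that reports status via a dict.
def optimize(objective, x0, status=None):
    # Stand-in for an L-BFGS-B run; pretend it converged in 12 iterations.
    best_x, converged, num_iterations = x0, True, 12
    if status is not None:
        # The caller (e.g. a REST view) can write this dict into its response.
        status["optimizer_success"] = converged
        status["num_iterations"] = num_iterations
    return best_x

status = {}
result = optimize(lambda x: x * x, 1.5, status=status)
assert status["optimizer_success"] is True
```

The design choice is that the return value stays a plain point while diagnostics travel out-of-band, so callers that do not care about status pay nothing.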
Quoting eliu: it would be nice to have some docs (could even be in a separate document somewhere) explaining where 1e-5 and 2000*u.size came from. For example, the mvndst docs recommend 1000 \*...
Hi all, I was wondering whether it is possible to do selective activation checkpointing with LayerNormMLP, where we only recompute FFN1 and not FFN2, and therefore do not have to save...
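To make the idea concrete, here is a conceptual sketch in plain Python (not Transformer Engine's API; ffn1/ffn2 are toy stand-ins): in the forward pass only FFN1's input is checkpointed, its activation is dropped, and FFN1 alone is recomputed during the backward pass, while FFN2 is never rerun.

```python
# Toy stand-ins for the two MLP projections (assumptions, not real layers).
def ffn1(x):
    return x * 2.0  # first projection

def ffn2(h):
    return h + 1.0  # second projection

def forward_selective(x):
    h = ffn1(x)          # computed, but its output is NOT saved
    y = ffn2(h)
    saved = {"x": x}     # only FFN1's input is checkpointed
    return y, saved

def backward_selective(saved):
    # Recompute FFN1 only; FFN2 is not recomputed in this scheme.
    return ffn1(saved["x"])

y, saved = forward_selective(3.0)
assert y == 7.0
assert backward_selective(saved) == 6.0
```

The trade-off being asked about is exactly this: recompute the cheaper/larger-activation half (FFN1) to avoid storing its output, without paying to rerun FFN2.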
Hi all, Here is a proposed patch for something that we ran into internally. I have a note on testing below. Let me know if there is anything I can...