HWade


You need to compute and apply the gradients separately, following this process (the final `apply_gradients` step completes the truncated snippet):

```python
import tensorflow as tf

opt = tf.train.AdamOptimizer(0.1)
gvs = opt.compute_gradients(logits)
# Clip each gradient elementwise to [-1, 1] before applying it.
capped_gvs = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gvs]
train_op = opt.apply_gradients(capped_gvs)
```
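To illustrate what the clipping step does without needing TensorFlow installed, here is a minimal pure-Python sketch of the same elementwise clip-by-value semantics; `clip_by_value` and the `gvs` list are hypothetical stand-ins for `tf.clip_by_value` and the output of `compute_gradients`:

```python
def clip_by_value(grad, clip_min, clip_max):
    # Elementwise clip of a flat list of gradient values,
    # mirroring the semantics of tf.clip_by_value.
    return [max(clip_min, min(clip_max, g)) for g in grad]

# Hypothetical (gradient, variable) pairs standing in for compute_gradients output.
gvs = [([2.5, -0.3], "w"), ([-1.7, 0.9], "b")]
capped_gvs = [(clip_by_value(grad, -1.0, 1.0), var) for grad, var in gvs]
print(capped_gvs)  # [([1.0, -0.3], 'w'), ([-1.0, 0.9], 'b')]
```

Values inside the range pass through unchanged; only out-of-range components are saturated at the bounds, which is why this caps exploding gradients without zeroing well-behaved ones.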