how to extract the weights for each weak learner in adanet
Hello,
is there a way to extract/retrieve the weights for each weak learner (i.e., subnetwork) in adanet? For example, how can I obtain the values of w1, w2, and w3 if there are 3 weak learners in the ensemble?
Also, how can I extract each trained weak learner individually (e.g., subnetwork 1, 2, 3) instead of the ensemble as a whole?
Thanks, much appreciated!
@wangzhishi: You can get the weights w from TensorBoard, under the mixture_weight_norms section. As for extracting the individual subnetworks, that is currently not easily possible, since we only expose ensembles. You could get all the trainable variables and filter the subnetwork variables by their scope names, but that's a bit of extra work.
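For example, here is a minimal sketch of that filtering idea, reading the trained values straight from the checkpoint rather than from the live graph (the model_dir path is a placeholder and the "subnetwork" substring is an assumption; print the full variable list first and adjust the filter to the names you actually see):

```python
import tensorflow as tf

model_dir = "/tmp/adanet_model"  # hypothetical Estimator model_dir

# List every variable in the latest checkpoint and keep the ones whose
# scope name suggests they belong to a subnetwork. The "subnetwork"
# substring is an assumption; inspect the printed names and adjust.
reader = tf.train.load_checkpoint(model_dir)
for name, shape in tf.train.list_variables(model_dir):
    if "subnetwork" in name:
        print(name, shape)
        weights = reader.get_tensor(name)  # numpy array of trained values
```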
Is there anything you need in particular from the subnetwork?
@cweill I want to look at how the subnetworks evolve during the iteration process in my custom-defined weak learners. I found a way to retrieve the trained subnetworks' parameters via their variable names.
That said, I wonder if there is a way to use fixed weights for the weak learners. With learn_mixture_weights = False, I will only get equal weights, say (1/3, 1/3, 1/3). Is there a way to use fixed weights like (0.1, 0.2, 0.7) without learning them? Thanks!
@wangzhishi: Not directly. However, one option would be to create the ComplexityRegularizedEnsembler(optimizer=...) with a custom tf.train.Optimizer subclass. You could then override the Optimizer#minimize method to assign the weight values you want to the variables in var_list.
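A rough sketch of that first option, assuming TF 1.x-style tf.train optimizers (the class name is hypothetical, and it assumes var_list holds one mixture-weight variable per subnetwork in iteration order, which you should verify):

```python
import adanet
import tensorflow.compat.v1 as tf

class FixedMixtureWeightOptimizer(tf.train.GradientDescentOptimizer):
    """Hypothetical optimizer that ignores gradients and pins each
    mixture-weight variable to a fixed value."""

    def __init__(self, fixed_weights):
        super(FixedMixtureWeightOptimizer, self).__init__(learning_rate=0.0)
        self._fixed_weights = fixed_weights  # e.g. [0.1, 0.2, 0.7]

    def minimize(self, loss, global_step=None, var_list=None, **kwargs):
        # Instead of taking a gradient step, assign each variable in
        # var_list its fixed value, filled to the variable's shape.
        # NOTE: a real implementation may also need to increment
        # global_step, which minimize() normally does.
        assign_ops = [
            var.assign(tf.constant(w, dtype=var.dtype, shape=var.shape))
            for var, w in zip(var_list, self._fixed_weights)
        ]
        return tf.group(*assign_ops)

ensembler = adanet.ensemble.ComplexityRegularizedEnsembler(
    optimizer=FixedMixtureWeightOptimizer([0.1, 0.2, 0.7]))
```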
Or, even simpler: subclass ComplexityRegularizedEnsembler and override the build_train_op method to assign the weights you want to the variables in var_list.
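And a minimal sketch of that second option (the build_train_op signature below is an assumption based on adanet.ensemble.Ensembler; verify it, and the ordering of var_list, against the adanet version you're running):

```python
import adanet
import tensorflow.compat.v1 as tf

class FixedWeightEnsembler(adanet.ensemble.ComplexityRegularizedEnsembler):
    """Hypothetical ensembler that pins mixture weights instead of training them."""

    def __init__(self, fixed_weights, **kwargs):
        super(FixedWeightEnsembler, self).__init__(**kwargs)
        self._fixed_weights = fixed_weights  # e.g. [0.1, 0.2, 0.7]

    def build_train_op(self, ensemble, loss, var_list, labels, iteration_step,
                       summary, previous_ensemble):
        # Assign each mixture-weight variable its fixed value rather than
        # delegating to an optimizer. Assumes var_list holds one variable
        # per subnetwork, in iteration order.
        assign_ops = [
            var.assign(tf.constant(w, dtype=var.dtype, shape=var.shape))
            for var, w in zip(var_list, self._fixed_weights)
        ]
        return tf.group(*assign_ops)
```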
Either of those options would let you use whatever weights you want.