Keras API
As mentioned in discussion on Hacker News, native support for Keras's APIs (including layer support) would make implementation of adanet a lot easier for Keras-based projects (either via tf.keras or external keras).
@minimaxir Thank you for filing this feature request. We'll take a look.
/cc @jhfjhfj1
Without Keras support in the project, it may be difficult to visualize the architecture AdaNet produces, e.g. with graphviz via plot_model. Is there a replacement for this functionality?
@svjack: You can visualize the learned architecture in TensorBoard by looking at the Graph tab.
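To make that concrete, here is a minimal sketch of the current Estimator-based workflow; the feature name, model_dir path, candidate estimators, and hyperparameters are illustrative assumptions, not something prescribed in this thread. Any adanet estimator writes its graph and summaries to model_dir, which TensorBoard's Graph tab can then display:

```python
import adanet
import numpy as np
import tensorflow as tf

# Illustrative sketch: feature names, paths, and hyperparameters are assumptions.
feature_columns = [tf.feature_column.numeric_column("x", shape=[4])]
head = tf.estimator.BinaryClassHead()

def train_input_fn():
  # Toy data so the example is self-contained.
  x = np.random.rand(256, 4).astype(np.float32)
  y = (x.sum(axis=1) > 2.0).astype(np.int32)
  return tf.data.Dataset.from_tensor_slices(({"x": x}, y)).repeat().batch(32)

estimator = adanet.AutoEnsembleEstimator(
    head=head,
    candidate_pool={
        "linear": tf.estimator.LinearEstimator(
            head=head, feature_columns=feature_columns),
        "dnn": tf.estimator.DNNEstimator(
            head=head, feature_columns=feature_columns, hidden_units=[64, 32]),
    },
    max_iteration_steps=100,
    # model_dir is where the graphs and summaries that TensorBoard reads are written.
    config=tf.estimator.RunConfig(model_dir="/tmp/adanet_model"))

estimator.train(input_fn=train_input_fn, max_steps=300)
# Then point TensorBoard at the same directory and open the Graph tab:
#   tensorboard --logdir /tmp/adanet_model
```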
@cweill I suggest adding a chapter to the documentation you will release that walks through the algorithm's procedure across iterations (with name-scope control) and shows how to visualize it with TensorBoard.
Any news on this?
We're working on a new Keras API for AdaNet, but don't have a timeline on it yet. Keep an eye out for related commits over the next few months.
The timeline will look roughly like so (subject to change):
- Support `tf.keras.layers` in subnetworks (a sketch of this pattern follows the list).
- Support `tf.keras.Sequential` and Keras functional API models in `adanet.AutoEnsembleEstimator`.
- Implement `adanet.keras.Model` and `adanet.keras.AutoEnsemble` APIs to be Keras end-to-end.
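To illustrate the first item, here is a minimal sketch of how tf.keras layers can already be used inside a subnetwork via the existing `adanet.subnetwork.Builder` interface; the class name, the feature key "x", the complexity measure, and the optimizer choice are illustrative assumptions, not the final API:

```python
import adanet
import tensorflow as tf

class KerasLayerSubnetworkBuilder(adanet.subnetwork.Builder):
  """Illustrative Builder that assembles a subnetwork from tf.keras layers."""

  def __init__(self, units, name="keras_dnn"):
    self._units = units
    self._name = name

  def build_subnetwork(self, features, logits_dimension, training,
                       iteration_step, summary, previous_ensemble=None):
    # Assumes a single dense feature tensor under the key "x".
    x = features["x"]
    hidden = tf.keras.layers.Dense(self._units, activation="relu")(x)
    logits = tf.keras.layers.Dense(logits_dimension)(hidden)
    return adanet.subnetwork.Subnetwork(
        last_layer=hidden,
        logits=logits,
        # A simple stand-in complexity measure for this sketch.
        complexity=tf.sqrt(tf.cast(self._units, tf.float32)))

  def build_subnetwork_train_op(self, subnetwork, loss, var_list, labels,
                                iteration_step, summary, previous_ensemble=None):
    optimizer = tf.compat.v1.train.AdamOptimizer(learning_rate=1e-3)
    return optimizer.minimize(loss, var_list=var_list)

  @property
  def name(self):
    return self._name
```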
Please reach out here or PM me if you are interested in testing out our new Keras support as we add it.
this is amazing!
We are going for an API that is both familiar and flexible enough to define a search space of Keras models. Here's an idea of the kind of API we are aiming for:
```python
# Proposed API sketch: adanet.keras.AutoEnsemble does not exist yet.
import adanet
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

submodel1 = Sequential()
submodel1.add(Dense(8, input_dim=4, activation='relu'))
submodel1.add(Dense(3, activation='softmax'))
submodel1.compile(loss='categorical_crossentropy', optimizer='adam')

submodel2 = Sequential()
submodel2.add(Dense(16, input_dim=4, activation='relu'))
submodel2.add(Dense(3, activation='softmax'))
submodel2.compile(loss='categorical_crossentropy', optimizer='adam')

model = adanet.keras.AutoEnsemble(
    ensembler=adanet.ensemble.UniformAverage(),
    ensemble_strategy=adanet.ensemble.AllStrategy(),
    candidate_pool={
        "Model1": submodel1,
        "Model2": submodel2,
    })
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.fit(x=..., y=..., steps=100)
```
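Presumably (this is an assumption about the proposed API, mirroring standard tf.keras.Model conventions rather than anything that exists yet), evaluation and prediction would then follow the usual Keras methods:

```python
# Hypothetical usage of the proposed adanet.keras.AutoEnsemble:
metrics = model.evaluate(x=..., y=..., steps=10)
predictions = model.predict(x=...)
```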
Feedback is welcome!
@cweill this looks great, but is this already close to an alpha/beta (or merged into some branch) that I could go ahead and try out, or are you still working through the design? Any rough notebook with a broad outline of the steps would be great.
- thanks
@atinsood: We don't currently have anything you can try out, since we're still coming up with a reasonable design. But keep an eye out: we'll be pushing some experimental code to an adanet.keras subpackage soon.
I would love to use it when it comes out
When will this Keras API reach a final release? I think it's vital for the widespread adoption of AdaNet.
@shendiaomo: This is still a work in progress, but we are actively pursuing this direction. It's all very experimental at the moment, since we would like it to fit in well with the rest of the Keras and TF 2.0 ecosystem, meaning good integration with other systems like keras-tuner.
We're working on an experimental AdaNet model search API. You can have a peek under adanet/experimental. We will be demoing this Tuesday Morning at the Google Booth at NeurIPS.
@minimaxir, @shendiaomo: adanet.experimental.ModelSearch (a.k.a. adanet.ModelFlow) is @csvillalta's API for AutoML using Keras and TF 2. Please take a look at it under adanet/experimental, and let us know what you think.
Be sure to try out the ModelFlow demo.