Using tf.keras with TensorFlow.jl
Hello
I want to switch from Python to Julia for my everyday research. My main concern, however, is the availability of libraries in Julia for deep-learning research.
I would like to know whether there is any possibility of using the tf.keras API with TensorFlow.jl.
Thanks in advance.
"My main concern, however, is the availability of libraries in Julia for deep-learning research."
As of version 1.4.0, the official MXNet project has incorporated this previously unofficial wrapper as its Julia API: https://github.com/dmlc/MXNet.jl
I suppose Google could do the same for this TF wrapper. Also note, e.g., Flux.jl and Knet.jl.
"I would like to know whether there is any possibility of using the tf.keras API with TensorFlow.jl."
I don't know too much about this, but at least since late February there has been a "Keras demo" included with TensorFlow.jl: https://github.com/malmaud/TensorFlow.jl/commit/50d0659a2055d946e37a7eeaadd144be818079d0#diff-5807cd28c9698f00f2f8e4f6d4ef6659
See also the TensorFlow docs on "eager mode" (and, in that commit, how to check out the branch) in relation to it:
https://www.tensorflow.org/guide/keras "tf.keras is TensorFlow's implementation of the Keras API specification. This is a high-level API to build and train models that includes first-class support for TensorFlow-specific functionality, such as eager execution"
See also my search of that repository for Keras: https://github.com/malmaud/TensorFlow.jl/search?q=Keras&unscoped_q=Keras
It turns up only ResourceApplyKerasMomentum and ResourceSparseApplyKerasMomentum in the big file.
From another issue, as of Nov 2018: TensorFlow.jl "contains approximately all the features of the python API excluding the contrib submodule." I'm not sure whether Keras support in TF is newer than that; tf.keras was usable before the TF 2.0 alpha made it an official API of TF, wasn't it?
Thank you so much for your response.
It is very good news to hear that MXNet.jl has been adopted as part of the MXNet project. Certainly, this can significantly promote the use of Julia, and I think the authors will replicate the Gluon API, so it could be used directly from MXNet.jl!
Regarding TensorFlow.jl, bear in mind that the success of the Keras API is mainly due to its simplicity: it allows rapid prototyping and provides many helper classes and predefined models (Inception, ResNet, etc.), and this has attracted a lot of contributors and researchers. It would be wonderful to see TensorFlow.jl adopted by Google in the future and extended with an implementation of the tf.keras API, but of course in pure Julia.
PS: Yes, Flux.jl and Knet.jl are promising, but in my opinion they are still at an early stage.
"bear in mind that the success of the Keras API is mainly due to its simplicity: it allows rapid prototyping and provides many helper classes and predefined models (Inception, ResNet, etc.)"
We have a Keras submodule in https://github.com/malmaud/TensorFlow.jl/blob/master/src/keras.jl. It is fairly minimal right now, but it works for the things it does; PRs to add more are welcome. It has some of the helper classes you mention. It does not have pretrained models, as we currently have no way to import weights from Python TensorFlow, only graph structure (metagraphs). That could likely change if someone implemented a loader for the file format Python TensorFlow uses to store these things; we haven't really looked into it since before the TensorFlow 1.0 days (and I personally do not have time these days).
"of course in pure Julia."
TensorFlow.jl is not pure Julia; it leverages the libtensorflow C API (that is rather the point). If we were making a pure-Julia ML framework, it would not look like TensorFlow; it would look like Flux. I would not characterize Flux (or Knet) as early stage: for most purposes they are complete. I say this as a maintainer of TensorFlow.jl: give Flux a proper chance. I'm using Flux more than TensorFlow.jl these days.
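To give a feel for what that pure-Julia style looks like, here is a minimal Flux sketch. This is only an illustration, not anything from TensorFlow.jl: it assumes a Flux version from around this time (where `Dense` takes positional in/out sizes), and the layer sizes are arbitrary; `Chain`, `Dense`, `relu`, and `softmax` are real Flux exports.

```julia
using Flux  # assumes the Flux.jl package is installed

# A Keras-Sequential-style stack of layers, written as plain Julia:
model = Chain(
    Dense(10, 5, relu),  # 10 inputs -> 5 hidden units, ReLU activation
    Dense(5, 2),         # 5 hidden units -> 2 outputs
    softmax)             # normalize outputs to probabilities

x = rand(Float32, 10)    # one dummy input sample
y = model(x)             # forward pass; sum(y) is ~1 after softmax
```

Everything here is ordinary Julia code you can differentiate and extend, which is exactly the contrast with wrapping the libtensorflow C API.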