Reworked Layers Phase 1
This is the first set of model layers, fairly simple ones without back propagation.
I have also revamped `TestSession`, `GraphTestSession`, and `EagerTestSession` to use lambdas for operations common to the `Graph` and `Eager` modes. This will need to be updated with #272, "Tensor.dataToString", when that is merged.
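As a rough illustration of the lambda approach (a minimal sketch with hypothetical names like `DualModeSketch` and `forEachMode`, not the actual `TestSession` API), the same test body can be run once against a `Graph` and once against an `EagerSession`:

```java
import java.util.function.BiConsumer;

import org.tensorflow.EagerSession;
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TInt32;

public class DualModeSketch {

  /** Runs the same test body once in Graph mode and once in Eager mode. */
  static void forEachMode(BiConsumer<Ops, String> testBody) {
    try (Graph graph = new Graph()) {
      testBody.accept(Ops.create(graph), "graph");
    }
    try (EagerSession eager = EagerSession.create()) {
      testBody.accept(Ops.create(eager), "eager");
    }
  }

  public static void main(String[] args) {
    forEachMode((tf, mode) -> {
      // The op definitions inside the lambda are shared by both execution modes.
      Operand<TInt32> sum = tf.math.add(tf.constant(2), tf.constant(3));
      System.out.println("Built add op in " + mode + " mode");
    });
  }
}
```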
This PR supersedes #65, "Adding layers (based on keras) supporting multiple outputs".
This PR is dependent on #191, "TensorFormat enum". I have temporarily created a copy of `TensorFormat` until #191 is merged.
The layers framework supports multiple inputs and outputs, but some layers are restricted to a single input/output by design (e.g. `Dense`).
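To give a feel for that design, here is an illustrative sketch only (the `MultiIOLayer` and `SingleIOLayer` names are hypothetical, not the PR's actual classes): the general contract accepts and returns lists of operands, while a Dense-style layer narrows it to exactly one input and one output.

```java
import java.util.Collections;
import java.util.List;

import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.types.family.TNumber;

/** General contract: any number of inputs in, any number of outputs out. */
interface MultiIOLayer<T extends TNumber> {
  List<Operand<T>> call(Ops tf, List<Operand<T>> inputs);
}

/** A Dense-style layer narrows the contract to exactly one input and one output. */
abstract class SingleIOLayer<T extends TNumber> implements MultiIOLayer<T> {

  protected abstract Operand<T> call(Ops tf, Operand<T> input);

  @Override
  public List<Operand<T>> call(Ops tf, List<Operand<T>> inputs) {
    if (inputs.size() != 1) {
      throw new IllegalArgumentException("this layer accepts exactly one input");
    }
    return Collections.singletonList(call(tf, inputs.get(0)));
  }
}
```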
This revision moves the `Ops` parameter out of the constructors, creates a new method, `init(Ops)`, and adds an `Ops` parameter to the call method [e.g. `call(tf, ...)`]. See issue #202.
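A minimal sketch of that pattern, using a hypothetical `ReluLikeLayer` rather than any actual layer from this PR: `Ops` is no longer a constructor argument; it is supplied later through `init(Ops)` and also passed explicitly at call time.

```java
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.types.family.TNumber;

/** A ReLU-like activation layer following the init(Ops) / call(tf, ...) pattern. */
class ReluLikeLayer<T extends TNumber> {

  private Ops tf;

  ReluLikeLayer() {
    // No Ops here: the layer can be constructed before an execution environment exists.
  }

  /** Binds the layer to an execution environment (Graph or Eager) after construction. */
  void init(Ops tf) {
    // Stateful layers would also create their variables here using the supplied Ops.
    this.tf = tf;
  }

  /** The environment is also passed explicitly at call time (see issue #202). */
  Operand<T> call(Ops tf, Operand<T> input) {
    return tf.nn.relu(input);
  }
}
```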
`SequentialLayersTest` tests chaining layers together, as in a Sequential model.
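The kind of chaining exercised there looks roughly like the following sketch, reusing the hypothetical `ReluLikeLayer` from above (the `SequentialChainSketch` class is also made up for illustration): each layer consumes the previous layer's output.

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

public class SequentialChainSketch {

  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);
      Operand<TFloat32> input = tf.constant(new float[] {-1f, 0f, 2f});

      ReluLikeLayer<TFloat32> first = new ReluLikeLayer<>();
      ReluLikeLayer<TFloat32> second = new ReluLikeLayer<>();
      first.init(tf);
      second.init(tf);

      // Chain the layers: each one consumes the previous layer's output.
      Operand<TFloat32> hidden = first.call(tf, input);
      Operand<TFloat32> output = second.call(tf, hidden);
    }
  }
}
```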
@karllessard the `ops.pb` conflict is because I moved the `nn.raw` ops to `nn` and moved the higher-level `nn` ops to `framework`. This is a generated file, so I am not sure how to resolve this conflict.
@deansher, any chance you can take a look at this work?
BTW @JimClarke5, you'll need to rebase it. As for `ops.pb`, moving `nn.raw` to `nn` should normally have no impact, since that change is at the API definition level, which should not be part of `ops.pb`. Maybe just reset that file to the original version on `master` and discard it from your PR.
Thank you, @karllessard, for inviting me to review. Lately I have been exploring different aspects of deep learning, which doesn't leave me the time to participate in tensorflow/java for now.
@JimClarke5 there are a lot of conflicts in this PR; I think it's because we have merged a few of your other PRs before it. Can you please rebase so I can start to take a look at it?