Jeff Pasternack
It's easy enough to pass the previous node's classification back into the model as an input feature; this will generally be less efficient than an RNN-based decoder-like apparatus (as your...
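As a hedged sketch of the first approach (this is not Dagli API; the `decode` method, the classifier interface, and the `<START>` sentinel are all hypothetical stand-ins), greedy left-to-right decoding that feeds the previous node's classification back in as an input feature could look like:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

public class GreedyFeedbackDecoder {
  // Hypothetical classifier: (featureVector, previousLabel) -> predicted label.
  // In practice this would wrap a trained model; here it is just a stand-in.
  public static List<String> decode(List<double[]> nodeFeatures,
                                    BiFunction<double[], String, String> classifier) {
    List<String> labels = new ArrayList<>();
    String previous = "<START>"; // sentinel label for the first node
    for (double[] features : nodeFeatures) {
      // The previous node's classification is passed back in as an input feature:
      String predicted = classifier.apply(features, previous);
      labels.add(predicted);
      previous = predicted;
    }
    return labels;
  }

  public static void main(String[] args) {
    // Toy classifier: predicts "B" after the start sentinel, "I" otherwise.
    List<String> result = decode(
        List.of(new double[]{1.0}, new double[]{2.0}, new double[]{3.0}),
        (features, prev) -> prev.equals("<START>") ? "B" : "I");
    System.out.println(result); // [B, I, I]
  }
}
```

Note the tradeoff mentioned above: each prediction must complete before the next can start, whereas an RNN-based decoder can batch the whole sequence.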
One way to approach this would be to use collections or lists in the Struct to capture sets or sequences of contextual features; they can then be processed using list-wise transformations....
I meant list-wise transformations to process the features in the DAG (outside the neural network). Within the neural network, one option would be to model the context as a sequence--e.g....
1. WRT `NNLastVectorInSequence`, this may be a masking issue due to `NNSplitVectorSequenceLayer` not preserving the masking (the Javadoc for `NNSplitVectorSequenceLayer` mentions an exception as a possible result), although it's not...
1. `withInputFromVectorSequence(...)` can be passed anything that produces something of type `Iterable`...
WRT `ReshapeMasklessVertex`, this should be fixed in the most recent (late August) Dagli JAR with the addition of a private constructor for the Java deserializer to use. Were you still...
Serializing: standard Java serialization does serialize private fields. I'm unable to replicate any issue, but I'm also testing atop several bug-fix commits not in beta8--I can't see how they could...
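To illustrate the point about private fields (a minimal, stdlib-only demo, not Dagli code), standard Java serialization round-trips private state without any accessors:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class PrivateFieldSerialization {
  // A Serializable class whose only state is a private field.
  static class Box implements Serializable {
    private static final long serialVersionUID = 1L;
    private final int value;
    Box(int value) { this.value = value; }
    int value() { return value; }
  }

  public static void main(String[] args) throws Exception {
    // Round-trip through standard Java serialization:
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
      out.writeObject(new Box(42));
    }
    try (ObjectInputStream in =
             new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
      Box copy = (Box) in.readObject();
      // The private field survives the round trip:
      System.out.println(copy.value()); // prints 42
    }
  }
}
```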
That's very curious. I'm assuming the number of examples per iteration is at least twice the number of examples per minibatch? Also, could you please share the line of...
I checked the code and the conversion from "minibatches" to "iterations" is just an identity function, and the check against frequency is straightforward, so there's no obvious bug; I also...
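As a sketch of the kind of logic described (hypothetical names and signatures; this is not the actual Dagli/DL4J code), an identity conversion from minibatches to iterations plus a straightforward frequency check would look like:

```java
public class EvaluationFrequencyCheck {
  // Hypothetical identity conversion: one iteration per minibatch.
  static long minibatchesToIterations(long minibatches) {
    return minibatches;
  }

  // Hypothetical check: evaluate every `frequency` iterations.
  static boolean shouldEvaluate(long iteration, long frequency) {
    return frequency > 0 && iteration % frequency == 0;
  }

  public static void main(String[] args) {
    System.out.println(minibatchesToIterations(100)); // prints 100
    System.out.println(shouldEvaluate(10, 5));        // prints true
    System.out.println(shouldEvaluate(11, 5));        // prints false
  }
}
```

With logic this simple there is little room for a bug in the conversion itself, which is why the discrepancy is puzzling.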
To clarify, you mean a CRF model regularizing the output of a neural network? Unfortunately, I don't think DeepLearning4J supports this. It may be possible to implement using DL4J's SameDiff...