Milan Straka
@Hyperparticle As for the need for RNN decoding, we have been evaluating the lemmatizer on noisy user data (with typos, missing diacritics and such), and in such a context the RNN...
Adding the author of the commit that removed the documented functionality: @jonycgn.
Alternatively, some variant of `write_scalar_summaries` could be moved to the `on_train_batch_end` method of the `TensorBoard` callback.
Hi, thanks for the answer! The commit actually did remove functionality from the `TensorBoard` callback -- the callback just sets the correct `tf.summary.record_if` in https://github.com/keras-team/keras/blob/46121eed08d0feef743eacef3b66206df45cf656/keras/callbacks.py#L2362-L2373 but it relied on `write_scalar_summaries` being executed in...
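For illustration, a minimal sketch of what such a batch-level replacement could look like as a standalone callback; the class name `BatchScalarSummaries` is made up here, and this is not how the built-in `TensorBoard` callback is implemented, but `tf.summary.create_file_writer`, `tf.summary.scalar`, and `on_train_batch_end` are standard TF2/Keras APIs:

```python
import tensorflow as tf

class BatchScalarSummaries(tf.keras.callbacks.Callback):
    """Write every logged training metric as a scalar summary after each batch."""

    def __init__(self, log_dir, update_freq=1):
        super().__init__()
        self._writer = tf.summary.create_file_writer(log_dir)
        self._update_freq = update_freq
        self._step = 0

    def on_train_batch_end(self, batch, logs=None):
        self._step += 1
        if logs and self._step % self._update_freq == 0:
            with self._writer.as_default():
                for name, value in logs.items():
                    tf.summary.scalar("batch_" + name, value, step=self._step)

    def on_train_end(self, logs=None):
        self._writer.close()
```

Such a callback could be passed as `model.fit(..., callbacks=[BatchScalarSummaries("logs/batch")])`, independently of the built-in `TensorBoard` callback.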
> One thing to note is also that these logs were _not_ actual batch-level summaries, which was a reason to remove them. They are accumulated from the beginning of the...
> I disagree on the cumulative summaries front – I find it useful to have instantaneous batch summaries to get a sense of how much the loss function fluctuates. You...
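(For reference, since the values in `logs` are running means accumulated since the start of the epoch, an instantaneous per-batch loss can still be recovered from two consecutive running averages, assuming equal-sized batches; a rough sketch with a made-up callback name:)

```python
import tensorflow as tf

class InstantaneousLoss(tf.keras.callbacks.Callback):
    """Derive the loss of the current batch from the cumulative running mean."""

    def on_epoch_begin(self, epoch, logs=None):
        self._prev_avg = 0.0  # the running mean resets together with the metrics

    def on_train_batch_end(self, batch, logs=None):
        avg = logs["loss"]  # mean over batches 0..batch of this epoch
        # avg * (batch + 1) is the cumulative sum, so subtracting the previous
        # cumulative sum leaves only the contribution of the current batch.
        current = avg * (batch + 1) - self._prev_avg * batch
        self._prev_avg = avg
        print(f"batch {batch}: running mean {avg:.4f}, this batch {current:.4f}")
```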
Personally, I am against this. Why have two `fromArray` functions? Also, what is the difference between `fromArrayMonolithic` and `rnf . fromArray`? The second one is a trivial way of having...
Personally, I am not happy about exporting two `fromArray` functions, so I would agree with modifying `fromArray`. Nevertheless, we are not using `primitive` for this -- every dependency costs something...
Well, it is my mistake that I did not insist on a proposal to libraries@haskell before merging the first `fromArray`. That is not going to happen again in the future :-) Nevertheless, we are probably...
The bureaucratic process you are referring to serves a reasonably good purpose -- on the one hand, it allows the community to express its opinion, and on the other hand, it...