François Chollet
> for Dropout: `keras.layers.Dropout(seed=123)`. SeedGenerator is built in so that each invocation gives a different dropout mask.

Yes.

> for a layer with weights it is `keras.layers.Dense(kernel_initializer=keras.initializers.RandomNormal(seed=123))`. SeedGenerator not built in...
Sure, I'm open to having a `split_seed` backend op of some kind. Then we can use it in `SeedGenerator.next()`.
> Thank you. I guess we can refactor our seed generator like this:

That sounds good at a high level, but the `split_seed` method should be a backend function instead,...
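A minimal sketch of the idea being discussed (hypothetical names and hash-based splitting, not the actual Keras implementation): a backend-level `split_seed` function deterministically derives a fresh subseed from the current state, and `SeedGenerator.next()` uses it so that each call yields a different seed while the overall stream stays reproducible.

```python
import hashlib

def split_seed(seed):
    """Hypothetical backend op: derive (new_state, subseed) deterministically."""
    digest = hashlib.sha256(seed.to_bytes(8, "little")).digest()
    new_state = int.from_bytes(digest[:8], "little")
    subseed = int.from_bytes(digest[8:16], "little")
    return new_state, subseed

class SeedGenerator:
    """Sketch of a stateful generator built on top of split_seed."""
    def __init__(self, seed):
        self.state = seed

    def next(self):
        # Advance the internal state and hand out a fresh subseed.
        self.state, subseed = split_seed(self.state)
        return subseed

gen = SeedGenerator(123)
a, b = gen.next(), gen.next()
assert a != b                          # each invocation yields a new seed
assert SeedGenerator(123).next() == a  # but the stream is reproducible
```

This mirrors the behavior described above for `Dropout(seed=123)`: successive calls produce different masks, yet rerunning with the same initial seed reproduces the same sequence.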
Thanks, Sachin. It looks like relevant torch tests are failing, can you take a look? https://github.com/keras-team/keras/actions/runs/10931265787/job/30345878233?pr=20270
@sachinprasadhs can you please take a look at the torch test failure?
The behavior you are seeing is completely normal... it's a result of the weirdness of: 1. Using "loss" as the monitor. Never do this -- it's meaningless. The loss will...
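A toy illustration of the point (made-up numbers, not Keras code): training loss typically keeps falling even after the model starts overfitting, so "best checkpoint by loss" usually just means "latest checkpoint", while a validation metric picks out the genuinely best epoch.

```python
# Hypothetical per-epoch values for a model that begins overfitting at epoch 2.
train_loss = [0.90, 0.70, 0.50, 0.40, 0.35]  # shrinks monotonically
val_loss   = [0.80, 0.60, 0.55, 0.62, 0.70]  # best at epoch 2, then worsens

best_by_train = min(range(len(train_loss)), key=train_loss.__getitem__)
best_by_val = min(range(len(val_loss)), key=val_loss.__getitem__)
print(best_by_train, best_by_val)  # 4 2
```

Monitoring `"loss"` selects epoch 4 (the most overfit model); monitoring `"val_loss"` selects epoch 2.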
> Updated my online dataset generator to use keras.Sequential data augmentation instead of the removed ImageDataGenerator.

Are you using `tf.data`? That's what you want to use to see good performance...
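A hedged sketch of what that can look like (placeholder data; assumes TensorFlow is installed): run the `keras.Sequential` augmentation inside a `tf.data` pipeline so batches are transformed in parallel and prefetched, instead of augmenting sample-by-sample in a Python generator.

```python
import tensorflow as tf

# Assumed stand-in for the user's augmentation model.
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
])

images = tf.zeros((8, 32, 32, 3))  # placeholder data
ds = (
    tf.data.Dataset.from_tensor_slices(images)
    .batch(4)
    # Augment whole batches in parallel, off the training critical path.
    .map(lambda x: augment(x, training=True),
         num_parallel_calls=tf.data.AUTOTUNE)
    .prefetch(tf.data.AUTOTUNE)
)

for batch in ds:
    print(batch.shape)  # (4, 32, 32, 3)
```

The resulting dataset can be passed directly to `model.fit(ds)`.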
Thanks for the PR -- please take a look at the test failure: > FAILED keras/src/layers/preprocessing/discretization_test.py::DiscretizationTest::test_discretization_basics - AssertionError: expected output dtype int64, got float16
@divyashreepathihalli Divya, are you still working on this?
Thanks for the PR. I think this issue calls for a different fix -- what's the best way to serialize bytes (to go in a JSON file) in the general...
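One common general-purpose answer to that question (a sketch of the standard pattern, not necessarily the fix the maintainers chose; the `"vocab"` key is hypothetical): base64-encode the bytes so they travel as a plain ASCII string inside the JSON config, and decode on load.

```python
import base64
import json

payload = b"\x00\x01binary\xff"  # arbitrary bytes to serialize

# Encode: bytes -> ASCII string that is safe to embed in JSON.
config = {"vocab": base64.b64encode(payload).decode("ascii")}
text = json.dumps(config)

# Decode: recover the exact original bytes from the JSON file.
restored = base64.b64decode(json.loads(text)["vocab"])
assert restored == payload  # round-trips exactly
```

This keeps the JSON file valid UTF-8 regardless of the byte content, at the cost of ~33% size overhead from base64.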