Mohammed Innat
In the Colab, only `.predict_on_batch` is being tested, right? I was curious about `test_on_batch` as well. Also, was there any mention of why `.predict_on_batch` is significantly slower? It might be...
@sonali-kumari1 IMO, this should be backend-agnostic. The split method should not use TF ops with the torch backend; instead, it should have a backend-specific splitting method. https://github.com/keras-team/keras-hub/issues/2128
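A minimal sketch of what backend-specific dispatch could look like (pure-Python stand-ins only; the registry entries below are hypothetical illustrations, not real keras-hub code):

```python
def make_splitter(backend):
    """Return a whitespace splitter for the given backend name.

    Hypothetical dispatch table: in real code the "tensorflow" entry
    would wrap tf.strings.split, and the "torch"/"jax" entries their own
    native ops. Here each is a plain-Python stand-in so the dispatch
    pattern itself is runnable without any framework installed.
    """
    registry = {
        "tensorflow": lambda text: text.split(),  # stand-in for a tf op
        "torch": lambda text: text.split(),       # stand-in for a torch op
        "jax": lambda text: text.split(),         # stand-in for a jax op
    }
    # Fall back to the plain-Python behavior for unknown backends.
    return registry.get(backend, lambda text: text.split())

split = make_splitter("torch")
tokens = split("keras hub tokenizer")  # ["keras", "hub", "tokenizer"]
```

The point is that the caller asks for the op once, keyed by the active backend, rather than hard-coding TF ops everywhere.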
@khteh You should know that `keras.utils.plot_model` is mainly used to inspect the layer connections closely, not for visual appeal. Plotting the model in LR mode is not convenient,...
@AdonaiVera You can take a look at this [implementation](https://github.com/sayakpaul/swin-transformers-tf) (2D Swin); it reproduces the original implementation (sort of, though there are some [limitations](https://github.com/sayakpaul/swin-transformers-tf/issues/9) which you might care to fix).
Supporting more than one optimizer in the high-level API would bring complexity; instead, override the `compile` method and pass as many optimizers as needed.
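A minimal sketch of that pattern, using plain-Python stand-ins rather than real Keras classes (the `SGD` toy optimizer, its `apply` method, and the variable groups are all hypothetical names for illustration):

```python
class SGD:
    """Toy stand-in for an optimizer with an `apply(grads, vars)` method."""

    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

    def apply(self, grads, variables):
        # One in-place gradient-descent step over the variable list.
        for i, (g, v) in enumerate(zip(grads, variables)):
            variables[i] = v - self.learning_rate * g


class MultiOptimizerModel:
    """Sketch of overriding `compile` to accept one optimizer per group."""

    def __init__(self, backbone_vars, head_vars):
        self.backbone_vars = backbone_vars  # e.g. pretrained backbone weights
        self.head_vars = head_vars          # e.g. freshly initialized head

    def compile(self, backbone_optimizer, head_optimizer):
        # Store one optimizer per parameter group instead of a single one.
        self.backbone_optimizer = backbone_optimizer
        self.head_optimizer = head_optimizer

    def train_step(self, backbone_grads, head_grads):
        # Each optimizer updates only its own variable group.
        self.backbone_optimizer.apply(backbone_grads, self.backbone_vars)
        self.head_optimizer.apply(head_grads, self.head_vars)


model = MultiOptimizerModel(backbone_vars=[1.0], head_vars=[1.0])
model.compile(SGD(learning_rate=0.01), SGD(learning_rate=0.1))
model.train_step(backbone_grads=[1.0], head_grads=[1.0])
# backbone moved by 0.01, head by 0.1
```

In real Keras you would do the same inside a `Model` subclass's `compile`/`train_step` overrides, routing each optimizer to its own slice of `trainable_variables`.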
cc. @fchollet
DynamicBackend, no! https://github.com/keras-team/keras/issues/20918
@t-kalinowski Thanks for the confirmation. I understand that with the torch/jax backend, ops will operate with the respective backend. So, I was wondering whether it's possible to set a dynamic backend for...
@t-kalinowski I just noticed: check this preprocessing layer [here](https://github.com/keras-team/keras/blob/960133e97d56d3715b747e4e55614bcc4c8f6fda/keras/src/layers/preprocessing/data_layer.py#L96-L107). It sets the TensorFlow backend locally while inside the tf graph.
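The idea resembles temporarily swapping the active backend and restoring it on exit. A dependency-free sketch of that pattern, assuming a hypothetical process-wide backend name (this is not Keras's actual mechanism, just the shape of it):

```python
import contextlib

_ACTIVE_BACKEND = "torch"  # hypothetical process-wide default


@contextlib.contextmanager
def local_backend(name):
    """Temporarily switch the active backend, restoring it on exit."""
    global _ACTIVE_BACKEND
    previous = _ACTIVE_BACKEND
    _ACTIVE_BACKEND = name
    try:
        yield
    finally:
        # Restore even if the body raised, so the switch stays local.
        _ACTIVE_BACKEND = previous


with local_backend("tensorflow"):
    inside = _ACTIVE_BACKEND   # "tensorflow" while the block runs
after = _ACTIVE_BACKEND        # back to "torch" afterwards
```

A context manager keeps the override scoped to the tf.data/graph code that needs it, rather than leaking a global backend change.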
I ran into a funny issue. I tried to train a Keras model using `torch.utils.data.DataLoader`, and `model.fit` worked. The thing is, **I accidentally didn't switch to the torch backend.**...