bhack
XLA has a custom call mechanism for cases where a high-level op is not lowered efficiently or is not supported on a specific backend: https://www.tensorflow.org/xla/custom_call
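Before reaching for a custom call, it can help to check how XLA actually lowers a given op. A minimal sketch (my assumption, not from the linked page; it relies on `tf.function(jit_compile=True)` and the `experimental_get_compiler_ir` API available in recent TF 2.x releases) that dumps the HLO for a jit-compiled function:

```python
import tensorflow as tf

# Compile a small function with XLA so we can inspect its lowering.
@tf.function(jit_compile=True)
def f(x):
    return tf.nn.relu(x) * 2.0

x = tf.random.normal((4, 4))

# Retrieve the HLO text XLA produced for these concrete inputs;
# scanning it shows whether the op was fused/lowered efficiently.
hlo = f.experimental_get_compiler_ir(x)(stage="hlo")
```

If the generated HLO is unsatisfactory for a backend, that is the point where a custom call becomes interesting.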
https://github.com/tensorflow/models/issues/7381 https://github.com/tensorflow/tensorflow/issues/56225
Have you tried with:

```python
import tensorflow as tf

class MyModel(tf.keras.Model):
    def __init__(self, initializer):
        self.initializer = initializer
        super().__init__(inputs=[], outputs=[])

    def get_config(self):
        return {"initializer": self.initializer}

mymodel = MyModel(initializer=tf.keras.initializers.TruncatedNormal())
tf.keras.models.save_model(mymodel, "mymodel.sm", overwrite=True)
```
...
Do you want to do something like:

```python
import tensorflow as tf
from tensorflow import keras

# Define a subclassed model with the same architecture
class MyModel(keras.Model):
    def __init__(self, output_dim, name=None, ...
```
I don't see a subclassed model there, and the initializer is used in a custom layer there. So how did you derive your "minimal" subclassed model gist? Also mine...
I suppose this is very similar to your initial post, isn't it: https://colab.research.google.com/gist/bhack/f15b04a8181774c72254a8f72485fc4f/untitled129.ipynb
Yes, it was just to understand your use case, but what I meant was: why do you need to use `super().__init__(inputs=[], outputs=[])`? As it seems to me unrelated to the issue...
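For comparison, a minimal sketch of the usual subclassed-model pattern with a plain `super().__init__()` call (this is my assumption of the common idiom, not code from the gist; the `Dense(4, ...)` layer and the serialize/deserialize round trip are illustrative):

```python
import tensorflow as tf

class MyModel(tf.keras.Model):
    def __init__(self, initializer=None, **kwargs):
        # Plain super().__init__() -- no inputs=[]/outputs=[] needed here.
        super().__init__(**kwargs)
        self.initializer = initializer or tf.keras.initializers.TruncatedNormal()
        self.dense = tf.keras.layers.Dense(4, kernel_initializer=self.initializer)

    def call(self, x):
        return self.dense(x)

    def get_config(self):
        # Serialize the initializer so the config is JSON-friendly.
        return {"initializer": tf.keras.initializers.serialize(self.initializer)}

    @classmethod
    def from_config(cls, config):
        config["initializer"] = tf.keras.initializers.deserialize(config["initializer"])
        return cls(**config)

model = MyModel()
y = model(tf.zeros((2, 3)))  # build the model by calling it once
restored = MyModel.from_config(model.get_config())
```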
I think (1) runs because `__init__` accepts kwargs. But I don't find a subclassed model definition like your (1) in the tests either: https://github.com/keras-team/keras/blob/master/keras/tests/model_subclassing_test.py
We need to wait for the next SIG meeting to understand what we want to do with the new arch wheels.
The last SIG meeting was skipped a few days ago. /cc @seanpmorgan