Relu layer nested in its own Serial combinator by default
Description
Activation layers appear to be nested inside their own Serial combinators by default. Is there a reason for this?
I also found it mentioned on Stack Overflow, with a workaround but no answer: https://stackoverflow.com/questions/68177221/trax-tl-relu-and-tl-shiftright-layers-are-nested-inside-serial-combinator
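For what it's worth, the extra wrapper seems to belong to the activation layer itself rather than to the outer Serial: printing a bare tl.Relu() on its own already shows the nesting (a minimal check, as observed on trax 1.3.9):

import trax.layers as tl

# A lone activation layer already reports itself wrapped in Serial:
print(tl.Relu())
# Serial[
#   Relu
# ]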
Example
import trax.layers as tl

model = tl.Serial(
    tl.Dense(32),
    tl.Relu(),
    tl.Dense(1),
)
print(model)
# Output
Serial[
  Dense_32
  Serial[
    Relu
  ]
  Dense_1
]

# Expected output
Serial[
  Dense_32
  Relu
  Dense_1
]
...
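For reference, the workarounds I've seen are along these lines: define the activation directly as a plain Fn layer instead of calling tl.Relu(), which skips whatever wrapping tl.Relu() applies. A sketch, assuming the trax 1.3.9 APIs (tl.Fn and trax.fastmath.numpy); the lambda is just an illustrative ReLU, not the library's own definition:

import trax.layers as tl
from trax.fastmath import numpy as jnp

# Hand-rolled ReLU as a bare Fn layer, so nothing wraps it:
relu = tl.Fn('Relu', lambda x: jnp.maximum(x, 0))

model = tl.Serial(tl.Dense(32), relu, tl.Dense(1))
print(model)
# Presumably prints the flat structure:
# Serial[
#   Dense_32
#   Relu
#   Dense_1
# ]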
Environment information
OS: Debian 10
$ pip freeze | grep trax
trax==1.3.9
$ pip freeze | grep tensor
mesh-tensorflow==0.1.19
tensorboard==2.6.0
tensorboard-data-server==0.6.1
tensorboard-plugin-wit==1.8.0
tensorflow==2.6.0
tensorflow-datasets==4.4.0
tensorflow-estimator==2.6.0
tensorflow-hub==0.12.0
tensorflow-metadata==1.2.0
tensorflow-text==2.6.0
$ pip freeze | grep jax
jax==0.2.20
jaxlib==0.1.71
$ python -V
Python 3.7.3
Sorry, this was apparently also mentioned in #1660, but there too only a workaround was given, no answer. Just curious whether this is the intended behaviour or not!