Trax — Deep Learning with Clear Code and Speed
### Description

Whenever I try to use GPUs with Kaggle kernels, it returns the following error:

```
TypeError: pmap() got an unexpected keyword argument 'donate_argnums'
```

### Environment information

Kaggle...
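For what it's worth, the error suggests the installed `jax` predates the `donate_argnums` argument of `jax.pmap`. A minimal check along those lines (a sketch, not an official fix; the reproduction below assumes the mismatch really is the jax version on the Kaggle image):

```python
import jax

# Print the installed jax version; donate_argnums was added to
# jax.pmap in later releases than some older preinstalled images ship.
print(jax.__version__)

def double(x):
  return x * 2

try:
  # On a sufficiently new jax this succeeds; on an old one it raises
  # the same TypeError reported above.
  fn = jax.pmap(double, donate_argnums=(0,))
  print('pmap accepts donate_argnums on this jax version')
except TypeError as err:
  print('jax too old for donate_argnums:', err)
```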
[Question] Is there any way to increase the vocab_size and the number of training samples for the Reformer model?
### Description

While I was experimenting with creating a language model using Reformer, I noticed a few issues with the example:

- The vocab size is really small? A larger...
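Not an answer from the maintainers, but for reference, a minimal sketch of how a larger vocabulary can be passed when constructing `ReformerLM` directly; the hyperparameter values below are illustrative assumptions, not the example's defaults:

```python
import trax

# Sketch: build a ReformerLM with a larger vocabulary than the small
# example setting. All values here are assumptions for illustration.
model = trax.models.ReformerLM(
    vocab_size=32000,   # larger vocab than the example's small setting
    d_model=512,
    d_ff=2048,
    n_layers=6,
    n_heads=8,
    max_len=2048,
    mode='train',
)
```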
### Description

I'm running the following Colab for a Transformer classifier. It used to work on version 1.3.3, but now throws a shape error after one iteration!

https://colab.research.google.com/drive/1sNsYGeBsPQJLhSGKBur8C77jpMWH90Bm?usp=sharing
### Description

In the snippet in the docs here:

```python
# Create a Transformer model.
# Pre-trained model config in gs://trax-ml/models/translation/ende_wmt32k.gin
model = trax.models.Transformer(
    input_vocab_size=33300,
    d_model=512, d_ff=2048,
    n_heads=8, n_encoder_layers=6, n_decoder_layers=6,
    ...
```
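For context, a sketch of how that documented model is typically constructed and initialized from the pre-trained checkpoint; the arguments beyond those quoted above (max_len, mode, the checkpoint path and the init call) are assumptions based on the Trax README, not part of this issue:

```python
import trax

# Hedged reconstruction of the documented usage; max_len, mode and the
# init_from_file call are assumed from the Trax README, not quoted above.
model = trax.models.Transformer(
    input_vocab_size=33300,
    d_model=512, d_ff=2048,
    n_heads=8, n_encoder_layers=6, n_decoder_layers=6,
    max_len=2048, mode='predict')

# Load the published English->German WMT32k weights.
model.init_from_file('gs://trax-ml/models/translation/ende_wmt32k.pkl.gz',
                     weights_only=True)
```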
OS: Manjaro, Anaconda

```
pip install trax
Package 'dataclasses' requires a different Python: 3.7.9 not in '>=3.6,
```
I see that Keras can use a CRF layer easily. Why doesn't Trax have this layer, and what is the reasoning behind that?
### Description

I am trying to train an English-to-German language model with attention locally. The training and validation losses seem reasonable. However, whenever I try to predict the next symbol...
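For context, a sketch of how next-symbol prediction is usually driven in Trax; `model` (rebuilt with mode='predict' and weights loaded) and `tokenized` (a 1-D array of source token ids) are assumed to exist and are not taken from this issue:

```python
from trax.supervised import decoding

# Hedged sketch: greedy decoding with a Trax model built in mode='predict'.
# `model` and `tokenized` are assumed to exist (see lead-in above).
tokenized = tokenized[None, :]            # add a batch dimension
output = decoding.autoregressive_sample(
    model, tokenized, temperature=0.0)    # temperature=0.0 -> greedy decoding
print(output[0])                          # predicted target token ids
```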
I'm having trouble passing arguments to the convolution layer for just a single evaluation of the layer. So instead of building a network, I'd just like to pass in the...
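One way this is commonly done is to initialize and call the layer on its own, outside any network; a minimal sketch, where the input shape and hyperparameters are illustrative assumptions:

```python
import numpy as np
from trax import layers as tl
from trax import shapes

# Sketch: evaluate a single Conv layer outside a full network.
# Shapes and hyperparameters below are illustrative assumptions.
x = np.random.rand(1, 28, 28, 3).astype(np.float32)   # NHWC input
conv = tl.Conv(filters=8, kernel_size=(3, 3))
conv.init(shapes.signature(x))   # initialize weights for this input signature
y = conv(x)                      # single forward evaluation of the layer
print(y.shape)
```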
### Description

There are some problems when using the BERT class from trax/models/research/bert.py. The method new_weights(self, input_signature) in the PretrainedBERT class uses `super().new_weights(input_signature)` to set the weights, when it should...
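For readers unfamiliar with the layer API in the Trax version this issue refers to, a minimal, hypothetical illustration of the new_weights override pattern being discussed; MyDense is not part of Trax, and PretrainedBERT's actual logic differs:

```python
from trax import layers as tl

# Hypothetical illustration of the new_weights override pattern in the
# Trax layer API referenced above. MyDense is not part of Trax.
class MyDense(tl.Dense):
  def new_weights(self, input_signature):
    # Build the parent layer's weights first, then customize them
    # (e.g. overwrite with pretrained values) before returning.
    weights = super().new_weights(input_signature)
    return weights
```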