Manuel
Have a look at https://github.com/manuelsh/chat-bot
I believe you should permute the dimensions to be [batch_size, 6, 6, 256] before doing the view. See, for example, the authors' original implementation: https://github.com/Sarasra/models/blob/984fbc754943c849c55a57923f4223099a1ff88c/research/capsules/models/capsule_model.py#L68
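For reference, here is a minimal sketch of what I mean, assuming the conv layer outputs a channels-first tensor of shape [batch_size, 256, 6, 6] (the shapes and the capsule dimension of 8 below are just illustrative, not your model's actual values):

```python
import torch

# Hypothetical conv output in PyTorch's channels-first layout:
# [batch_size, channels, height, width] = [32, 256, 6, 6]
x = torch.randn(32, 256, 6, 6)

# Move channels last before flattening: [32, 6, 6, 256]
x = x.permute(0, 2, 3, 1).contiguous()

# Now the view groups each spatial position's channel values together,
# e.g. into capsules of 8 dims: [32, 1152, 8]
capsules = x.view(x.size(0), -1, 8)
```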
What are your results if you do that? Our experience so far suggests that with the correct view the results are actually worse. (!) Btw, we are using cosine annealing...
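For context, this is roughly how the cosine annealing is wired up on our side (a minimal sketch; the optimizer, learning rate and `T_max` below are placeholders, not our actual settings):

```python
import torch

model = torch.nn.Linear(10, 10)  # stand-in for the real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

for epoch in range(100):
    # ... forward pass, loss.backward() ...
    optimizer.step()
    scheduler.step()
```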
You can change the hyperparameters of the model. They currently are: `embedding_dimension = 1024`, `num_layers = 3`, `rnn_dim = 2048`. Just increase the values and that should do it....
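For instance, a bump could look like the following (the larger values are only illustrative, not a recommendation; pick whatever fits your memory budget):

```python
embedding_dimension = 2048  # was 1024
num_layers = 4              # was 3
rnn_dim = 4096              # was 2048
```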
Sorry @hoagy-davis-digges, I misread your message; I thought you were asking about a language model that I built. On the layernorm model, unfortunately, I cannot comment.
Losing the output of a Jupyter notebook if you are not connected to the hosting computer is a problem, although you can work around it with some hacks. Having the ability to recover...
May I recommend:
```python
import torch

def Frobenius(mat):
    # Batched Frobenius norm: mat is expected to have shape [batch, rows, cols]
    assert len(mat.shape) == 3, 'matrix for computing Frobenius norm should be with 3 dims'
    return torch.sum((torch.sum(torch.sum((mat ** 2), 2), 1)) ** 0.5)
```
...
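A quick example of how it would be called (the shape is arbitrary, just for illustration):

```python
mat = torch.randn(32, 10, 10)  # batch of 32 matrices
print(Frobenius(mat))          # scalar tensor: sum of per-matrix Frobenius norms
```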
Hi @NhienLam, could you please review? Thanks
Thanks for looking at the PR! If I follow the code in the readme file, and add the `firebaseConfig` and the `firebase.initializeApp(firebaseConfig)` where it says they should go, I get...
Oh, what you mean is when the readme file says:
```
* TODO(DEVELOPER): Paste the initialization snippet from this dialog box:
* Firebase Console > Project Settings >...
```