
NYU Deep Learning Spring 2020

51 NYU-DLSP20 issues

Author names were missing, now added.

Figure Number Issue in 14-1.md #461 — Figure 5 -> Figure 19, Figure 6 -> Figure 20, so the figure numbering goes 4 -> 19 -> 20 -> 7 and so forth. Please enlighten me if...

[ja] week05 ja translation by Jesmer Wong

Hey @atcold, this is my work on self-attention during the semester. Let me know if it's useful.

**Performance Analysis Of Various Activation Functions On Different Architectures** Hello @atcold, I have been working on the notebook titled: Performance Analysis Of Various Activation Functions On Different Architectures Link: https://colab.research.google.com/drive/1tBO1MdC7rAipaiA5XpWDjrZ9M5o4u1Sv...

1. If you go to https://atcold.github.io/pytorch-Deep-Learning/en/week02/02-3/
2. Search for "The activation of the last layer in general would depend on your use case, as explained in this Piazza post."
3. The...

In the paragraph **Self-Attention (I)** of [**Week 12/Attention and the Transformer**](https://atcold.github.io/pytorch-Deep-Learning/en/week12/12-3/) there is a small mistake after the definition of the hidden layer ![formula](https://render.githubusercontent.com/render/math?math=h=Xa ) as a matrix multiplication: the vector ![formula](https://render.githubusercontent.com/render/math?math=a)...

The following line raises a conversion error in the 16th cell when running in a CUDA environment, because a CUDA-type tensor ends up being passed to `TSNE.fit_transform()`: `E.append(TSNE(n_components=2).fit_transform(X[-1]))` I fixed it by adding a `.cpu()` call....
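A minimal sketch of the reported fix, assuming `X` is a list of activation tensors as in the notebook (the list contents here are made-up placeholder data): scikit-learn cannot consume a CUDA tensor directly, so the tensor is moved back to host memory with `.cpu()` before `fit_transform()`.

```python
import torch
from sklearn.manifold import TSNE

# Hypothetical stand-in for the notebook's list of activations;
# the last entry may live on the GPU when CUDA is available.
X = [torch.randn(50, 10)]
if torch.cuda.is_available():
    X[-1] = X[-1].cuda()

E = []
# Move the tensor to the CPU (and to NumPy) before handing it to sklearn,
# which otherwise fails on a CUDA-type tensor.
E.append(TSNE(n_components=2, perplexity=5).fit_transform(X[-1].cpu().numpy()))
print(E[0].shape)  # one 2-D embedding per input row
```

The same pattern applies to any sklearn call fed from GPU tensors: convert with `.cpu().numpy()` at the boundary rather than changing the model code.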

#127 2020.03.24 `02-2`
- [x] zh → #168
- [ ] ko
- [ ] it
- [ ] es

#157 2020.03.27 `01-1`
- [x] zh
- [x] ko
- ...