NYU-DLSP20

The 15-transformer notebook has multiple issues with recent PyTorch versions

Open · Carlsans opened this issue 6 months ago · 1 comment

The legacy version of torchtext is no longer supported, so I first had to install an older release (more on that right after the cell below). The failure happens when trying to run this cell:

```python
model = TransformerClassifier(num_layers=1, d_model=32, num_heads=2,
                              conv_hidden_dim=128, input_vocab_size=50002, num_answers=2)
model.to(device)
```
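As an aside on the torchtext part, and as an assumption on my side rather than something I checked against every release: `torchtext.legacy` was removed in torchtext 0.12, so pinning an earlier release should restore it. Something like this, in a notebook cell:

```python
# Assumption: the 0.11.x series is the last one that still ships
# torchtext.legacy, and it pairs with torch 1.10.x; adjust the pin
# to match your installed torch build.
!pip install torchtext==0.11.2
```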


I'm getting this error:

```
RuntimeError                              Traceback (most recent call last)
Cell In[18], line 1
----> 1 model = TransformerClassifier(num_layers=1, d_model=32, num_heads=2,
      2                               conv_hidden_dim=128, input_vocab_size=50002, num_answers=2)
      3 model.to(device)

Cell In[17], line 5, in TransformerClassifier.__init__(self, num_layers, d_model, num_heads, conv_hidden_dim, input_vocab_size, num_answers)
      2 def __init__(self, num_layers, d_model, num_heads, conv_hidden_dim, input_vocab_size, num_answers):
      3     super().__init__()
----> 5     self.encoder = Encoder(num_layers, d_model, num_heads, conv_hidden_dim, input_vocab_size,
      6                            maximum_position_encoding=10000)
      7     self.dense = nn.Linear(d_model, num_answers)

Cell In[11], line 9, in Encoder.__init__(self, num_layers, d_model, num_heads, ff_hidden_dim, input_vocab_size, maximum_position_encoding, p)
      6 self.d_model = d_model
      7 self.num_layers = num_layers
----> 9 self.embedding = Embeddings(d_model, input_vocab_size, maximum_position_encoding, p)
     11 self.enc_layers = nn.ModuleList()
     12 for _ in range(num_layers):

Cell In[10], line 17, in Embeddings.__init__(self, d_model, vocab_size, max_position_embeddings, p)
     15 self.word_embeddings = nn.Embedding(vocab_size, d_model, padding_idx=1)
     16 self.position_embeddings = nn.Embedding(max_position_embeddings, d_model)
---> 17 create_sinusoidal_embeddings(
     18     nb_p=max_position_embeddings,
     19     dim=d_model,
     20     E=self.position_embeddings.weight
     21 )
     23 self.LayerNorm = nn.LayerNorm(d_model, eps=1e-12)

Cell In[10], line 6, in create_sinusoidal_embeddings(nb_p, dim, E)
      1 def create_sinusoidal_embeddings(nb_p, dim, E):
      2     theta = np.array([
      3         [p / np.power(10000, 2 * (j // 2) / dim) for j in range(dim)]
      4         for p in range(nb_p)
      5     ])
----> 6 E[:, 0::2] = torch.FloatTensor(np.sin(theta[:, 0::2]))
      7 E[:, 1::2] = torch.FloatTensor(np.cos(theta[:, 1::2]))
      8 E.requires_grad = False

RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation.
```
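If I read the traceback right, the failure is the in-place slice assignment into `self.position_embeddings.weight`, which is a leaf tensor with `requires_grad=True`; recent PyTorch versions reject that while autograd is tracking. A minimal sketch of a possible fix, assuming the positional table should stay frozen as in the original notebook, is to do the writes under `torch.no_grad()`:

```python
import numpy as np
import torch

def create_sinusoidal_embeddings(nb_p, dim, E):
    # Same sinusoidal position table as in the notebook.
    theta = np.array([
        [p / np.power(10000, 2 * (j // 2) / dim) for j in range(dim)]
        for p in range(nb_p)
    ])
    # torch.no_grad() stops autograd from tracking the in-place writes,
    # which is exactly what the RuntimeError complains about.
    with torch.no_grad():
        E[:, 0::2] = torch.FloatTensor(np.sin(theta[:, 0::2]))
        E[:, 1::2] = torch.FloatTensor(np.cos(theta[:, 1::2]))
    E.requires_grad = False  # keep the table fixed, as before
```

Setting `E.requires_grad = False` before the assignments (or writing through `E.detach()`) should work too; either way, the point is that autograd must not see the in-place write.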

Anyway, thank you for this great course. I'm really enjoying learning about deep learning :)

Carlsans · Dec 15 '23, 02:12