
Chapter 13 - I have a problem trying to implement the autograd in a simple linear regression

HectorPulido opened this issue 4 years ago • 0 comments

I tried this; my input size is 1, 1000 and my output size is 1, 1000.

import numpy as np

# Tensor is the autograd class built in Chapter 13
x = np.array(range(1000))
y = x * 12 + 15
y = y + np.random.randn(*y.shape)  # add Gaussian noise

x = x.reshape(-1, 1)
y = y.reshape(-1, 1)

data = Tensor(x, autograd=True)
target = Tensor(y, autograd=True)

w = list()
w.append(Tensor(np.random.rand(1, 1), autograd=True))  # weight
w.append(Tensor(np.random.rand(1), autograd=True))     # bias

for i in range(10):
    # forward pass: matrix multiply, then broadcast the bias over the batch
    pred = data.mm(w[0]) + w[1].expand(0, 1000)
    loss = ((pred - target) * (pred - target)).sum(0)  # summed squared error

    # backward pass and SGD update with alpha = 0.1
    loss.backward(Tensor(np.ones_like(loss.data)))
    for w_ in w:
        w_.data -= w_.grad.data * 0.1
        w_.grad.data *= 0
    print(loss)

OUTPUT

[4.20028134e+10]
[1.86120338e+26]
[8.24726275e+41]
[3.65448202e+57]
[1.61935411e+73]
[7.17559347e+88]
[3.17960978e+104]
[1.40893132e+120]
[6.2431795e+135]
[2.7664436e+151]
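For reference, a plain-NumPy sketch of the same loop (my reconstruction, not the book's Tensor API) suggests the blow-up comes from the step size: with x spanning 0–999, the squared-error gradient is enormous, so a step of 0.1 on the summed loss overshoots and diverges. Standardizing x and averaging the gradient over the batch makes the identical update rule converge:

```python
import numpy as np

# Assumption: this reproduces the loop above without the book's Tensor class.
np.random.seed(0)
x = np.arange(1000, dtype=float).reshape(-1, 1)
y = x * 12 + 15 + np.random.randn(1000, 1)

# Standardize the inputs so a learning rate of 0.1 is stable.
x_n = (x - x.mean()) / x.std()

w = np.random.rand(1, 1)  # weight
b = np.random.rand(1)     # bias
lr = 0.1
n = len(x_n)

for i in range(100):
    pred = x_n @ w + b            # forward pass
    err = pred - y
    loss = (err ** 2).mean()      # mean (not summed) squared error
    grad_w = 2 * x_n.T @ err / n  # gradient of MSE w.r.t. w
    grad_b = 2 * err.sum(0) / n   # gradient of MSE w.r.t. b
    w -= lr * grad_w
    b -= lr * grad_b

print(loss)  # small and decreasing instead of exploding
```

The same change can be made in the Tensor version by scaling x before wrapping it in a Tensor, or by shrinking alpha to compensate for the unnormalized inputs.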

HectorPulido · Jul 06 '20 06:07