
pytorch autograd linear regression problem

scially opened this issue 7 years ago • 1 comment

I'm following section 3.2 of the book, but w and b always come out as NaN. My PyTorch version is 1.0.

import torch as t
from matplotlib import pyplot as plt
%matplotlib inline
t.manual_seed(1000)
def get_x_y():
    size = 30
    x = t.rand(size, 1) * 20
    y = x * 2 + (1 + t.rand(size,1)) * 3
    return x, y

w = t.rand(1,1, requires_grad=True)
b = t.rand(1,1, requires_grad=True)
for _ in range(100):
    x, y = get_x_y()
    y_pred = x * w + b

    loss = 0.5 * (y_pred - y) ** 2
    loss = loss.sum()

    loss.backward()
    
    # check whether the gradient is correct
    # print(x.t().mm(y_pred - y))
    # print(w.grad)
    w.data.sub_(0.001 * w.grad.data)
    b.data.sub_(0.001 * b.grad.data)
    w.grad.data.zero_()
    b.grad.data.zero_()
print(w, b)
tensor([[nan]], requires_grad=True) tensor([[nan]], requires_grad=True)

scially avatar Apr 02 '19 15:04 scially
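One way to see what is going wrong (my own analysis, not from the book): for the quadratic loss 0.5 * sum((w*x + b - y)**2), plain gradient descent on w is stable only when lr < 2 / sum(x**2). With 30 samples of x drawn from U(0, 20), sum(x**2) is around 4000, so the stability bound is roughly 0.0005 — below the 0.001 used in the loop above. A quick NumPy check (the seed and sample sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 20, size=30)

# For loss = 0.5 * sum((w*x + b - y)**2), the curvature along w is sum(x**2),
# so gradient descent on w is stable only when lr < 2 / sum(x**2).
curvature = np.sum(x ** 2)
stable_lr = 2.0 / curvature
print(f"sum(x^2) ~= {curvature:.0f}, stable lr < {stable_lr:.5f}")
# lr = 0.001 exceeds this bound, so the iterates oscillate with
# growing amplitude until w and b overflow to NaN.
```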

When I change the loss function to:

 loss = t.mean((y_pred - y) ** 2)

it converges.
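That fits the divergence explanation: mean() divides the gradient by the sample count, which shrinks the effective step size enough for lr = 0.001 to be stable. A minimal sketch in plain NumPy (my own, mirroring the loop above; I use a 0.5 * mean(...) variant, since the constant factor only rescales the effective step) showing both reductions side by side:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 20, size=30)
y = 2 * x + (1 + rng.uniform(size=30)) * 3  # same generating process as above

def fit(lr, reduce_mean):
    """Plain gradient descent on w, b; mirrors the autograd loop above."""
    w, b = 0.5, 0.5
    n = len(x) if reduce_mean else 1
    for _ in range(100):
        resid = w * x + b - y
        # d/dw of 0.5 * sum(resid**2) is sum(x * resid);
        # the mean reduction divides this by n
        w -= lr * np.sum(x * resid) / n
        b -= lr * np.sum(resid) / n
    return w, b

w_sum, _ = fit(0.001, reduce_mean=False)  # diverges: step is ~30x too big
w_mean, _ = fit(0.001, reduce_mean=True)  # converges toward slope ~2
print(w_sum, w_mean)
```

With the sum reduction, |w| blows up by a constant factor every step; with the mean reduction, w settles near the true slope of 2 (b converges much more slowly, since its curvature is far smaller than w's).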

scially avatar Apr 03 '19 15:04 scially