
There is no example for Stochastic Gradient Descent in Chapter 8

chengjun opened this issue 7 years ago · 1 comment

There is no example for Stochastic Gradient Descent in Chapter 8. I have tried to write one:

print("using minimize_stochastic_batch")

x = list(range(101))
y = [3*x_i + random.randint(-10, 20) for x_i in x]
theta_0 = random.randint(-10, 10)
v = minimize_stochastic(sum_of_squares, sum_of_squares_gradient, x, y, theta_0)

print("minimum v", v)
print("minimum value", sum_of_squares(v))

However, running it raises TypeError: sum_of_squares() takes 1 positional argument but 3 were given

chengjun · Oct 28 '18 08:10

I'm not sure whether this problem is still relevant, but I hope this will help someone.

1. First of all, in the second edition of the book the stochastic gradient descent function was rewritten and a working example was added.

2. Regarding your example, the main problem is that minimize_stochastic calls the target function (here sum_of_squares) with 3 arguments (x_i, y_i, theta), but sum_of_squares takes just one: a list of elements.

minimize_stochastic:

...
value = sum( target_fn(x_i, y_i, theta) for x_i, y_i in data )
...
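The mismatch can be reproduced in isolation: the book's sum_of_squares accepts a single list, so calling it the way minimize_stochastic calls target_fn fails with exactly the error above. A standalone sketch (sum_of_squares reproduced here so the snippet runs on its own):

```python
# The book's one-argument sum_of_squares, reproduced for a standalone demo:
def sum_of_squares(v):
    """v_1 * v_1 + ... + v_n * v_n"""
    return sum(v_i * v_i for v_i in v)

# minimize_stochastic calls target_fn(x_i, y_i, theta) -- three arguments:
try:
    sum_of_squares(2.0, 3.0, [0.5])
except TypeError as e:
    print(e)  # sum_of_squares() takes 1 positional argument but 3 were given
```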

sum_of_squares:

def dot(v, w):
    """v_1 * w_1 + ... + v_n * w_n"""
    return sum(v_i * w_i
               for v_i, w_i
               in zip(v, w))

def sum_of_squares(v):
    """v_1 * v_1 + ... + v_n * v_n"""
    return dot(v, v)

oogl · Jun 24 '19 14:06