bootcamp_machine-learning
Day00Ex02 Example - extra example for loss_elem_ if needed
- Day: 00
- Exercise: 06 (intra) / 07 (GitHub)
Example 3 seems off to me, but the problem could come from my code.
The subject shows:
x2 = np.array([[0.2, 2., 20.], [0.4, 4., 40.], [0.6, 6., 60.], [0.8, 8., 80.]])
theta2 = np.array([[0.05], [1.], [1.], [1.]])
y_hat2 = predict_(x2, theta2)
y2 = np.array([[19.], [42.], [67.], [93.]])
# Example 3:
print(loss_elem_(y2, y_hat2))
# Output:
# array([[10.5625], [6.0025], [0.1225], [17.2225]])
# Example 4:
print(loss_(y2, y_hat2))
# Output:
# 4.238750000000004
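For reference, here is a minimal sketch of predict_ that reproduces these values; I assume it prepends an intercept column of ones to x and returns the dot product with theta, as in the earlier prediction exercise:

import numpy as np

def predict_(x, theta):
    # Prepend a column of ones, then compute X' . theta
    x_prime = np.hstack((np.ones((x.shape[0], 1)), x))
    return x_prime.dot(theta)

With x2 and theta2 above, predict_(x2, theta2) gives array([[22.25], [44.45], [66.65], [88.85]]), and (y_hat2 - y2) ** 2 gives exactly the Example 3 values from the subject.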
And personally, I get the following result:
# Example 3:
print(loss_elem_(y2, y_hat2))
array([[1.3203125], [0.7503125], [0.0153125], [2.1528125]])
My code to reproduce my results:
import numpy as np

def loss_elem_(y, y_hat):
    # I apply the 1/(2m) factor to each squared error individually
    cost_func = lambda y, y_, m: (1 / (2 * m)) * (y - y_) ** 2
    res = np.array([cost_func(i, j, len(y)) for i, j in zip(y, y_hat)])
    return res

def loss_(y, y_hat):
    # The 1/(2m) factor is already included element-wise, so a plain sum is enough
    return np.sum(loss_elem_(y, y_hat))
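The subject's Example 3 values are exactly my element-wise values multiplied by 2m = 8, so the intended split seems to be that loss_elem_ returns the raw squared errors and loss_ applies the 1/(2m) factor once. A minimal sketch of that interpretation (my assumption; the subject does not spell out where the factor goes):

import numpy as np

def loss_elem_(y, y_hat):
    # Raw squared error per sample, no 1/(2m) factor here
    return (y_hat - y) ** 2

def loss_(y, y_hat):
    # Apply the 1/(2m) factor once, on the sum
    return float(np.sum(loss_elem_(y, y_hat)) / (2 * y.shape[0]))

With y2 and y_hat2 above, this returns array([[10.5625], [6.0025], [0.1225], [17.2225]]) for loss_elem_ and 4.23875 for loss_, matching the subject's expected output.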
Fixed on:
- [ ] Github
- [ ] Gitlab
The example seems to have been removed. I am turning this issue into an improvement suggestion, in case someone wishes to add this one back as a new example.