ML01 ex03: Wrong loss_elem_ examples
- Day: 06
- Exercise: 03
The given examples for loss_elem_ are 10 times greater than they should be.
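For reference, assuming the module's loss is the usual half-MSE (which the subject's loss_ values are consistent with), the scaling factor is 1/(2m), and with m = 5 data points that is exactly 1/10, the factor of the discrepancy:

$$
J(\theta) \;=\; \frac{1}{2m}\sum_{i=1}^{m}\left(\hat{y}^{(i)} - y^{(i)}\right)^2,
\qquad m = 5 \;\Rightarrow\; \frac{1}{2m} = \frac{1}{10}
$$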
The subject says:
# Example 0.1:
lr1.loss_elem_(y, y_hat)
# Output:
array([[710.45867381],
[364.68645485],
[469.96221651],
[108.97553412],
[299.37111101]])
But the loss_ is "only" about 195:
# Example 0.2:
lr1.loss_(y, y_hat)
# Output:
195.34539903032385
With my code, I get this for loss_elem_, and I do get the right loss_:
array([[71.04586738],
[36.46864549],
[46.99622165],
[10.89755341],
[29.9371111 ]])
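For context, here is a minimal sketch of the convention my numbers follow, written as standalone functions rather than the lr1/lr2 methods: loss_elem_ already carries the 1/(2m) factor, so loss_ is simply its sum.

```python
import numpy as np

def loss_elem_(y, y_hat):
    # Element-wise loss: each squared error divided by 2m, so that the
    # total loss is simply the sum of these elements (my convention).
    m = y.shape[0]
    return (y_hat - y) ** 2 / (2 * m)

def loss_(y, y_hat):
    # Total loss: sum of the element-wise losses above.
    return float(np.sum(loss_elem_(y, y_hat)))
```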
The same issue is repeated for lr2. Subject loss_elem_:
# Example 1.2:
lr2.loss_elem_(y, y_hat)
# Output:
array([[486.66604863],
[115.88278416],
[ 84.16711596],
[ 85.96919719],
[ 35.71448348]])
Subject loss:
# Example 1.3:
lr2.loss_(y, y_hat)
# Output:
80.83996294128525
My loss_elem_:
array([[48.66660486],
[11.58827842],
[ 8.4167116 ],
[ 8.59691972],
[ 3.57144835]])
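As a sanity check, summing my loss_elem_ values (copied from the outputs above) reproduces the subject's loss_ in both cases:

```python
import numpy as np

# Values copied from my loss_elem_ outputs above.
elem1 = np.array([[71.04586738], [36.46864549], [46.99622165],
                  [10.89755341], [29.9371111]])
elem2 = np.array([[48.66660486], [11.58827842], [8.4167116],
                  [8.59691972], [3.57144835]])

print(np.sum(elem1))  # ~195.34539903 -> matches the subject's loss_ for lr1
print(np.sum(elem2))  # ~80.83996295  -> matches the subject's loss_ for lr2
```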
Fixed on:
- [ ] GitHub
- [ ] GitLab