
Loss is inconsistent across exercises

JBarmentlo opened this issue 4 years ago • 0 comments

  • Day: 01
  • Exercise: 03 04

In ex03 the MSE loss is expected to be divided by 2 (the "pretty derivative" trick). In ex04 it is not, so calling the loss function from the previous exercise makes the example fail. This contradicts the subject, which says: "You are strongly encouraged to use the class you have implement in the previous exercise." The same loss function should work in both exercises.

Proposed solution:
Use the MSE without the division by two throughout. The division by two is arbitrary (any increasing linear transformation can be applied to the loss without changing the minimizer) and exists only so the derivative looks prettier, which may be confusing to some students.
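To illustrate the inconsistency, here is a minimal sketch (the function names `mse_` and `half_mse_` are hypothetical, not the subject's actual signatures): the two conventions differ by a constant factor of 2, so an example checked against one will fail against the other.

```python
import numpy as np

def mse_(y, y_hat):
    # Plain MSE: mean of squared residuals (the convention proposed here).
    return float(np.mean((y_hat - y) ** 2))

def half_mse_(y, y_hat):
    # MSE divided by 2, as ex03 currently expects (the "pretty derivative" form).
    return mse_(y, y_hat) / 2

y = np.array([1.0, 2.0, 3.0])
y_hat = np.array([1.5, 2.0, 2.5])
print(mse_(y, y_hat))       # ≈ 0.1667
print(half_mse_(y, y_hat))  # ≈ 0.0833, half of the above
```

Both scale the same way with the error, so gradient descent finds the same minimizer; only the reported loss values (and the factor in the gradient) differ.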

Fixed on:

  • [ ] Github
  • [ ] Gitlab

JBarmentlo · Nov 18 '21 13:11