
The use of "log" vs "ln", chapter 3 - Loss

minertom opened this issue 3 years ago • 2 comments

I hope that this is not trivial. I was confused by it for a while so I thought that I should bring it up.

Normally, in engineering, when I see log without a base, it is assumed to be logarithm base 10. Take the following line from the Loss section of chapter 3:

first_summation = torch.log(positive_pred).sum()

Printing this first summation, I noticed the value tensor(-0.1054). Going through the math, I realized that this is not the base-10 log of 0.9, which is roughly -0.046.

Going to the PyTorch documentation, I saw that torch.log "returns a new tensor with the natural logarithm of the elements of input."

Of course, the "From Logits to Probabilities" section shows the relationship, which "kind of" hints at natural logarithms (log base e), but the whole confusion can be avoided by using the symbol "ln" instead of "log".

Do you agree?

Thank you, Tom

minertom commented Dec 03 '20 17:12

Hi Tom,

I see you're moving quite fast :-) Thanks for the feedback - you do have a point - I will put this on my to-do list for the final revision.

I guess each field has its own default for log... in ML, I've never seen log base 10 or log base 2; it is always the natural log. Since I come from a CS background (and it is always base 2 in information theory), it bugs me a bit to see 0.69 for log(2) instead of 1 bit :-)

Best, Daniel

dvgodoy commented Dec 04 '20 19:12

On Fri, Dec 4, 2020 at 11:36 AM Daniel Voigt Godoy wrote:

> I see you're moving quite fast :-)

Well, this is not my first rodeo :-) https://www.facebook.com/photo?fbid=1838315666307553&set=gm.2081015166079337

minertom commented Dec 04 '20 23:12