zhangxinlu16

4 comments by zhangxinlu16

> According to the definition of cross-entropy, the result should be non-negative. Is marian-scorer's result a negative cross-entropy?

> Hi, the per-sentence score is just a sum of log probabilities, so technically a negative cross-entropy. Taking exp(score), you would get the total sentence probability in normal probability space...
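To make the arithmetic in that reply concrete: exponentiating the score recovers the sentence probability, and negating and averaging over tokens gives a non-negative per-word cross-entropy. A minimal sketch with made-up numbers (the score value and token count below are assumptions, not real marian-scorer output):

```python
import math

# Hypothetical per-sentence score as printed by marian-scorer:
# a sum of token-level log-probabilities (natural log), hence <= 0.
score = -12.7       # assumed example value
num_tokens = 9      # assumed sentence length

sentence_prob = math.exp(score)       # back to normal probability space
cross_entropy = -score / num_tokens   # non-negative per-word cross-entropy

print(f"P(sentence) = {sentence_prob:.3e}")
print(f"per-word cross-entropy = {cross_entropy:.3f}")
```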

@emjotde I am confused about the relationship between --mini-batch-fit and --maxi-batch. If I use --mini-batch-fit, what is the use of --maxi-batch?

Thanks for your answer! But if I use --mini-batch-fit, is --mini-batch useless? If so, why does the script use this setting? (--mini-batch-fit -w $WORKSPACE --mini-batch 1000 --maxi-batch 1000)
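One way to picture how the two options might interact is sketched below. This is a conceptual illustration, not Marian's actual implementation: it assumes the maxi-batch acts as a read-ahead buffer of roughly --mini-batch × --maxi-batch sentences that is sorted by length before mini-batches are cut, and that --mini-batch-fit would only change how each mini-batch is sized within that buffer.

```python
def maxi_batches(sentences, mini_batch=1000, maxi_batch=1000):
    """Yield mini-batches from length-sorted read-ahead chunks (conceptual sketch)."""
    buffer_size = mini_batch * maxi_batch          # sentences read ahead at once
    for start in range(0, len(sentences), buffer_size):
        # Sorting by length inside the buffer keeps each mini-batch homogeneous,
        # which reduces padding.
        chunk = sorted(sentences[start:start + buffer_size], key=len)
        # Fixed-size mini-batches here; --mini-batch-fit would instead size each
        # batch dynamically so it fits the workspace given with -w.
        for i in range(0, len(chunk), mini_batch):
            yield chunk[i:i + mini_batch]

# Toy usage with tiny sizes:
batches = list(maxi_batches(["a b", "a", "a b c d", "a b c"], mini_batch=2, maxi_batch=2))
```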