Probabilistic Machine Learning: Advanced Topics
Hi. There is an out-of-place character "q" at the end of the following paragraph in 8.2.2.7. > The Kalman filter can be extended to work with continuous time...
Page 874, Eq. (25.45) and (25.46): the increment of the Wiener process is denoted as $d\omega$, which should be $dw$ to be consistent with the context.
It seems to me that the last term in Eq. 6.145 is missing an expectation operator in front of $\log p(y|\theta)$. Should it not be $\mathbb{E}[\log p(y|\theta)]$? Amazing book ♥️
page 1072 "How ~~do we~~ does the interpretability happen?" in the penultimate line, "machxine" -> "machine" page 1080 "Banzaf score" -> "Banzhaf score"
In equation (34.101), I believe $y_{1:b}$ should be $y_{1:B}$, as capital $B$ is used elsewhere. Also, in the first line of this section, "a batch of $b$ candidate query points"...
"as well as the interative tutorial" -> "as well as the interactive tutorial"
I believe the highlighted $\lambda_j$ in the paragraph below should be replaced with $w_j$, because the discussion is about sparsity of the weights.
It seems that the symbol for the rate parameter of the prior on $\tau$ is inconsistent: the text and Figure 15.3 use $\gamma$, while Equations 15.91 to 15.94 use...
The word "will" is repeated in the sentence below. > this is called a guided particle filter, and will will be better than the bootstrap filter
"approximating" should be "approximation" > then normalize these weights and compute the new approximating to the target posterior