Mixture of Linear Experts does not appear to use EM
https://github.com/probml/pyprobml/blob/master/notebooks/book1/13/mixexpDemoOneToMany.ipynb
The random initialization of the parameters and weights happens inside the E-step (rather than once before the EM loop), so each iteration discards the parameters fitted in the previous M-step instead of building on them.
# E-step:
np.random.seed(iteration)
Wy = 0.1*np.random.randn(D, K)
bias = 0.3*np.random.randn(D, K)
mixweights = np.random.rand(1, K)
normmw = np.linalg.norm(mixweights)
mixweights = mixweights/normmw
sigma2 = 0.1*np.random.randn(1, K)
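
For comparison, here is a minimal sketch of what initializing once before the EM loop could look like, reusing the notebook's variable names; D, K, max_iter, X, y, and the e_step/m_step helpers are assumptions for illustration, not the notebook's actual functions:

import numpy as np

np.random.seed(0)

# Initialize the parameters once, before iterating
Wy = 0.1 * np.random.randn(D, K)
bias = 0.3 * np.random.randn(D, K)
mixweights = np.random.rand(1, K)
mixweights = mixweights / np.linalg.norm(mixweights)
sigma2 = 0.1 * np.random.rand(1, K)  # rand (not randn) so the variances start non-negative

for iteration in range(max_iter):
    # E-step: responsibilities computed from the current parameters
    r = e_step(X, y, Wy, bias, mixweights, sigma2)
    # M-step: parameters refit from the responsibilities and carried into the next iteration
    Wy, bias, mixweights, sigma2 = m_step(X, y, r)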
I observed this same issue today. Running with max_iter = n and iteration = 0 gives the same result as max_iter = n and iteration = n-1.
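
More generally, because np.random.seed(iteration) is called and every parameter is redrawn from scratch on each pass, the fit that comes out of the loop only reflects the last iteration's seed. A toy illustration of that effect (not the notebook's code):

import numpy as np

def final_params(max_iter):
    # Mimics the notebook's loop: reseed and redraw the parameters on every iteration
    for iteration in range(max_iter):
        np.random.seed(iteration)
        Wy = 0.1 * np.random.randn(2, 3)
    return Wy

# Only the last iteration's seed (max_iter - 1) determines the result
np.random.seed(4)
assert np.allclose(final_params(5), 0.1 * np.random.randn(2, 3))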