Gordon Erlebacher
How about we move this to the discussions?
Thank you! Quick question on something I never understood. If a variable is observed, its probability distribution is quite irrelevant, is that not true? However, when estimating a likelihood, its...
Thanks! One question: you wrote:
```julia
@RV letter_ ~ Nonlinear{Sampling}(letter, g=g, n_samples=50, dims=[(2,), (2,), (2,)])
```
Shouldn't this be written as
```julia
@RV letter_ ~ Nonlinear{Sampling}(letter, g=g, n_samples=50, dims=[(2,), (2,)])
```
...
Great. I ran the code, and it now works. But I have a question. You wrote:
```julia
data = Dict(:s => [1.0, 0.0], :l => [0.0, 1.0])
```
Why...
You are correct, @albertpod; I had not specified the dimensionality of the placeholders. I did not copy/paste your code, but instead modified mine. That is how I learn faster...
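For the record, the fix was along these lines (a sketch only; `s` and `l` are the model variables that correspond to the `:s` and `:l` entries of the data dictionary):
```julia
# Declare the placeholders with explicit dims so the graph interfaces
# match the 2-vectors supplied in `data`.
placeholder(s, :s, dims=(2,))  # will be bound to data[:s], e.g. [1.0, 0.0]
placeholder(l, :l, dims=(2,))  # will be bound to data[:l], e.g. [0.0, 1.0]
```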
All good points. Actually, I am not convinced of convergence either; the marginals look fine. To check my marginals over iterations, I would have to output data using a callback function...
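Something like this is what I have in mind (a sketch only; `stepLetter!` stands in for whichever step function ForneyLab generates for my model, and `:letter` for the marginal I want to track):
```julia
# Snapshot the marginal of interest after every variational iteration,
# so convergence can be judged from the whole trajectory rather than
# only the final result.
marginals = Dict()
history = []                          # one marginal snapshot per iteration
for i in 1:n_iterations
    stepLetter!(data, marginals)      # hypothetical generated update step
    push!(history, deepcopy(marginals[:letter]))
end
# If a free-energy evaluator was also generated, one could likewise
# push a freeEnergy(data, marginals) value inside the loop.
```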
Thank you for this explanation. I will try all this out!
I was about to ask for this feature but saw this thread first. I just started using your extension and love it! But I immediately wanted to check my comments...
Any news on this issue? I work on a Mac with an M2 chip and have a similar error. The last line of my stack trace is:
```
Error raised by...
```
I thought the softmax was applied along dim=1; the 0th dimension is the batch dimension.