Omega.jl
Implement `grad(x, ω, ForwardDiff)`
Implement the gradient using ForwardDiff (and/or ReverseDiff).
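A minimal sketch of what this could look like. `linearize`/`unlinearize` are hypothetical helpers (not existing OmegaCore API) that flatten ω's values into a `Vector{<:Real}` and rebuild an ω-like object from such a vector, and `Val{:ForwardDiff}` stands in for whatever dispatch mechanism is chosen; ω's storage would also need to be generic enough to hold ForwardDiff's dual numbers.

```julia
using ForwardDiff

# Sketch only: `linearize`/`unlinearize` are assumed helpers, not real API.
function grad(x, ω, ::Val{:ForwardDiff})
  θ = linearize(ω)                # flatten ω into a vector of reals
  f(v) = x(unlinearize(ω, v))     # run the model/loss x on a rebuilt ω
  ForwardDiff.gradient(f, θ)      # v will carry ForwardDiff.Dual entries
end
```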
We need to decide how to structure LazyOmega and SimpleOmega.
Mutation
In some areas of OmegaCore we rely on the fact that LazyOmega mutates. For instance, look at RejectionSample:
```julia
function condomegasample1(rng,
                          ΩT::Type{OT},
                          y,
                          alg::RejectionSampleAlg) where OT
  @label restart
  ω = ΩT()                  # fresh, empty ω
  ω_ = tagrng(ω, rng)       # attach the rng so lazy lookups can draw from it
  !y(ω_) && @goto restart   # evaluating y fills ω as a side effect
  ω                         # the mutated ω is the sample
end
```
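To make the reliance on mutation concrete, here is a toy stand-in (not the real OmegaCore type, which threads the rng via `tagrng`): the first lookup of an id draws a fresh value and caches it, so evaluating `y(ω_)` populates ω as a side effect, and the ω returned above is already concrete.

```julia
# Toy stand-in for LazyOmega: draw lazily on first access, cache thereafter.
struct MiniLazyOmega
  vals::Dict{Any,Float64}
end
MiniLazyOmega() = MiniLazyOmega(Dict{Any,Float64}())
Base.getindex(ω::MiniLazyOmega, id) = get!(() -> rand(), ω.vals, id)

ω = MiniLazyOmega()
x(ω) = ω[1] + ω[2]   # a "model" touching two exogenous variables
x(ω)                 # first evaluation draws and stores ids 1 and 2
ω.vals               # mutated: now holds concrete values for both ids
```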
There were a few reasons for this initial design:
- For an arbitrary model we can't randomly construct ω up front, because we don't know its dimension; the dimension may even vary with the values of ω itself (see the sketch below). So the solution was to construct ω lazily.
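For example (illustrative, reusing the toy ω above): a geometric-style model consumes a data-dependent number of ω components, so ω's size cannot be fixed in advance.

```julia
# The number of draws consumed depends on the draws themselves.
function ngeom(ω)
  n = 1
  while ω[n] > 0.5   # keep consuming ω components until first "heads"
    n += 1
  end
  n
end

ngeom(MiniLazyOmega())   # ω grows to whatever size this run needed
```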
It would be nice if anything that supported indexing of the form ω[(id, ExogenousVar)] were a valid Omega.
We'll be able to do this once we have robust contextual execution.
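A hedged sketch of that duck-typed contract: any type supporting the pair indexing could stand in for LazyOmega (`ExogenousVar` here is a hypothetical tag type, not the OmegaCore definition).

```julia
struct ExogenousVar end   # hypothetical tag type

# A plain Dict wrapper already satisfies ω[(id, ExogenousVar)].
struct DictOmega
  vals::Dict{Tuple{Any,Type},Float64}
end
Base.getindex(ω::DictOmega, k::Tuple{Any,Type}) = ω.vals[k]

ω = DictOmega(Dict{Tuple{Any,Type},Float64}((1, ExogenousVar) => 0.3))
ω[(1, ExogenousVar)]      # 0.3
```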
Basically we want some function of the form "produce a concrete ω from an rng":

`priorsample(rng, x) = ... ω`
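A minimal sketch under the toy assumptions above: evaluating `x` against a fresh lazy ω draws every exogenous value `x` touches, and the mutated ω is then a concrete prior sample. (The real version would presumably thread `rng` through `tagrng` as in `condomegasample1`; the toy `MiniLazyOmega` just uses the global rng, so `rng` is unused here.)

```julia
using Random

function priorsample(rng, x, ΩT = MiniLazyOmega)
  ω = ΩT()
  x(ω)   # side effect: populates ω with every value x needed
  ω      # now concrete on x's support
end

priorsample(Random.default_rng(), ngeom)   # ω covering all of ngeom's draws
```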