
Implement `grad(x, ω, ForwardDiff)`

zenna opened this issue · 1 comment

Implement gradient using ForwardDiff (and/or ReverseDiff)
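The title's signature suggests dispatching on the AD backend. A minimal sketch, assuming ω has already been realized as a flat vector of exogenous values (the `SimpleOmega`-like case; the vector layout and the `grad` method below are illustrative assumptions, not Omega's actual API):

```julia
using ForwardDiff

# Hedged sketch: treat ω as a flat Vector of exogenous values and let
# ForwardDiff differentiate the random variable x through it.
# Dispatching on ::Module lets the call read grad(x, ω, ForwardDiff).
grad(x, ω::AbstractVector, ::Module) = ForwardDiff.gradient(x, ω)

# Usage: x consumes ω by indexing, the way a random variable would.
x(ω) = sin(ω[1]) * ω[2]
grad(x, [0.3, 1.5], ForwardDiff)   # ≈ [cos(0.3) * 1.5, sin(0.3)]
```

Dispatching on `::Module` matches any module, which is loose, but it keeps the sketch free of extra tag types.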

zenna commented on Aug 26 '20

We need to decide how to structure `LazyOmega` and `SimpleOmega` (sketched below).
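For concreteness, one hedged reading of the two shapes (the names echo `LazyOmega` and `SimpleOmega`, but the fields and methods below are assumptions, not OmegaCore's definitions):

```julia
using Random

# Lazy: a value is sampled and memoized the first time an id is indexed,
# so the dimension never has to be known up front. Indexing mutates vals.
struct LazyOmegaSketch{R<:AbstractRNG}
  rng::R
  vals::Dict{Any,Float64}
end
Base.getindex(ω::LazyOmegaSketch, id) = get!(() -> rand(ω.rng), ω.vals, id)

# Concrete: every value exists up front in a flat vector, which is the
# shape an AD tool like ForwardDiff can differentiate through.
struct SimpleOmegaSketch{V<:AbstractVector}
  vals::V
end
Base.getindex(ω::SimpleOmegaSketch, id::Int) = ω.vals[id]
```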

Mutation

In some areas of OmegaCore we rely on the fact that `LazyOmega` mutates. For instance, look at `RejectionSample`:

```julia
function condomegasample1(rng,
                          ΩT::Type{OT},
                          y,
                          alg::RejectionSampleAlg) where OT
  @label restart
  ω = ΩT()                 # start from an empty ω
  ω_ = tagrng(ω, rng)      # attach the rng so lookups can sample lazily
  !y(ω_) && @goto restart  # evaluating y fills ω by mutation; retry on failure
  ω                        # the mutated ω is the accepted sample
end
```
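The same loop rendered against the `LazyOmegaSketch` above makes the reliance on mutation explicit (plain `while` instead of `@label`/`@goto`; `tagrng` is elided because the sketch stores the rng in the struct):

```julia
using Random

# Hedged demo: y(ω) samples and memoizes every component it touches,
# so the accepted ω is already a concrete trace when the loop exits.
function rejectionsample_sketch(rng, y)
  ω = LazyOmegaSketch(rng, Dict{Any,Float64}())
  while !y(ω)
    ω = LazyOmegaSketch(rng, Dict{Any,Float64}())  # fresh ω after a rejection
  end
  ω
end

ω = rejectionsample_sketch(Random.default_rng(), ω -> ω[1] > 0.5)
ω.vals   # the mutated, now-concrete trace that satisfied the condition
```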

There were a few reasons for this initial design:

  1. For an arbitrary model, we can't eagerly construct a random ω because we don't know its dimension. In fact, the dimension might vary as a function of some of the values of ω itself. So the solution was to construct ω lazily (see the sketch below).
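As a concrete illustration of the varying dimension, here is a hypothetical model (runnable against the `LazyOmegaSketch` above) whose number of consumed components depends on ω's own values:

```julia
# Geometric-style model: how many components it reads is itself random,
# so ω cannot be pre-allocated with a fixed size.
function geometric(ω)
  n = 1
  while ω[n] < 0.5   # each iteration touches a fresh component of ω
    n += 1
  end
  n
end

geometric(LazyOmegaSketch(Random.default_rng(), Dict{Any,Float64}()))
```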

It would be nice if anything that supports `ω[(id, ExogenousVar)]`-style indexing were a valid Omega. We'll be able to do this once we have robust contextual execution.
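A hedged sketch of that duck-typed contract: any type answering `ω[(id, ExogenousVar)]` would count (here `ExogenousVar` is an assumed marker type and `VecOmega` a hypothetical implementation):

```julia
struct ExogenousVar end   # assumed marker; stands in for Omega's tag

struct VecOmega           # hypothetical: any indexable store would do
  vals::Vector{Float64}
end
Base.getindex(ω::VecOmega, key::Tuple{Int,ExogenousVar}) = ω.vals[key[1]]

x(ω) = ω[(1, ExogenousVar())] + ω[(2, ExogenousVar())]
x(VecOmega([0.1, 0.2]))   # 0.3
```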

Basically, we want some function of the form:

```julia
"Produce a concrete ω from rng"
priorsample(rng, x) = ... # body elided; returns a concrete ω
```
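A minimal sketch of that function, assuming the `LazyOmegaSketch` above: run `x` once against a lazy ω so it samples exactly the components it needs, then return the now-concrete trace.

```julia
# Hedged sketch of priorsample: evaluating x fills ω by mutation, so
# every id that x touched has a concrete value afterwards.
function priorsample(rng, x)
  ω = LazyOmegaSketch(rng, Dict{Any,Float64}())
  x(ω)
  ω
end
```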

zenna commented on Jul 12 '21