RxInfer.jl
Missing prediction
I made a custom node and want to check the message it forwards by using it in a model. This is the implementation:
```julia
struct MyNode end

@node MyNode Stochastic [out, in1, in2]

# rule specification
@rule MyNode(:out, Marginalisation) (m_in1::UnivariateNormalDistributionsFamily, m_in2::UnivariateNormalDistributionsFamily) = begin
    min1, vin1 = mean_var(m_in1)
    min2, vin2 = mean_var(m_in2)
    return NormalMeanVariance(min1 + min2, vin1 + vin2)
end

@rule MyNode(:in1, Marginalisation) (m_out::UnivariateNormalDistributionsFamily, m_in2::UnivariateNormalDistributionsFamily) = begin
    min2, vin2 = mean_var(m_in2)
    mout, vout = mean_var(m_out)
    return NormalMeanVariance(mout - min2, vout + vin2)
end

@rule MyNode(:in2, Marginalisation) (m_out::UnivariateNormalDistributionsFamily, m_in1::UnivariateNormalDistributionsFamily) = begin
    min1, vin1 = mean_var(m_in1)
    mout, vout = mean_var(m_out)
    return NormalMeanVariance(mout - min1, vout + vin1)
end

@rule MyNode(:in1, Marginalisation) (q_out::Any, m_in2::UnivariateNormalDistributionsFamily) = begin
    min2, vin2 = mean_var(m_in2)
    return NormalMeanVariance(mean(q_out) - min2, vin2)
end

@rule MyNode(:in2, Marginalisation) (q_out::Any, m_in1::UnivariateNormalDistributionsFamily) = begin
    min1, vin1 = mean_var(m_in1)
    return NormalMeanVariance(mean(q_out) - min1, vin1)
end
```
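As a side note, a rule like this can be sanity-checked in isolation with `@call_rule`, before wiring the node into a model. A minimal sketch, assuming the node and rule definitions above have already been evaluated in a session with RxInfer loaded, and using the same priors as the model below:

```julia
using RxInfer

# Invoke the :out rule directly with two Gaussian input messages.
# With N(2, 1) and N(1, 1) the sum rule should produce N(3, 2).
msg = @call_rule MyNode(:out, Marginalisation) (m_in1 = NormalMeanVariance(2.0, 1.0), m_in2 = NormalMeanVariance(1.0, 1.0))

mean_var(msg)  # (3.0, 2.0)
```

This checks the rule logic itself, independently of how the inference engine decides to factorize the model.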
```julia
@model function My_model(y)
    A ~ NormalMeanVariance(2.0, 1.0)
    B ~ NormalMeanVariance(1.0, 1.0)
    y ~ MyNode(A, B)
end

result = infer(
    model       = My_model(),
    predictvars = (y = KeepLast(),),
)
```
But when I read the prediction for `y` using

```julia
result.predictions[:y]
```

it gives `missing`. As an alternative, I also tried
```julia
result = infer(
    model = My_model(),
    data  = (y = missing,),
)
```
The result is the same.
I will repeat my message from Slack here for better visibility. We may fix this in the future; for now, here is a workaround:
```julia
julia> result = infer(
           model = My_model(),
           data  = (y = UnfactorizedData(missing),),
       )
Inference results:
  Posteriors  | available for (A, B)
  Predictions | available for (y)

julia> result.predictions
Dict{Symbol, NormalMeanVariance{Float64}} with 1 entry:
  :y => NormalMeanVariance{Float64}(μ=3.0, v=2.0)
```
This is an unfortunate detail of the implementation. Because `y` is treated as a data entry, the automatic constraints apply a structured factorization to your model, which means that your hand-written BP rules are never called. Instead, RxInfer attempts to compute the joint marginal over `A` and `B`, but since the data is `missing` there is no information to compute it from, so the inference engine happily returns `missing` as the result.
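As a cross-check of the numbers in the workaround output above: with the priors `A ~ N(2, 1)` and `B ~ N(1, 1)`, the BP message toward `y` is the distribution of the sum, which is exactly what the `:out` rule computes. The arithmetic in plain Julia, with no RxInfer needed:

```julia
# Sum of two independent Gaussians: means add, variances add.
mA, vA = 2.0, 1.0   # prior on A
mB, vB = 1.0, 1.0   # prior on B
my, vy = mA + mB, vA + vB
(my, vy)  # (3.0, 2.0), matching NormalMeanVariance{Float64}(μ=3.0, v=2.0)
```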
Not great at all, but `UnfactorizedData` was an attempt to override the default behavior and force the BP rules. It's referenced here and was a way to fix a broken example where predictions were completely off due to a similar issue.
Hi! We are moving to a cleaner Epic -> Feature -> Task issue hierarchy to better organize our backlog. This issue is currently either underspecified or not tagged appropriately.
To keep this issue open, please do the following within the next 7 days (by 25-11-2025):
- Update/Replace: Ensure the description is clear and actionable.
- Tag Correctly: For Tasks/Features, add the correct label (e.g., feature, task) AND include a link to the Parent Epic or Feature it belongs to.
- For Bugs, add the Bug label. (Bugs do not require a parent link.)
Issues not updated, linked, or tagged correctly by the deadline will be closed and purged.
Thank you for helping us clean up and organize our backlog!