Performance implications of default monomorphization of free effect variables
In the monomorpher, if we find an unconstrained effect variable `x`, we replace it with `Pure`. This is sound and fine.
The potential issue is that if the overall type is `!x`, we have then freely chosen to give the expression the Universe effect, meaning it will receive full algebraic-effect runtime instrumentation. This is bad for both code size and performance.
We could explore how to choose our free effect variables optimally.
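To make the problem concrete, here is a small toy model, not the Flix compiler's actual representation: effects are modeled as finite sets over a hypothetical closed universe of primitive effects (the names `IO`, `Exn`, `NonDet` are illustrative assumptions), with `Pure` as the empty set and `Univ` as the full set. It shows how defaulting a free variable to `Pure` flips to `Univ` when the variable occurs under complement:

```python
# Toy model (NOT the Flix compiler's internals): effects as finite sets
# over a hypothetical closed world of primitive effects.
PRIMS = frozenset({"IO", "Exn", "NonDet"})  # assumed universe, for illustration
PURE = frozenset()   # Pure = no effects
UNIV = PRIMS         # Univ = all effects

def complement(eff):
    """Effect complement relative to the closed universe."""
    return UNIV - eff

def subst_plain(default):
    """Overall type is `... \\ x`: the default is used as-is."""
    return default

def subst_complement(default):
    """Overall type is `... \\ ~x`: the default is complemented."""
    return complement(default)

# Defaulting x := Pure is ideal when x occurs in plain position ...
assert subst_plain(PURE) == PURE
# ... but under complement it yields Univ, so an expression that never
# performs any effect would get full runtime instrumentation:
assert subst_complement(PURE) == UNIV
# A complement-aware choice (x := Univ under ~) would keep it Pure:
assert subst_complement(UNIV) == PURE
```

Under this toy semantics, the "optimal" default depends on the polarity of each occurrence of the variable, which is what makes the choice non-trivial.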
I would like to work on this issue!
Do we have an example program where this happens? or is this more of a theoretical concern?
Just theoretical from my side; maybe the first step is to see if it happens.
Unconstrained effect variables should be pretty rare, and unbound contravariant ones rarer still.
I'll see if I can find a case where this does (or perhaps as may be the case, doesn't) happen
Upon further consideration, I couldn't think of a function (let alone a use) that would have a universal exclusion. It is possible there is a case I'm not considering, but there doesn't seem to be a case where one can remove an arbitrary (but not necessarily total) part of an effect.
I think it's hard to construct an example where this occurs, at least in the general form.
I think

```flix
def example(x: Int32): Int32 \ ~ef = checked_ecast(x)

def main(): Int32 =
    let _ = _ -> example(42);
    42
```
will be given the Universe effect by the monomorpher, but it's also unused, so..
Oh, and also: you HAVE to pick Pure, because of this bug: #10313