`() -> Void` should be haxe.Function
https://try.haxe.org/#FeD45a50
```haxe
import haxe.Constraints.Function;

class Test {
    static function main() {
        trace("Haxe is great!");
    }
}

enum abstract Events(Event<Function>) {
    var Sleep = new Event<() -> Void>(0);
}

abstract Event<T:Function>(Int) {
    public inline function new(name) {
        this = name;
    }
}
```
Error:
```
Test.hx:10: characters 14-38 : error: () -> Void should be haxe.Function
Test.hx:10: characters 14-38 : ... have: Event<() -> ...>
Test.hx:10: characters 14-38 : ... want: Event<haxe.Function>
```
(After playing with the typed listener example from code.haxe.org.)
This is a variance situation:
```haxe
import haxe.Constraints.Function;

abstract Event<T:Function>(Int) { }

function main() {
    var e1:Event<Function>;
    var e2:Event<() -> Void>;
    e1 = e2; // () -> Void should be haxe.Function
}
```
This general case would not be sound because you could potentially put something that isn't () -> Void into e2 by accessing it via e1.
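To make the unsoundness concrete, here is a rough sketch (my addition, not from the issue) using a hypothetical mutable Box container that, unlike the Int-backed Event, actually stores its T:

```haxe
// Hypothetical container, illustrative only -- Event itself only wraps an Int.
class Box<T> {
    public var value:T;
    public function new(value:T) {
        this.value = value;
    }
}

function main() {
    var b2:Box<() -> Void> = new Box(() -> trace("ok"));

    // If Box<() -> Void> were assignable to Box<haxe.Constraints.Function>,
    // the following (rejected) sequence would become possible:
    // var b1:Box<Function> = b2;
    // b1.value = (x:Int) -> trace(x); // store a one-argument function through the Function view
    // b2.value();                     // then call it with no arguments -- unsound
}
```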
However, I think your concrete example should work via top-down inference. It should be equivalent to this code, which also fails at the moment:
```haxe
import haxe.Constraints.Function;

abstract Event<T:Function>(Int) {
    public function new(name) {
        this = name;
    }
}

function main() {
    var e1:Event<Function> = new Event<() -> Void>(0);
}
```
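For comparison, a rough sketch (my addition, assuming standard top-down inference behaviour rather than anything verified in this issue) of the case where the type parameter is simply omitted, so the expected type fills it in and no `() -> Void` vs `haxe.Function` unification is attempted:

```haxe
import haxe.Constraints.Function;

abstract Event<T:Function>(Int) {
    public function new(name) {
        this = name;
    }
}

function main() {
    // T is taken from the expected type Event<Function>, so this should compile --
    // but it also means the concrete () -> Void signature is never recorded.
    var e1:Event<Function> = new Event(0);
}
```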
After many silly years I finally found how to make strictly typed event listeners with Haxe: https://gist.github.com/Simn/1f5f58d5f19466b8d2b7 (from https://github.com/HaxeFoundation/hxnodejs/issues/21). And it's even better with Haxe 4:
```haxe
enum abstract MyEvent<T>(Int) {
    var Sleep:MyEvent<(hours:Int, sound:String) -> Void>;
    var Eat:MyEvent<() -> Void>;
}

class Main {
    static function main() {
        on(Sleep, (hours, sound) -> {});
        on(Eat, () -> {});
    }

    static public function on<T>(e:MyEvent<T>, f:T) {
        switch (e) {
            case Sleep: f(1, "zzz");
            case Eat: f();
        }
    }
}
```
Much wow! So happy! (really need to document it in that article)
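As a side note (my addition, not part of the original comments): the payoff of the pattern above is that each event key carries its listener signature, so mismatched listeners are rejected at compile time. A quick sketch, assuming the MyEvent and on definitions from the snippet above:

```haxe
static function demo() {
    on(Sleep, (hours, sound) -> trace('$hours $sound')); // OK: T = (hours:Int, sound:String) -> Void
    on(Eat, () -> trace("done eating"));                 // OK: T = () -> Void

    // Rejected at compile time: the listener type does not match the event's T.
    // on(Sleep, () -> {});
    // on(Eat, (hours:Int) -> {});
}
```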
About the actual bug: `e1:Event<Function> = new Event<() -> Void>` doesn't help with my problem because the real type will be lost, but I guess this is an inference bug anyway, so it should stay open.