ThreadsX.jl
inference failure
I am not sure if this is to be expected or whether it reflects an underlying issue with Julia, and whether there are any workarounds.
```julia
julia> VERSION
v"1.6.0-beta1.1"

julia> using ThreadsX, Test

julia> @inferred ThreadsX.mapreduce(identity, +, 1:3)
ERROR: return type Int64 does not match inferred return type Any
```
In principle, I can add a type assertion to make the downstream code type-stable (even though we'd use dynamic dispatch internally). But there's a subtle difficulty due to the fact that `Task(f)` is equivalent to `invokelatest`; i.e., the inference done at the current world age may not be compatible with the code called in the child tasks. I think I can use a workaround based on `invoke_in_world`, as discussed in https://discourse.julialang.org/t/can-we-have-inferable-fetch-task/45541, but I haven't had the time to implement it yet.
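In the meantime, the assertion can go at the call site; a minimal sketch, assuming the caller knows the result type:

```julia
using ThreadsX

# Call-site type assertion: the reduction still goes through dynamic dispatch
# internally, but downstream code sees a concrete Int and stays type-stable.
s = ThreadsX.mapreduce(identity, +, 1:3)::Int
```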
Since these are different functions anyway, I think having a method that takes the output type as an argument would be a nice stop-gap solution for the cases where we know the output type.
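For concreteness, a user-side wrapper along those lines could look like this (a hypothetical helper, not an existing ThreadsX method):

```julia
using ThreadsX

# Hypothetical stop-gap: the caller supplies the output type T and we assert
# it on the result, so inference succeeds despite the internal dynamic dispatch.
typed_mapreduce(f, op, T::Type, itr; kwargs...) =
    ThreadsX.mapreduce(f, op, itr; kwargs...)::T

typed_mapreduce(identity, +, Int, 1:3)  # inferred as Int
```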
Friendly bump: as far as I can see, the approach outlined in tkf/InferableTasks.jl should work. Is there a chance of integrating that?
I've stumbled upon the same issue with `ThreadsX.map`, with type inference failing at this point, similar to this and #182. Is there a quick workaround/alternative for this?
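(The call-site assertion mentioned above seems to apply here too; a sketch, assuming the element type is known:)

```julia
using ThreadsX

# Same workaround as for mapreduce: ThreadsX.map still dispatches dynamically
# inside, but asserting the known Vector{Int} keeps downstream code inferable.
ys = ThreadsX.map(x -> 2x, 1:3)::Vector{Int}
```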
My understanding is that development of the JuliaFolds ecosystem is frozen for now, since tkf has discontinued their involvement in the project. Development now continues under https://github.com/JuliaFolds2, which also hosts a fork of ThreadsX. I am not sure to what extent it is still being developed over there, though.
That said, I suspect that the new OhMyThreads.jl may fix these issues. It shares a very similar high-level interface. Check out `OhMyThreads.tmap` for multi-threaded mapping.
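For example, something along these lines should infer (a sketch, assuming OhMyThreads is installed; `tmap`/`tmapreduce` are its threaded counterparts of `map`/`mapreduce`):

```julia
using OhMyThreads, Test

# Threaded map/mapreduce from OhMyThreads; unlike the ThreadsX calls above,
# these are designed to be type-stable, so @inferred should pass here.
@inferred OhMyThreads.tmap(x -> 2x, 1:3)
@inferred OhMyThreads.tmapreduce(identity, +, 1:3)
```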
Many thanks for the tip, OhMyThreads seems to do the job! :)