ErrorException("Variable getindex is not found in Inputs = ...
Hi
When I execute my code, Julia (or ParallelAccelerator) reports that OptFramework failed to optimize the function.
FYI: bi2de() is a function that converts a binary array whose LSB (least significant bit) is on the left into a decimal number.
FYI: You can test bi2de() with [0 1 0 1], e.g. bi2de([0 1 0 1]) -> 10.
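Spelled out with the LSB-first convention, that example is 0*2^0 + 1*2^1 + 0*2^2 + 1*2^3 = 10.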
Here is my code:
using ParallelAccelerator
@acc function bi2de(binaryArray)
    # Option 1 - convert to a 1-D Julia Array via reshape
    #binaryArray = reshape(binaryArray, length(binaryArray));
    # Option 2 - convert to a 1-D Julia Array via linear indexing
    binaryArray = binaryArray[:];
    z = 2 .^ (0:1:length(binaryArray)-1);
    return sum(binaryArray .* z)
end
result = bi2de([0 1 1 1])
println(result)
With option 1 enabled, the code works with no errors or warnings.
But when option 2 is enabled, Julia (or ParallelAccelerator) reports that OptFramework failed to optimize the function:
OptFramework failed to optimize function ##bi2de#13891 in optimization pass ParallelAccelerator.Driver.toParallelIR with error ErrorException("Variable getindex is not found in Inputs = \nbinaryArray, \nStatic Parameters = Base.Zip2{Array{Any,1},Array{Any,1}}(Any[symbol("##T#13903"),symbol("##T#13904")],Any[Int64,Int64])\nReturn type = Int64\nVarDefs\n CompilerTools.LambdaHandling.VarDef(:binaryArray,Array{Int64,1},18,-1)\n CompilerTools.LambdaHandling.VarDef(:z,Array{Int64,1},18,-1)\n CompilerTools.LambdaHandling.VarDef(symbol("##A#13892"),Tuple{Int64,Int64},0,-1)\n CompilerTools.LambdaHandling.VarDef(symbol("##args#13893"),Tuple{Int64,StepRange{Int64,Int64}},0,-1)\n CompilerTools.LambdaHandling.VarDef(symbol(""),Int64,18,0)\n CompilerTools.LambdaHandling.VarDef(symbol(""),Int64,18,1)\n CompilerTools.LambdaHandling.VarDef(symbol(""),Int64,18,2)\n CompilerTools.LambdaHandling.VarDef(symbol(""),Int64,18,3)\n CompilerTools.LambdaHandling.VarDef(symbol(""),StepRange{Int64,Int64},18,4)\n CompilerTools.LambdaHandling.VarDef(symbol(""),Array{Int64,1},18,5)\n CompilerTools.LambdaHandling.VarDef(symbol(""),Int64,18,6)\n")
Is converting a 2-D Julia Array to a 1-D Julia Array with [:] (e.g. A[:]) something that ParallelAccelerator cannot optimize?
I would appreciate your comments.
TL;DR: We don't support linear indexing right now, and fixing that will probably take some time. Thanks for the bug report. I suggest you stick to reshape for now, because what you're doing, I would guess, really isn't the intended purpose of linear indexing. I'd be curious how much linear indexing is actually used. [http://docs.julialang.org/en/release-0.4/devdocs/subarrays/]
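For reference, a minimal sketch of the reshape-based workaround (this is just Option 1 from the report above, under a hypothetical name bi2de_reshape to keep the two versions apart):

using ParallelAccelerator

@acc function bi2de_reshape(binaryArray)
    # reshape flattens to 1-D without linear indexing; per the report above,
    # this path runs without the OptFramework failure
    binaryArray = reshape(binaryArray, length(binaryArray))
    z = 2 .^ (0:1:length(binaryArray)-1)
    return sum(binaryArray .* z)
end

println(bi2de_reshape([0 1 1 1]))  # prints 14

Here is a minimal reproduction of the failing case with Domain IR and Parallel IR debug output enabled: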
using ParallelAccelerator
ParallelAccelerator.DomainIR.set_debug_level(3)
ParallelAccelerator.ParallelIR.set_debug_level(3)
@acc function bi2de(ba)
    ba = ba[:]
end
result = bi2de([0 1 1 1])
println(result)
Below you can see that the return type is correct, but the type of SSAValue(0) is not... it is still 2-D. It might not be that performant, but is it possible, @ninegua, to do a Domain IR conversion from linear to cartesian indexing with an added reshape?
Starting main ParallelIR.from_expr. function = ##bi2de#271 ast = (Inputs = ba, Return type = Array{Int64,1} VarDefs CompilerTools.LambdaHandling.VarDef(Symbol("#self#"),###bi2de#271,0x00,1) CompilerTools.LambdaHandling.VarDef(:ba,Array{Int64,2},0x12,2) CompilerTools.LambdaHandling.VarDef(Symbol("##ba@3#274"),Array{Int64,2},0x02,3) CompilerTools.LambdaHandling.VarDef(Symbol(""),Array{Int64,2},0x12,0) ,:(begin _3 = _2 SSAValue(0) = $(Expr(:select, :(_3::Array{Int64,2}), :($(Expr(:range, 1, 1, :((Base.arraysize)(_3::Array{Int64,2},1)::Int64)))))) _3 = SSAValue(0) return SSAValue(0) end::Array{Int64,1}))
Frankly, I wasn't even aware that array selections could be used to change dimensions. I can add a translation in Domain IR to turn it into reshape. They are not entirely equivalent, but I believe reshape is more flexible.
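As a side note on the "not entirely equivalent" point, one concrete difference for plain Arrays (outside @acc) is that A[:] makes a copy while reshape shares the underlying data; a small sketch:

A = [0 1 1 1]                  # 1x4 2-D array
v1 = A[:]                      # linear indexing: makes a copy
v2 = reshape(A, length(A))     # reshape of an Array: shares data with A
A[1] = 9
println(v1[1])                 # 0 -- the copy is unaffected
println(v2[1])                 # 9 -- the reshaped array aliases A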