# Incorrect type deduction of generic functions with alias type arguments
## Environment

Name | Version
---|---
IDEA version | 2021.2.1 (Community Edition) - Build #IC-212.5080.55
Luanalysis version | v1.3.0
OS | Windows 10
## Preferences

### Lua

Name | Setting
---|---
Language level | Lua 5.2
### Type Safety

Name | Setting
---|---
Strict nil checks | ☑️
Unknown type (any) is indexable | ☑️
Unknown type (any) is callable | ☑️
## What are the steps to reproduce this issue?

Create a file with the following code:

```lua
---@alias MyArray<T> table<number,T>

---@generic Y
---@param arg MyArray<Y>
local function f(arg) end

---@type MyArray<string>
local arg = {"foobar"}

f(arg)
```
## What happens?

An error is reported on the call `f(arg)`:

> Type mismatch. Required: 'MyArray<number>' Found: 'MyArray<string>'
## What were you expecting to happen?

No error: this is a valid call to `f()`. The generic `Y` should be inferred as `string`, but for some reason Luanalysis infers it as `number`.
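Informally, expanding the alias on both sides shows the inference that was expected (a sketch of the intended unification, not a description of Luanalysis internals):

```lua
-- Expected inference, with the alias expanded by hand:
--   parameter: MyArray<Y>      => table<number, Y>
--   argument:  MyArray<string> => table<number, string>
-- Matching the two types should bind Y = string; the number key types
-- already match and should not influence Y.
```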
## Any logs, error output, etc?
None
## Any other comments?

The problem is related to aliases. Luanalysis properly infers the generic `Y` if `f` is declared as follows:
```lua
---@generic Y
---@param arg table<number,Y>
local function f(arg) end
```
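For reference, a minimal sketch of this alias-free workaround in context (the call site is the same as in the reproduction above, where, per the above, `Y` is inferred correctly):

```lua
---@alias MyArray<T> table<number,T>

---@generic Y
---@param arg table<number,Y> -- the alias expanded by hand
local function f(arg) end

---@type MyArray<string>
local arg = {"foobar"}

f(arg) -- no error reported: Y is inferred as string
```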