
MoarVM panics on arrays with bigint indices

Open · sdomi opened this issue 9 months ago · 1 comment

Found this while playing around:

[0] > my $a;
[1] > $a[0xf00000000000001]="meow"
MoarVM panic: Memory allocation failed; could not allocate 8646911284551352336 bytes
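
The requested byte count looks like one 8-byte slot per element up to and including that index (an assumption based purely on the arithmetic, not on the MoarVM source):

say (0xf00000000000001 + 1) * 8;   # 8646911284551352336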

Interestingly, if the bigint is large enough, the problem doesn't occur:

[1] > $a[0xf000000000000011]="meow"
Cannot unbox 64 bit wide bigint into native integer. Did you mix int and Int or literals?
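
Presumably the larger index no longer fits in a signed 64-bit native integer, so Rakudo refuses to unbox it before MoarVM ever attempts an allocation:

say 0xf00000000000001  > 2**63 - 1;   # False: fits in an int64, so the allocation is attempted
say 0xf000000000000011 > 2**63 - 1;   # True: too big to unbox, hence the error above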

I'm also getting this (quite expected, tbh) behavior on smaller ints:

[0] > my $a
(Any)
[1] > $a[0xffff]="a"
a
[2] > $a[0xffffffff]="a"
Killed
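
A rough back-of-the-envelope check suggests why: indexing at 0xffffffff implies roughly a 32 GiB allocation, and "Killed" is presumably the OS OOM killer stepping in rather than MoarVM itself:

say (0xffffffff + 1) * 8;   # 34359738368 bytes, about 32 GiB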

It would be nice if Moar could detect that it is about to be killed for allocating that many bytes, and either abort cleanly or allocate a sparse array behind the scenes.
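
(As a possible userland workaround in the meantime, not a fix for the panic itself, a Hash keyed by Int behaves like a sparse array:)

my %sparse{Int};                        # object hash with Int keys
%sparse{0xf00000000000001} = "meow";    # stores one entry, no giant allocation
say %sparse{0xf00000000000001};         # meow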

sdomi commented Mar 29 '25 16:03

cf. https://stackoverflow.com/a/54533852/1077672

For example, quoting from that answer:

But before discussing bigger arrays, running the code below on tio.run shows the practical limits for the default array type on that platform:

my @array;
@array[2**29]++; # works
@array[2**30]++; # could not allocate 8589967360 bytes
@array[2**60]++; # Unable to allocate ... 1152921504606846977 elements
@array[2**63]++; # Cannot unbox 64 bit wide bigint into native integer

The "could not allocate 8589967360 bytes" error is a MoarVM panic. It's a result of tio.run refusing a memory allocation request.

I think the "Unable to allocate ... elements" error is a Raku-level exception that's thrown as a result of exceeding some internal Rakudo implementation limit.
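
That reading is consistent with the number in the message: 1152921504606846977 is exactly 2**60 + 1, i.e. the requested element count rather than a byte count from a failed malloc:

say 2**60 + 1;   # 1152921504606846977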

The last error message shows the indexing limit for the default array type even if vast amounts of memory were made available to programs.
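
That limit appears to be the signed 64-bit native integer range: an index of 2**63 can never be unboxed into a native int, no matter how much memory is available:

say 2**63 - 1;           # 9223372036854775807, the largest native int64 value
say 2**63 > 2**63 - 1;   # True, so a 2**63 index cannot be represented as a native index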

raiph commented Mar 29 '25 22:03