
Cannot allocate more than 4095 MB

mor-o opened this issue 1 year ago

I am running an RTX 3090 with 24 GB of VRAM. Trying to allocate 4095 MB gives the following error:

memtestG80.exe 4095 1
     -------------------------------------------------------------
     |                      MemtestG80 v1.00                     |
     |                                                           |
     | Usage: memtestG80 [flags] [MB GPU RAM to test] [# iters]  |
     |                                                           |
     | Defaults: GPU 0, 128MB RAM, 50 test iterations            |
     | Amount of tested RAM will be rounded up to nearest 2MB    |
     -------------------------------------------------------------

      Available flags:
        --gpu N ,-g N : run test on the Nth (from 0) CUDA GPU
        --license ,-l : show license terms for this build

Running 1 iterations of tests over 4096 MB of GPU memory on card 0: NVIDIA GeForce RTX 3090

Running memory bandwidth test over 20 iterations of 2048 MB transfers...
        Test failed!
Test iteration 1 (GPU 0, 4096 MiB): 0 errors so far
        Moving Inversions (ones and zeros): 4294967295 errors (125 ms)
        Memtest86 Walking 8-bit: 4294967288 errors (0 ms)
        True Walking zeros (8-bit): 4294967288 errors (0 ms)
        True Walking ones (8-bit): 4294967288 errors (0 ms)
        Moving Inversions (random): 4294967295 errors (0 ms)
        Memtest86 Walking zeros (32-bit): 4294967264 errors (0 ms)
        Memtest86 Walking ones (32-bit): 4294967264 errors (0 ms)
        Random blocks: 4294967295 errors (0 ms)
        Memtest86 Modulo-20: 4294967276 errors (0 ms)
        Logic (one iteration): 4294967295 errors (0 ms)
        Logic (4 iterations): 4294967295 errors (0 ms)
        Logic (shared memory, one iteration): 4294967295 errors (0 ms)
        Logic (shared-memory, 4 iterations): 4294967295 errors (0 ms)

Final error count after 1 iterations over 4096 MiB of GPU memory: 4294967181 errors
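
As an aside (my own arithmetic, not part of the memtestG80 output): every per-test error count above is within a few units of 2**32 - 1, and the 4095 MB request is rounded up to 4096 MiB, which is exactly 2**32 bytes:

    # Quick check of the arithmetic behind the numbers in the failing run.
    # 4294967295 is the most common per-test error count reported above.
    assert 4294967295 == 2**32 - 1
    # 4095 MB is rounded up to 4096 MiB, i.e. exactly 2**32 bytes.
    assert 4096 * 1024 * 1024 == 2**32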

Allocating one MB less (4094 MB), however, works just fine:

./memtestG80.exe 4094 1
     -------------------------------------------------------------
     |                      MemtestG80 v1.00                     |
     |                                                           |
     | Usage: memtestG80 [flags] [MB GPU RAM to test] [# iters]  |
     |                                                           |
     | Defaults: GPU 0, 128MB RAM, 50 test iterations            |
     | Amount of tested RAM will be rounded up to nearest 2MB    |
     -------------------------------------------------------------

      Available flags:
        --gpu N ,-g N : run test on the Nth (from 0) CUDA GPU
        --license ,-l : show license terms for this build

Running 1 iterations of tests over 4094 MB of GPU memory on card 0: NVIDIA GeForce RTX 3090

Running memory bandwidth test over 20 iterations of 2047 MB transfers...
        Estimated bandwidth 401372.55 MB/s

Test iteration 1 (GPU 0, 4094 MiB): 0 errors so far
        Moving Inversions (ones and zeros): 0 errors (78 ms)
        Memtest86 Walking 8-bit: 0 errors (484 ms)
        True Walking zeros (8-bit): 0 errors (250 ms)
        True Walking ones (8-bit): 0 errors (250 ms)
        Moving Inversions (random): 0 errors (63 ms)
        Memtest86 Walking zeros (32-bit): 0 errors (1000 ms)
        Memtest86 Walking ones (32-bit): 0 errors (1031 ms)
        Random blocks: 0 errors (62 ms)
        Memtest86 Modulo-20: 0 errors (1578 ms)
        Logic (one iteration): 0 errors (32 ms)
        Logic (4 iterations): 0 errors (31 ms)
        Logic (shared memory, one iteration): 0 errors (31 ms)
        Logic (shared-memory, 4 iterations): 0 errors (31 ms)

Final error count after 1 iterations over 4094 MiB of GPU memory: 0 errors

Initially I thought something was wrong with my GPU, so I wrote a small Python snippet that allocates VRAM gradually with PyOpenGL, and it succeeds in allocating the full 24 GB I have on the card (rough sketch below). Any idea what's going on here?
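
The probe was along these lines (a rough sketch rather than the exact script; the glfw-based context creation, the 256 MB chunk size, and the 24 GB cap are illustrative choices):

    import glfw
    from OpenGL.GL import (
        glGenBuffers, glBindBuffer, glBufferData, glGetError,
        GL_ARRAY_BUFFER, GL_STATIC_DRAW, GL_NO_ERROR,
    )

    CHUNK_MB = 256           # allocate VRAM in 256 MB steps
    MAX_MB = 24 * 1024       # stop at the card's nominal 24 GB

    def main():
        if not glfw.init():
            raise RuntimeError("glfw.init() failed")
        glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # hidden window, only needed for a GL context
        window = glfw.create_window(64, 64, "vram-probe", None, None)
        glfw.make_context_current(window)

        allocated_mb = 0
        buffers = []
        while allocated_mb < MAX_MB:
            buf = glGenBuffers(1)
            glBindBuffer(GL_ARRAY_BUFFER, buf)
            # data=None reserves storage without uploading anything; some drivers
            # commit VRAM lazily, so this is only a coarse probe.
            glBufferData(GL_ARRAY_BUFFER, CHUNK_MB * 1024 * 1024, None, GL_STATIC_DRAW)
            if glGetError() != GL_NO_ERROR:
                print(f"Allocation failed after {allocated_mb} MB")
                break
            buffers.append(buf)
            allocated_mb += CHUNK_MB
            print(f"Allocated {allocated_mb} MB so far")

        glfw.terminate()

    if __name__ == "__main__":
        main()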

mor-o · Mar 30 '23, 19:03