missing fflags in stream benchmark definition?
https://github.com/LLNL/benchpark/blob/b3e09e637a8043f2fb051b83de3e0b2af67f4f05/experiments/stream/openmp/ramble.yaml#L44
The stream Makefile has both CFLAGS and FFLAGS, and the performance-relevant code is in Fortran files, so I would assume we need fflags here too (unless spack populates the fflags with a copy of the cflags ¯\_(ツ)_/¯).
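For illustration, a minimal sketch of what the spec entry in that ramble.yaml could look like with fflags added (the version is elided in the logs below, shown here as X.Y, and the surrounding keys are abbreviated, so they may not match benchpark's exact layout):

```yaml
# Hypothetical sketch, not the actual benchpark config; the version X.Y and
# the surrounding key names are placeholders:
spack:
  packages:
    stream:
      spack_spec: stream@X.Y +openmp stream_array_size=80000000 ntimes=20 cflags=-mcmodel=large fflags=-mcmodel=large
```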
I tried adding fflags (via `cflags="-mcmodel=large" fflags="-mcmodel=large"`), but this fails with:
```
==> Defining Spack variables
==>
==> *******************************************
==> ********** Running Spack Command **********
==> ** command: /vol0005/mdt3/data/ra000020/u10016/benchpark.fj/test.fj/spack/bin/spack
==> ** with args: ['find', '--format={name}', '[email protected]', '+openmp', 'stream_array_size=80000000', 'ntimes=20', 'cflags="-mcmodel=large"', 'fflags="-mcmodel=large"', '%[email protected]']
==> *******************************************
==>
==> Error: No package matches the query: [email protected]%[email protected] cflags='"-mcmodel=large"' fflags='"-mcmodel=large"' +openmp ntimes=20 stream_array_size=80000000
==> Error: Command exited with status 1:
'/vol0005/mdt3/data/ra000020/u10016/benchpark.fj/test.fj/spack/bin/spack' 'find' '--format={name}' '[email protected]' '+openmp' 'stream_array_size=80000000' 'ntimes=20' 'cflags="-mcmodel=large"' 'fflags="-mcmodel=large"' '%[email protected]'
```
Dropping the " from the line ( -> cflags=-mcmodel=large fflags=-mcmodel=large
) compiles successfully which could be related to https://github.com/GoogleCloudPlatform/ramble/issues/436 or something else ¯\(ツ)/¯
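To make the contrast concrete, a sketch of the two spec strings as they would sit in the yaml (version again elided as X.Y; only the quoting differs):

```yaml
# Fails: the embedded quotes become part of the flag value, so spack searches
# for cflags='"-mcmodel=large"' and no package matches the query:
spack_spec: stream@X.Y +openmp cflags="-mcmodel=large" fflags="-mcmodel=large"

# Builds: without the quotes the flag values match what spack expects:
spack_spec: stream@X.Y +openmp cflags=-mcmodel=large fflags=-mcmodel=large
```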
I think this is actually missing from the Spack stream package definition (I see it does not set `-mcmodel` as a Fortran flag).
> unless spack populates the fflags with a copy of the cflags

It does not.
Dropping the " from the line ( -> cflags=-mcmodel=large fflags=-mcmodel=large ) compiles successfully which could be related to https://github.com/GoogleCloudPlatform/ramble/issues/436
It looks like that issue was closed as fixed 2 days ago - can you confirm if that resolves this secondary issue you mentioned?