
Investigations for conditioned scenarios (needed for the Aristotle project)

micheles opened this issue 9 months ago

The feasibility of generic conditioned scenarios must be assessed, because a naive approach can easily require on the order of 46.5 TB of memory. The problem is that the memory occupation is QUARTIC in the maximum distance: the number of sites grows with the square of the radius, and the tau and phi matrices are quadratic in the number of sites. Suppose maximum_distance=300 km (the current value) and suppose the site model has a resolution of 1 km (it could be even finer). The number of sites in a disk of radius 300 km is 3.14 * 300^2 = 282_600, so the size of the matrices tau and phi is

size = 10 gsims x 4 imts x 282_600^2 site pairs x 8 bytes x 2 matrices ≈ 46.5 TB
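As a sanity check, the estimate above can be reproduced with a few lines of Python (the numbers are taken from the text of this issue, not measured from a real run):

```python
import math

# Back-of-the-envelope estimate of the tau/phi matrix memory footprint
n_gsims = 10        # GSIMs per TRT (Europe has more than 10)
n_imts = 4          # intensity measure types
max_dist_km = 300   # maximum_distance
spacing_km = 1      # site model resolution

# sites in a disk of radius max_dist_km
n_sites = math.pi * max_dist_km**2 / spacing_km**2

# float64 matrices of shape (n_sites, n_sites), one tau and one phi
size_bytes = n_gsims * n_imts * n_sites**2 * 8 * 2

print(round(n_sites))        # ~282_743 sites
print(size_bytes / 2**40)    # ~46.5 (TiB)
```

Note that the total is quartic in max_dist_km: doubling the maximum distance multiplies the memory by 16.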

These are reasonable numbers for the GSIMs (we have more than 10 GSIMs per TRT in Europe) and the IMTs. The problem can be solved by setting region_grid_spacing large enough, but how exactly do we determine it?
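One possible way to pick region_grid_spacing is to invert the memory formula for a given budget. The helper below is a hypothetical sketch, not engine code; only the parameter names mirror the job.ini settings:

```python
import math

def min_grid_spacing(budget_bytes, max_dist_km=300, n_gsims=10, n_imts=4):
    """Coarsest grid spacing (km) whose tau/phi matrices fit the budget.

    Hypothetical helper, inverting:
      budget = n_gsims * n_imts * n_sites**2 * 8 bytes * 2 matrices
      n_sites = pi * max_dist_km**2 / spacing**2
    """
    n_sites = math.sqrt(budget_bytes / (n_gsims * n_imts * 8 * 2))
    return max_dist_km * math.sqrt(math.pi / n_sites)

print(min_grid_spacing(64 * 2**30))  # ~5.2 km for a 64 GiB budget
```

Since the spacing scales only with the fourth root of the inverse budget, even a large increase in available memory buys a modest refinement of the grid.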

The other thing to assess is the dependency on the random part: do all simulations give essentially the same GMF, so that we can set number_of_ground_motion_fields=1, or not? Is there a strong dependency on ses_seed? Can we set truncation_level=0 and number_of_ground_motion_fields=1? In that case, can we parallelize by GSIM?
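To illustrate why truncation_level=0 would make number_of_ground_motion_fields=1 sufficient, here is a toy model (not engine code) of GMF simulation, with clipping as a crude stand-in for sampling from a truncated normal: with the residual epsilons truncated at zero, every simulated field collapses to the median and the ses_seed becomes irrelevant.

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # plays the role of ses_seed

def simulate_gmf(mu, tau, phi, truncation_level, n_fields):
    """Toy GMF simulator: gmf = exp(mu + tau*eps_b + phi*eps_w)."""
    fields = []
    for _ in range(n_fields):
        eps_b = rng.standard_normal()          # between-event residual
        eps_w = rng.standard_normal(mu.shape)  # within-event residuals
        if truncation_level == 0:
            # truncation at zero kills the random part entirely
            eps_b, eps_w = 0.0, np.zeros_like(eps_w)
        else:
            eps_b = np.clip(eps_b, -truncation_level, truncation_level)
            eps_w = np.clip(eps_w, -truncation_level, truncation_level)
        fields.append(np.exp(mu + tau * eps_b + phi * eps_w))
    return np.array(fields)

mu = np.full(5, -1.0)  # log median ground motion at 5 sites
gmfs = simulate_gmf(mu, tau=0.4, phi=0.6, truncation_level=0, n_fields=3)
print(np.allclose(gmfs[0], gmfs[1]) and np.allclose(gmfs[1], gmfs[2]))  # True
```

With truncation_level > 0 the fields differ across simulations and across seeds, so the question of how strongly the results depend on ses_seed remains an empirical one.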

micheles · May 17 '24 04:05