Version 6 (modified by daley, 10 years ago)

Email from Klaus

FLASH Performance Experiment Notes

WD_def problem

We generated a weak scaling curve (WD_weakscaling.pdf) on BG/P using data from 4416, 6144, 8192 processor runs:

Updated numbers, all tests run with updated code, with Paramesh4dev:

Nblocks(ini)  Nprocs  t_evo  tref  max#blk/proc  mem/proc[GiB]
110387        4416    573.8  32.5  29            .5
110387        8192    374.1  24.8  17            .5
154419        6144    609.8  47.5  29            .5
203443        8192    633.9  55.2  29            .5

t_evo is the total evolution time.
Rows 1, 3, and 4 constitute the weak scaling curve to be.
Rows 1 and 2 test strong scaling.
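As a sketch (not part of the original notes), the standard scaling metrics can be computed directly from the table above. The block counts and times are copied verbatim; the efficiency and speedup formulas are the usual textbook definitions, assumed here rather than taken from the notes:

```python
# (nblocks_ini, nprocs, t_evo) for rows 1-4 of the table above
rows = [
    (110387, 4416, 573.8),  # row 1
    (110387, 8192, 374.1),  # row 2
    (154419, 6144, 609.8),  # row 3
    (203443, 8192, 633.9),  # row 4
]

# Weak scaling (rows 1, 3, 4): blocks per processor is held roughly
# constant (~25), so ideal behaviour is constant t_evo. Efficiency is
# the baseline time divided by the measured time.
t_base = rows[0][2]
for nblk, nprocs, t in (rows[0], rows[2], rows[3]):
    print(f"{nprocs:5d} procs: {nblk / nprocs:5.1f} blocks/proc, "
          f"weak efficiency {t_base / t:.2f}")

# Strong scaling (rows 1, 2): same problem size on more processors,
# so ideal speedup equals the processor ratio.
speedup = rows[0][2] / rows[1][2]   # ~1.53
ideal = rows[1][1] / rows[0][1]     # 8192 / 4416, ~1.86
print(f"strong scaling: speedup {speedup:.2f} vs ideal {ideal:.2f}")
```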

Setup syntax: ./setup WD_def -3d +cube16 -maxblocks=70 +noio -auto +pm4dev -objdir=PM4dev -parfile=[specific flash.par]
Here, "specific flash.par" is:
scaling5_intrepid.par - Row 1 (4416 procs) & Row 2 (8192 procs) in table.
scaling5h_intrepid.par - Row 3 (6144 procs) in table.
scaling6_intrepid.par - Row 4 (8192 procs) in table.

For the weak scaling we attempt to keep the number of blocks per processor constant as we vary the number of processors.

The way to experiment is to use the setup as is and change lrefine_max in flash.par to reach the next problem size. Keep adjusting the number of processors until you get roughly the same number of blocks per processor. This should be easier on the XT4, since it has much more memory per processor. Also, eliminate I/O from the runs; that should free up more memory to work with.
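The "keep blocks per processor roughly constant" step above can be sketched as a small helper. This is a hypothetical function (not part of FLASH or its tooling), using row 1 of the table as the baseline:

```python
def suggest_nprocs(nblocks_new, nblocks_ref=110387, nprocs_ref=4416):
    """Estimate a processor count that keeps blocks/proc near the
    baseline run's value (~25 blocks/proc for the defaults)."""
    blocks_per_proc = nblocks_ref / nprocs_ref
    return round(nblocks_new / blocks_per_proc)

# E.g. for the row-4 problem size; the result lands close to the
# 8192 processors actually used in that run.
print(suggest_nprocs(203443))
```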

However, it seems that for WD_def we do not vary lrefine_max. Chris needs to speak to his work colleague to understand this more thoroughly. For the time being, here is the email describing the weak scaling for WD_def:

On Wed, 1 Oct 2008, Chris Daley wrote:

> Also can you please explain why we change:
> r_match
> refine_uni_radius
> in each flash.par?

THAT is how the size of the problem (i.e., number of blocks) was
changed.  Note that lrefine_min and lrefine_max do not change!

> How do you know the correct value?

It required quite a bit of work to come up with these values.

They were chosen to give problem sizes of (roughly) 100 * 64 * 2^n
total blocks, for integer n; n is the number in scaling<n>, I believe.
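The quoted formula is easy to check against the table. The sketch below just evaluates 100 * 64 * 2^n for a few n; note that the mapping of n to scaling<n> is only the email author's guess, so it is not asserted here:

```python
# Observed total block counts from the table, keyed by parfile name.
observed = {"scaling5": 110387, "scaling5h": 154419, "scaling6": 203443}

# Target sizes from the quoted formula: 100 * 64 * 2**n total blocks.
for n in (4, 5):
    print(f"n={n}: target {100 * 64 * 2**n} blocks")

# For comparison: 110387 is close to the n=4 target (102400), 203443
# is close to the n=5 target (204800), and 154419 sits roughly halfway.
```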