Quantum Gravity Code Performance Results

Parallel MPI+Fortran code to study Path Integral Gravity


Machine                       1 node       16 nodes    256 nodes
----------------------------------------------------------------
IBM SP2 at CTC                 2407 (35)    157 (33)    27 (12)
New IBM SP2 at CTC (4/97)       873 (95)     69 (75)      --
TMC CM5 at NCSA               36340 (2)       --        142 (2)
Convex/HP SPP2000              1170 (71)     81 (64)      --
AENEAS (g77)                   1350 (62)     83 (62)      --
AENEAS (Absoft f77)            1020 (83)     63 (81)      --

 

Wall-clock time in seconds, Mflops per processor in parentheses.
(Mflop rates estimated from the Cray C90 hardware performance monitor, HPM.)
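From the wall-clock times in the table one can read off the parallel speedup and efficiency of the code. A small sketch, using the IBM SP2 at CTC row as input; the helper functions below are illustrative and are not part of the benchmark code itself:

```python
# Parallel speedup and efficiency from the benchmark wall-clock times.
# Example numbers are the "IBM SP2 at CTC" row of the table above.

def speedup(t_serial, t_parallel):
    """Ratio of 1-node time to N-node time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, nodes):
    """Speedup divided by node count (1.0 = perfect scaling)."""
    return speedup(t_serial, t_parallel) / nodes

# 2407 s on 1 node, 27 s on 256 nodes
s = speedup(2407.0, 27.0)           # roughly 89x
e = efficiency(2407.0, 27.0, 256)   # roughly 35%
print(f"speedup = {s:.1f}, efficiency = {e:.1%}")
```

The efficiency obtained this way is consistent with the quoted per-processor Mflop rates (12 Mflops on 256 nodes versus 35 Mflops on 1 node).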
 

Gravitational Monte Carlo Simulation code, Fortran 77 + MPICH

16x16x16x16 lattice, 983040 variables
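The variable count follows from the lattice geometry. A minimal check, assuming 15 edge-length variables per site, as is standard for a 4-dimensional hypercubic lattice divided into simplices (4 principal edges, 6 face diagonals, 4 cube diagonals, and 1 body diagonal per site); the assumption about the per-site count is ours, not stated in the original page:

```python
# 16^4 hypercubic lattice: number of sites times edges per site.
sites = 16 ** 4        # 65536 lattice sites
edges_per_site = 15    # assumed: 4 + 6 + 4 + 1 edge variables per site
print(sites * edges_per_site)  # 983040, matching the quoted variable count
```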

 
Herbert W. Hamber 10/24/97