Cosmology today revolves around two mysterious pillars: dark matter and dark energy. Large-scale sky surveys are the current drivers of precision cosmology and have been instrumental in making fundamental discoveries, cosmic acceleration being the most recent. The next generation of surveys aims to establish a new regime of cosmic discovery by making fundamental measurements of the geometry of the universe and the growth of structure, and by accurately characterizing key quantities such as the dark energy equation of state, the sum of neutrino masses, and the spatial statistics of the distribution of mass.
This project will provide a matching computational effort by running very large cosmological simulations—some of the most challenging to be undertaken worldwide.
We will use the Hardware Accelerated Cosmology Code (HACC) framework to run two sets of simulations: one to build precision emulators that attack the problem of inference, and the other to generate realistic synthetic sky catalogs and simulated observations. Working closely with a number of leading surveys, we will use these products to help constrain a host of systematic uncertainties.
The HACC code can be tuned for performance and scaling on all current HPC systems. On the IBM BG/Q architecture in particular, HACC has reached very high levels of performance, almost 14 petaflops (the highest ever recorded by a science code), with scaling verified to about 1.5 million cores. More recently, HACC has demonstrated excellent scaling on Titan, running on more than 90 percent of the full machine at similar levels of performance.
In addition to HACC, the project rests on two other computational platforms: the Cosmic Calibration Framework, used to build precision prediction tools (or emulators) for survey observables; and pipelines that generate synthetic galaxy catalogs using halo occupation distribution (HOD) models and semi-analytic models.
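The emulator approach can be illustrated with a minimal sketch: sample cosmological parameters on a space-filling design, measure a summary statistic from a simulation run at each design point, and fit a Gaussian process that interpolates the statistic across parameter space, so that new cosmologies can be evaluated without new simulations. The Python sketch below illustrates the idea only and is not the Cosmic Calibration Framework itself; the two-parameter design, the toy_statistic stand-in for a simulation measurement, and the kernel settings are assumptions made for the example.

```python
# Illustrative emulator sketch (not the Cosmic Calibration Framework itself):
# sample parameters on a Latin hypercube design, attach a summary statistic
# "measured" at each design point, and interpolate with a Gaussian process.
# The statistic below is a smooth toy function standing in for a simulation.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical 2-parameter design: (Omega_m, sigma_8) within broad ranges.
lower, upper = [0.25, 0.40], [0.70, 0.90]
lower, upper = [0.25, 0.70], [0.40, 0.90]
design = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(n=40), lower, upper)

# Placeholder for the quantity an emulator would predict, e.g. the matter
# power spectrum amplitude at a fixed scale, measured from each simulation.
def toy_statistic(theta):
    omega_m, sigma_8 = theta[:, 0], theta[:, 1]
    return sigma_8**2 * (omega_m / 0.3) ** 0.55

y_train = toy_statistic(design)

# Gaussian process emulator: squared-exponential kernel with an amplitude.
gp = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=[0.1, 0.1]),
    normalize_y=True,
)
gp.fit(design, y_train)

# Predict the statistic (with uncertainty) at a new cosmology in milliseconds,
# instead of rerunning a full simulation.
theta_new = np.array([[0.31, 0.82]])
mean, std = gp.predict(theta_new, return_std=True)
print(f"emulated statistic: {mean[0]:.4f} +/- {std[0]:.4f}")
```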
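Similarly, halo occupation distribution modelling assigns galaxies to simulated dark matter halos as a function of halo mass, typically with an erf step in log mass for central galaxies and a power law for satellites. The sketch below populates a toy halo catalog in this way; the halo masses and HOD parameter values are placeholders chosen for illustration, not values used by the project's pipelines.

```python
# Illustrative HOD sketch: populate a toy halo catalog with central and
# satellite galaxies as a function of halo mass, using the common
# erf-step / power-law parameterization. All values are placeholders.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(1)

# Toy halo masses in Msun/h (a real catalog would come from a simulation).
halo_mass = 10 ** rng.uniform(11.5, 15.0, size=100_000)

# Hypothetical HOD parameters (log10 masses in Msun/h).
log_Mmin, sigma_logM = 12.0, 0.3        # central threshold mass and width
log_M0, log_M1, alpha = 12.2, 13.3, 1.0  # satellite cutoff, scale, slope

# Mean central occupation: smooth step in log halo mass.
n_cen = 0.5 * (1.0 + erf((np.log10(halo_mass) - log_Mmin) / sigma_logM))

# Mean satellite occupation: power law above the cutoff mass M0.
excess = np.clip(halo_mass - 10 ** log_M0, 0.0, None)
n_sat = n_cen * (excess / 10 ** log_M1) ** alpha

# Sample actual galaxy counts: Bernoulli centrals, Poisson satellites.
centrals = rng.random(halo_mass.size) < n_cen
satellites = rng.poisson(n_sat)

print(f"centrals: {centrals.sum()}, satellites: {satellites.sum()}")
```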