HDF5 and The Big Science of Nuclear Stockpile Stewardship
The article leads with, “In the quarter century since the US last exploded a nuclear weapon, an extensive research enterprise has maintained the resources and know-how needed to preserve confidence in the country’s stockpile.” It goes on to recount how the US Department of Energy (DOE) and its Los Alamos, Sandia, and Lawrence Livermore national laboratories pioneered high-performance computing so that computer simulation could replace the actual building and testing of weapons in the US nuclear stockpile.
Although HDF5 is not named in the article, the histories of The HDF Group and HDF5 are closely linked to this larger story of American science and geopolitics. In 1993, DOE determined that its computing capabilities would require massive improvements; as the article says, the goal was to “ramp up computation speeds by a factor of 10,000 over the highest performing computers at the time, equivalent to a factor of 1 million over computers routinely used for nuclear calculations… To meet the [ten-year] goal, the DOE laboratories had to engage the computer industry in massively parallel processing, a technology that was just becoming available, to develop not just new hardware but new software and visualization techniques.”
As part of this larger national nuclear program and “ramp up” of computation speeds, DOE provided the key funding in the late 1990s that allowed the then-standard HDF4 to evolve into the far more capable HDF5 technology in use today. The article continues, “…The program met its milestones and now exceeds its 10-year goal for computing speed (100 teraflops) by a factor of 200. Meanwhile, massively parallel processing has become the global technology standard for high-performance computing and an integral part of stockpile stewardship and other DOE laboratory programs.”
DOE has continued to partner with The HDF Group, supporting the development of HDF5 through two generations of computing; sponsoring this development has benefited the entire HDF5 user community. Today, DOE supports ongoing HDF5 R&D to ensure that the data challenges of third-generation exascale computing can be handled in the coming decade.
Reference: Victor H. Reis, Robert J. Hanrahan, and W. Kirk Levedahl, “The Big Science of stockpile stewardship,” Physics Today 69(8), 46 (2016); doi: 10.1063/PT.3.3268. Published by AIP Publishing. Available online: http://dx.doi.org/10.1063/PT.3.3268
Link: Trillion Particle Simulation: Finding One in a Trillion
Link: Exascale Computing: The HDF Group’s HPC Program blog
Video by The HDF Group’s founding CEO Mike Folk: https://vimeo.com/37996462 (starting at 00:28:30)