Government, Defense, National Security
Why HDF Technologies?
HDF5 is highly scalable and can handle data objects of any size and number. Because the HDF5 data model is so flexible, it is also well suited to highly heterogeneous data, which is increasingly common as applications need to integrate many kinds of data from many different sources. And because HDF5 runs on virtually every computing platform, HDF-based software can easily be ported across projects, departments, and agencies.
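As a minimal sketch of what that flexibility looks like in practice, the example below uses the h5py Python bindings to store raster imagery, a compound-typed table, and descriptive attributes side by side in one self-describing file. All file, group, and dataset names are illustrative assumptions, not a prescribed layout.

```python
# Minimal sketch of a heterogeneous HDF5 container using h5py.
# File, group, and dataset names are illustrative assumptions.
import h5py
import numpy as np

with h5py.File("mission_data.h5", "w") as f:
    # Large raster imagery, stored chunked and compressed.
    img = f.create_dataset("imagery/scene_001", shape=(10_000, 10_000),
                           dtype="u2", chunks=(512, 512), compression="gzip")
    img.attrs["sensor"] = "example-sensor"   # self-describing metadata travels with the data

    # Tabular data as a compound-typed dataset alongside the imagery.
    track_dtype = np.dtype([("time", "f8"), ("lat", "f8"), ("lon", "f8")])
    f.create_dataset("tracks/platform_A", data=np.zeros(1000, dtype=track_dtype))
```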
The DOE labs maintain some of the world’s biggest and fastest computers: massively parallel machines that can generate a terabyte of data every minute. A critical need within the labs is for data technologies that can sustain the massive I/O rates required to keep up with data generation, and that can store that data in ways that permit effective analysis and visualization as well as long-term archiving. HDF5 was first developed in the late 1990s, primarily with support from the DOE labs, to meet precisely these challenges.
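To make the I/O requirement concrete, here is a hedged sketch of a collective parallel write through HDF5's MPI-IO driver, using h5py and mpi4py. It assumes an HDF5/h5py build with parallel support; the file name, dataset name, and sizes are placeholders.

```python
# Minimal sketch: each MPI rank writes its own slab of a shared HDF5 dataset
# through the MPI-IO driver. Requires parallel-enabled HDF5/h5py and mpi4py;
# names and sizes are illustrative assumptions.
from mpi4py import MPI
import h5py
import numpy as np

comm = MPI.COMM_WORLD
rank, nprocs = comm.Get_rank(), comm.Get_size()
rows_per_rank = 1_000_000   # each rank contributes this many rows

with h5py.File("simulation_output.h5", "w", driver="mpio", comm=comm) as f:
    dset = f.create_dataset("pressure", (nprocs * rows_per_rank, 3), dtype="f8")
    start = rank * rows_per_rank
    # Every rank writes a disjoint contiguous slab; HDF5 routes the I/O through MPI-IO.
    dset[start:start + rows_per_rank, :] = np.random.random((rows_per_rank, 3))
```

Launched with, for example, `mpiexec -n 64 python write_parallel.py`, each rank streams its portion of the output concurrently rather than funneling data through a single writer.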
HDF can handle the growth in size and accretion rates of satellite imagery
Its strengths include the ability to accommodate virtually every kind of data in a single container, to show relationships among data within and across containers, and to encrypt data, as well as the ability to efficiently store and access very large traditional imagery and tabular data.
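The "relationships within and across containers" point maps onto HDF5's link objects. The sketch below (again with h5py, and with purely illustrative names) uses a soft link to tie a scene to a track in the same file and an external link to reference a table kept in a separate file.

```python
# Minimal sketch: expressing relationships with HDF5 soft and external links.
# All file, group, and dataset names are illustrative assumptions.
import h5py

with h5py.File("mission_data.h5", "a") as f:
    imagery = f.require_group("imagery")
    # Relationship within the same container: link a scene to its platform track.
    imagery["scene_001_track"] = h5py.SoftLink("/tracks/platform_A")
    # Relationship across containers: reference a calibration table in another file.
    imagery["scene_001_calibration"] = h5py.ExternalLink("calibration.h5", "/tables/sensor_cal")
```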
How HDF Technologies are Used
- Middleware for MPI/IO
- Data modeling applications
- Visualization and analysis tools using the HDF5 data format
- Exascale computing
- Airborne, satellite, and other geospatial imagery: as high-resolution satellite photography, LiDAR, and similar sources increase the size and accretion rates of imagery, analysts must find data technologies that can accommodate such growth (see the sketch after this list)
- Structuring and processing a variety of new data sources such as social media, mobile sensors, audio, video, and other social information
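For the imagery growth noted above, one common pattern is an extendible, chunked, compressed dataset to which new tiles are appended as they are acquired. The sketch below assumes h5py; the tile size and names are placeholders.

```python
# Minimal sketch: appending newly acquired image tiles to an extendible,
# chunked, compressed HDF5 dataset. Tile size and names are assumptions.
import h5py
import numpy as np

TILE = (1024, 1024)

with h5py.File("imagery_archive.h5", "a") as f:
    if "tiles" not in f:
        dset = f.create_dataset("tiles", shape=(0, *TILE), maxshape=(None, *TILE),
                                dtype="u2", chunks=(1, *TILE), compression="gzip")
    else:
        dset = f["tiles"]
    new_tile = np.random.randint(0, 2**16, size=TILE, dtype=np.uint16)  # stand-in for a real tile
    dset.resize(dset.shape[0] + 1, axis=0)   # grow along the unlimited axis
    dset[-1] = new_tile                       # append the new tile
```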