Technical Insights

Internal compression is one of several powerful features that distinguish HDF5 from other binary formats and make it very attractive for storing and organizing data. Internal HDF5 compression saves storage space and I/O bandwidth, and it allows efficient partial access to data. Chunked storage must be used when HDF5 compression is enabled...
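Chunking and compression go hand in hand in the C API: the deflate (gzip) filter can only be applied to a dataset whose creation property list also sets a chunk shape. A minimal sketch in C, with illustrative file name, dataset name, and chunk sizes:

    #include "hdf5.h"

    int main(void)
    {
        hsize_t dims[2]  = {1024, 1024};
        hsize_t chunk[2] = {64, 64};     /* illustrative chunk shape */

        hid_t file  = H5Fcreate("compressed.h5", H5F_ACC_TRUNC,
                                H5P_DEFAULT, H5P_DEFAULT);
        hid_t space = H5Screate_simple(2, dims, NULL);

        /* Filters such as deflate require chunked layout, so the chunk
         * shape must be set before enabling compression. */
        hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 2, chunk);
        H5Pset_deflate(dcpl, 6);         /* gzip, compression level 6 */

        hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_INT, space,
                                H5P_DEFAULT, dcpl, H5P_DEFAULT);

        /* Each 64x64 chunk is compressed independently, which is what
         * makes efficient partial access to compressed data possible. */

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        return 0;
    }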

Scot Martin, Harvard University, HDF Guest Blogger

HDF5 storage is really interesting. To me, its format has no fixed structure, but instead is based on introspection and discovery. Seems great to me; Mathematica has its origins in artificial intelligence, so we ought to be able to do something here. Approaching twenty-two years with Mathematica and almost a “Hello, World!” ability in C, I decided to jump right in. Enter The HDF Group's P/Invoke for my salvation. Here’s how we make use of it in Mathematica:

    LoadNETAssembly["HDF.PInvoke.dll"]

Bang! Ready to go in Mathematica. Here’s a proof of concept for how it works:

    Module[
      (* The three symbols should have initial values so that there is *)
      (* memory allocation when Mathematica interfaces with P/Invoke.  *)
      {major = 0, minor = 0, revision = 0, return},
      CompoundExpression[
        (* access...
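The call being marshaled here is presumably the HDF5 C function H5get_libversion, given the major/minor/revision outputs; it fills three caller-allocated unsigned integers, which is why the Mathematica symbols need initial values. For comparison, a minimal sketch of the same call made directly from C:

    #include <stdio.h>
    #include "hdf5.h"

    int main(void)
    {
        /* Pre-allocated outputs, mirroring major/minor/revision above. */
        unsigned major = 0, minor = 0, release = 0;

        if (H5get_libversion(&major, &minor, &release) < 0)
            return 1;

        printf("HDF5 library version %u.%u.%u\n", major, minor, release);
        return 0;
    }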

Christian Hoene, Symonics GmbH; and Piotr Majdak, Acoustics Research Institute; HDF Guest Bloggers

Spatial audio: 3D sound. Back in the ’70s, “dummy head” microphones were used to create spatial audio recordings. With headphones, one was able to listen to those recordings and marvel at the impressive spatial distribution of sounds, just like in real life.

[Figure: the difference between listening to a real source and listening to realistic virtual sounds via headphones]

Nowadays, we have a much better understanding of human binaural perception, and we can even simulate spatial audio signals with the help of computers. Indeed, a modern virtual reality (VR) headset such as the Oculus Rift or Samsung Gear utilizes 3D audio to allow...

The HDF Server allows producers of complex datasets to share their results with a wide audience. We used it to develop the Global Fire Emissions Database (GFED) Analysis Tool, a website that guides the user through our dataset. A simple webmap interface allows users to select an area of interest and produce data visualization charts. ...
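HDF Server exposes HDF5 content over a REST API, which is what makes a lightweight webmap front end like this possible: the browser fetches only the selected region of a dataset with a plain HTTP GET. As a rough sketch using libcurl from C, where the host, dataset UUID, and hyperslab selection are all hypothetical and the endpoint follows the h5serv REST API's /datasets/<uuid>/value pattern:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
        CURL *curl = curl_easy_init();
        if (!curl)
            return 1;

        /* Hypothetical endpoint: 'select' limits the read to a hyperslab,
         * so only the user's area of interest travels over the wire. */
        curl_easy_setopt(curl, CURLOPT_URL,
            "http://example.org:5000/datasets/<uuid>/value"
            "?select=[0:100,0:100]");

        CURLcode res = curl_easy_perform(curl); /* body prints to stdout */
        curl_easy_cleanup(curl);
        return res == CURLE_OK ? 0 : 1;
    }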

Mark Miller, Lawrence Livermore National Laboratory, Guest Blogger

The HDF5 library has supported the I/O requirements of HPC codes at Lawrence Livermore National Laboratory (LLNL) since the late ’90s. In particular, HDF5 used in the Multiple Independent File (MIF) parallel I/O paradigm has supported LLNL codes' scalable I/O requirements and has recently been gainfully used at scales as large as 1,000,000 parallel tasks.

What is the MIF Parallel I/O Paradigm? In the MIF paradigm, a computational object (an array, a mesh, etc.) is decomposed into pieces and distributed, perhaps unevenly, over parallel tasks. For I/O, the tasks are organized into groups, and each group writes one file using round-robin exclusive access for the tasks in the group. Writes within groups are serialized but...
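A bare-bones illustration of that round-robin exclusive access, using an MPI message as the baton: each task waits for the previous task in its group, appends its piece to the group's file, and hands the baton on. This is a sketch of the pattern, not LLNL's implementation; the group count and file names are illustrative:

    #include <stdio.h>
    #include <mpi.h>
    #include "hdf5.h"

    #define NGROUPS 4   /* illustrative: one file per group */

    int main(int argc, char **argv)
    {
        int rank, baton = 0;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Split the tasks into NGROUPS groups; each group shares a file. */
        int group = rank % NGROUPS;
        MPI_Comm gcomm;
        MPI_Comm_split(MPI_COMM_WORLD, group, rank, &gcomm);

        int grank, gsize;
        MPI_Comm_rank(gcomm, &grank);
        MPI_Comm_size(gcomm, &gsize);

        /* Wait for the baton from the previous task in this group. */
        if (grank > 0)
            MPI_Recv(&baton, 1, MPI_INT, grank - 1, 0, gcomm,
                     MPI_STATUS_IGNORE);

        char fname[64];
        snprintf(fname, sizeof fname, "part_%d.h5", group);

        /* First task in the group creates the file; the rest append. */
        hid_t file = (grank == 0)
            ? H5Fcreate(fname, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT)
            : H5Fopen(fname, H5F_ACC_RDWR, H5P_DEFAULT);

        /* ... write this task's piece of the decomposed object ... */

        H5Fclose(file);

        /* Pass the baton to the next task in the group. */
        if (grank < gsize - 1)
            MPI_Send(&baton, 1, MPI_INT, grank + 1, 0, gcomm);

        MPI_Comm_free(&gcomm);
        MPI_Finalize();
        return 0;
    }

Writes within a group are serialized by the baton, but the NGROUPS files are written concurrently, which is what lets the pattern scale.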

Dave Pearah, The HDF Group

How can users of open source technology ensure that the open source solutions they depend on every day don’t just survive, but thrive? While on my flight home from New York, I’m reflecting on The Trading Show, which focused on tech solutions for the small but influential world of proprietary and quantitative financial trading. I participated in a panel called “Sharing is Caring,” regarding the industry’s broad use of open source technology. The panel featured a mix of companies that both provide and use open source software. Among the topics: Are cost pressures the only driving force behind the open source movement among trading firms, hedge funds and banks? How will open source solutions shape the future of...

DOE has continued to partner with The HDF Group, supporting development of HDF5 through two generations of computing; sponsoring this development has benefited the entire HDF5 user community. Today, DOE supports current HDF5 R&D to ensure that the data challenges of third-generation, exascale computing ...

MuQun (Kent) Yang, The HDF Group

Many NASA HDF and HDF5 data products can be visualized via the Hyrax OPeNDAP server through Hyrax’s HDF4 and HDF5 handlers.  Now we’ve enhanced the HDF5 OPeNDAP handler so that SMAP level 1, level 3 and level 4 products can be displayed properly using popular visualization tools.

Organizations in both the public and private sectors use HDF to meet long-term, mission-critical data management needs. For example, NASA’s Earth Observing System, the primary data repository for understanding global climate change, uses HDF. Over the lifetime of the project, which began in 1999, NASA has stored 15 petabytes of satellite data in HDF, which will remain accessible to NASA data centers and NASA HDF end users for many years to come.

In a previous blog, we discussed the concept of using the Hyrax OPeNDAP web server to serve NASA HDF4 and HDF5 products. Over the years, The HDF Group has enhanced the HDF4 and HDF5 handlers that work within the Hyrax OPeNDAP framework to support a wide range of NASA HDF data products, making them interoperable with popular Earth Science tools such as NASA’s Panoply and UCAR’s IDV. These handlers ensure that data products display properly in popular visualization tools.

We are excited and pleased to announce HDF5-1.10.0, the most powerful version of our flagship software ever. This major new release is packed with new capabilities that address important data challenges faced by our user community. HDF5 1.10.0 contains many important new features and changes, including those listed below. The features marked with * use new extensions to the HDF5 file format.

The Single-Writer / Multiple-Reader (SWMR) feature enables users to read data while it is concurrently being written. *

The virtual dataset (VDS) feature enables users to access data in a collection of HDF5 files as a single HDF5 dataset and to use the HDF5 APIs to work with that dataset. * (NOTE:...
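For a sense of how SWMR appears in the API: the writer opens the file with the SWMR write flag (or switches an open file into SWMR mode with H5Fstart_swmr_write), and readers open with the matching read flag, with both sides pinned to the latest file-format bounds. A minimal sketch of the writer side, with an illustrative file name:

    #include "hdf5.h"

    int main(void)
    {
        /* SWMR relies on the new 1.10 file format, so request the
         * latest library version bounds on the file access list. */
        hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
        H5Pset_libver_bounds(fapl, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST);

        /* Writer side: open an existing file for single-writer access.
         * Readers would use H5F_ACC_RDONLY | H5F_ACC_SWMR_READ instead. */
        hid_t file = H5Fopen("swmr.h5",
                             H5F_ACC_RDWR | H5F_ACC_SWMR_WRITE, fapl);

        /* ... append to extensible, chunked datasets, calling H5Dflush()
         * after each write so concurrent readers see the new data ... */

        H5Fclose(file);
        H5Pclose(fapl);
        return 0;
    }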