HDF News

HDF5 can be built using two build systems: the Autotools (since HDF5 1.0) and CMake (since HDF5 1.8.5). For a long time, the Autotools were better maintained, and CMake was more of an "alternative" build system that we used primarily for Windows support (the legacy Visual Studio projects were removed in HDF5 1.8.11). That is no longer the case: CMake support in HDF5 is (almost) as good as Autotools support, and CMake in general is much more widely used now than when we first introduced it. So why are we still hanging on to the legacy Autotools?...

The Highly Scalable Data Service (HSDS) runs as a set of containers in Docker (or pods in Kubernetes), and, like all things Docker, each container instance is created from a container image. Unlike, say, a library binary, the container image includes all the dependent libraries needed for the container to run. In this blog post, HSDS senior architect John Readey explains how to get HSDS running in a Docker container or Kubernetes pod and gives some tips and tricks to ensure everything runs smoothly for you. ...
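
For readers who want to verify a fresh deployment, here is a minimal sketch using h5pyd, the h5py-compatible Python client for HSDS. The endpoint, domain path, and credentials below are placeholders of our own, not details from the post; adjust them to match your Docker or Kubernetes setup.

```python
import numpy as np
import h5pyd  # h5py-compatible client for HSDS (pip install h5pyd)

# Placeholder endpoint, domain, and credentials -- adjust to your deployment.
ENDPOINT = "http://localhost:5101"

# Write a small dataset to the service...
with h5pyd.File("/home/admin/smoketest.h5", "w", endpoint=ENDPOINT,
                username="admin", password="admin") as f:
    dset = f.create_dataset("values", (10,), dtype="i4")
    dset[...] = np.arange(10, dtype="i4")

# ...and read it back to confirm the round trip works.
with h5pyd.File("/home/admin/smoketest.h5", "r", endpoint=ENDPOINT,
                username="admin", password="admin") as f:
    print(f["values"][:])  # expect [0 1 2 ... 9]
```

If the round trip succeeds, the service containers, the storage backend, and the client credentials are all wired up correctly.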

Champaign, IL – The HDF Group announced that its Board of Directors has appointed Gerd Heber as its new Executive Director. The HDF Group is a software company dedicated to creating and supporting technologies that address many of today's big data challenges. Dr. Heber, who has been The HDF Group's Applications Architect since 2010, replaces Mike Folk upon his retirement. Folk will remain a member of the Board of Directors. ...

The HDF Group just released HDF5 1.13.1. All releases in the 1.13 series are experimental; they allow us to test new features with our users and gather feedback while we work on the next major maintenance release. Learn more about this new release of HDF5....

The purpose of this introduction is to highlight and celebrate a community contribution whose impact we are just beginning to understand. Its principal author, Mr. Lucas C. Villa Real, calls it HDF5-UDF and describes it as "a mechanism to generate HDF5 dataset values on-the-fly using user-defined functions (UDFs)." This matter-of-fact characterization is quite accurate, but I would like to provide some context for what this means for us users of HDF5....
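
To give a flavor of the mechanism, here is a minimal sketch of a UDF in HDF5-UDF's Python dialect, modeled on the examples in the project's repository. The dataset name and the values computed are illustrative; the `dynamic_dataset` entry point and the injected `lib` helper follow the convention shown in the project's documentation.

```python
# A user-defined function for HDF5-UDF. It is compiled and attached to an
# HDF5 file with the project's hdf5-udf command-line tool; see the
# repository README for the exact invocation.
def dynamic_dataset():
    # `lib` is injected by the HDF5-UDF runtime (it is not imported here):
    # getData() returns a writable buffer backing the virtual dataset,
    # getDims() returns its dimensions.
    data = lib.getData("virtual_squares")
    dims = lib.getDims("virtual_squares")
    for i in range(dims[0]):
        data[i] = i * i  # values materialize only when the dataset is read
```

The key point is that nothing is stored for the dataset itself: any HDF5 application that reads it triggers the UDF, which computes the values on demand.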