HDF News

The HDF Group has been selected to receive a Department of Energy grant to develop a platform where data from different fusion devices is managed according to Findable, Accessible, Interoperable, and Reusable (FAIR) standards and UNESCO's Open Science recommendations. The data will also be adapted for use with machine learning (ML) tools. Led by researchers at MIT, this collaborative project also includes Auburn University, William & Mary, and the University of Wisconsin-Madison....

Matthew Larson has joined The HDF Group as a software developer. Matthew is a recent graduate of the University of Illinois at Urbana-Champaign with a BS in Computer Science. While in school, Matthew held several internships where he worked on machine learning and data collection using AWS resources, and he served as a course associate for the Intro to Computer Science course. At The HDF Group, Matthew will be working closely with John Readey and Aleksander Jelenak on the Highly Scalable Data Service and other cloud technologies. Welcome Matt!...

HDF5 can be built using two build systems: the Autotools (since HDF5 1.0) and CMake (since HDF5 1.8.5). For a long time, the Autotools were better maintained, and CMake was more of an "alternative" build system that we primarily used for handling Windows support (the legacy Visual Studio projects were removed in HDF5 1.8.11). That is no longer the case, though: CMake support in HDF5 is (almost) as good as Autotools support, and CMake in general is much more widely used now than when we first introduced it. So why are we still hanging on to the legacy Autotools?...
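For readers who have only ever used one of the two, here is a rough sketch of what each build looks like from an unpacked release source tree. The install prefix is illustrative, and exact options vary by release; the release notes and install documentation shipped with each HDF5 source distribution are the authoritative reference.

    # Autotools build (from a release tarball)
    ./configure --prefix=/usr/local/hdf5
    make
    make install

    # CMake build (out-of-source, using a separate build directory)
    cmake -S . -B build -DCMAKE_INSTALL_PREFIX=/usr/local/hdf5
    cmake --build build
    cmake --install build

Both paths produce the library and tools; the practical difference is that the Autotools path is Unix-only, while the CMake path works the same way on Windows, macOS, and Linux.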