homepage backup 1-18-23


Learn about the open source foundation of The HDF Group

Scalable Cloud

Put the Highly Scalable Data Service to work for you


Get to know the options for free support and paid contracts

European HDF5 Users Group – May 31 – June 2, 2022

The 2022 European HDF5 Users Group (HUG) was held in person and virtually May 31 – June 2, 2022, at ITER in Saint Paul-lez-Durance, France.

Call the Doctor – Weekly HDF Clinic

Tuesdays at 1:00 p.m. CDT: join us for a series of weekly, unscripted, live events! The HDF Group’s Gerd Heber will try to answer attendee questions and, for example, go over the previous week’s HDF Forum posts. The HDF Clinics are free sessions intended to help users tackle real-world HDF problems, from a common cold to severe headaches, and offer relief where possible. As time permits, we will include how-tos, offer advice on tool usage, review your code samples, teach you survival in the documentation jungle, and discuss what’s new or just around the corner in the land of HDF.

What is HDF5®?

Heterogeneous Data

HDF® supports n-dimensional datasets, and each element in a dataset may itself be a complex object.
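As a minimal sketch of this using the h5py Python bindings (the file name and field names below are illustrative, not part of any real schema), a 2-D dataset can hold compound elements, each combining a timestamp with a vector reading:

```python
import os
import tempfile

import h5py
import numpy as np

# Each element of this 2-D dataset is a compound object:
# a float64 timestamp plus a 3-component float32 reading.
element = np.dtype([("timestamp", "f8"), ("reading", "f4", (3,))])
data = np.zeros((4, 5), dtype=element)

path = os.path.join(tempfile.mkdtemp(), "sensors.h5")  # illustrative file name
with h5py.File(path, "w") as f:
    f.create_dataset("grid", data=data)

# Reading it back preserves both the n-dimensional shape
# and the compound structure of each element.
with h5py.File(path, "r") as f:
    shape = f["grid"].shape
    fields = f["grid"].dtype.names
```
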

Easy Sharing

HDF® is portable, with no vendor lock-in, and is a self-describing file format, meaning all data and metadata can be passed along in a single file.
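One way to see what self-describing means in practice, sketched here with the h5py Python bindings (file and dataset names are illustrative): a recipient can open the file with no external schema and discover its full structure by walking it.

```python
import os
import tempfile

import h5py
import numpy as np

# Write a file containing a nested group and dataset.
path = os.path.join(tempfile.mkdtemp(), "shared.h5")  # illustrative file name
with h5py.File(path, "w") as f:
    f.create_dataset("results/values", data=np.linspace(0.0, 1.0, 5))

# A recipient needs no external schema: the file describes itself.
found = []
with h5py.File(path, "r") as f:
    f.visit(found.append)          # walk every group and dataset by name
    dtype = f["results/values"].dtype  # element type is stored in the file too
```
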

Cross Platform

HDF® is a software library that runs on a range of computational platforms, from laptops to massively parallel systems, and implements a high-level API with C, C++, Fortran 90, and Java interfaces. HDF has a large ecosystem with 700+ GitHub projects.

Fast I/O

HDF® provides high-performance I/O with a rich set of integrated features that allow for access-time and storage-space optimizations.

Big Data

There is no limit on the number or size of data objects in a file, giving great flexibility for big data.

Keep Metadata with Data

HDF5® allows you to keep the metadata with the data, streamlining data lifecycles and pipelines.
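A minimal sketch of this with the h5py Python bindings (the dataset name and attribute values are illustrative): HDF5 attributes attach metadata directly to the dataset it describes, so both travel in the same file.

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "experiment.h5")  # illustrative file name
with h5py.File(path, "w") as f:
    dset = f.create_dataset("temperature", data=np.arange(10.0))
    # Attributes keep the metadata with the data it describes.
    dset.attrs["units"] = "kelvin"
    dset.attrs["instrument"] = "thermocouple-7"

# Anyone opening the file later gets the data and its context together.
with h5py.File(path, "r") as f:
    units = f["temperature"].attrs["units"]
```
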

Our Advantage
Learn about how HDF can meet your big data needs.