Blog

The HDFView 2.14 and HDF Java 3.3.2 release is now available. This release supports HDF5-1.8 and 32-bit object identifiers and was tested with HDF5-1.8.19 and HDF 4.2.13. It can be obtained from https://support.hdfgroup.org/downloads/index.html or directly from https://support.hdfgroup.org/products/java/release/download.html. More information on this release can be found on the HDF Java home page. This is a maintenance release with a few bug fixes. It includes changes to the exception handling in the Java HDF wrappers; these changes fix an issue that affected the display of HDF4 files in HDFView....

The HDF 4.2.13 release is now available. It can be obtained from the HDF4 home page: https://support.hdfgroup.org/products/hdf4/. HDF 4.2.13 is a minor release with a few changes, including: support was added for macOS Sierra 10.12.5, several memory leaks were fixed, and the minimum supported CMake version is now 3.2.2. For detailed information regarding this release, please see the Release Notes....

The HDF5 1.8.19 release is now available. It can be obtained from the HDF5 Download page: https://www.hdfgroup.org/downloads/hdf5/ or directly from: https://support.hdfgroup.org/HDF5/release/obtain518.html. HDF5 1.8.19 is a minor release with a few new features and changes. Important changes include: several H5PL (C) APIs were added to manipulate the entries of the plugin path table (H5PLappend, H5PLget, H5PLinsert, H5PLprepend, H5PLremove, H5PLreplace, and H5PLsize); H5Dget_chunk_storage_size (C) was added to obtain the storage size of a chunk in the file, primarily in support of H5DOread_chunk but potentially useful for other purposes; and H5DOread_chunk (high-level C) was added to read a raw data chunk directly from a dataset in a file, bypassing HDF5's internal data transfer pipeline, including filters. ...
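As a rough sketch of how the new chunk-related calls can be combined (the file name data.h5, dataset path /dset, 2-D rank, and chunk offset below are illustrative assumptions, not taken from the release notes):

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include "hdf5.h"
#include "hdf5_hl.h"   /* H5DOread_chunk is part of the high-level library */

int main(void)
{
    /* Placeholder names: assumes a 2-D chunked dataset and reads the chunk
       whose logical offset is (0, 0). */
    hid_t   file = H5Fopen("data.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t   dset = H5Dopen2(file, "/dset", H5P_DEFAULT);
    hsize_t offset[2] = {0, 0};
    hsize_t chunk_bytes = 0;

    /* New in 1.8.19: query how much storage the chunk occupies in the file. */
    if (H5Dget_chunk_storage_size(dset, offset, &chunk_bytes) < 0) {
        fprintf(stderr, "could not query chunk storage size\n");
        return 1;
    }

    /* New in 1.8.19 (high level): read the raw chunk bytes as stored on disk,
       bypassing the data transfer pipeline. */
    unsigned char *buf = malloc((size_t)chunk_bytes);
    uint32_t filter_mask = 0;
    if (H5DOread_chunk(dset, H5P_DEFAULT, offset, &filter_mask, buf) < 0) {
        fprintf(stderr, "direct chunk read failed\n");
        return 1;
    }
    printf("chunk size on disk: %llu bytes, filter mask: 0x%x\n",
           (unsigned long long)chunk_bytes, (unsigned)filter_mask);

    free(buf);
    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}

Because the pipeline is bypassed, buf receives the chunk exactly as it is stored in the file (for example, still compressed), and filter_mask reports the chunk's filter mask as recorded in the file.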

Tobias Weinzierl, Durham University, UK; Sven Köppel, FIAS, Germany; Michael Bader, TUM, Germany; HDF Guest Bloggers. ExaHyPE develops a solver engine for hyperbolic differential equations solved on adaptive Cartesian meshes. It supports various HDF5 output formats. Exascale computing is expected to allow scientists and engineers to simulate, and ultimately understand, wave phenomena with unprecedented accuracy over unprecedented time spans. To harness the power of exascale machines, however, well-suited software has to become available. ExaHyPE is an H2020 project writing a PDE solver engine, similar to a 3D computer game engine, that will allow groups with decent CSE expertise to write their own solver for hyperbolic equation systems within a year. The resulting solver will scale to exascale. This is made possible by the unique...

Internal compression is one of several powerful HDF5 features that distinguish HDF5 from other binary formats and make it very attractive for storing and organizing data. Internal HDF5 compression saves storage space and I/O bandwidth and allows efficient partial access to data. Chunk storage must be used when HDF5 compression is enabled....
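As a minimal illustration of that chunking requirement (the file name, dataset shape, chunk size, and compression level below are arbitrary choices), gzip compression in the C API is requested on the dataset creation property list, and H5Pset_chunk must be called on that same property list:

#include "hdf5.h"

int main(void)
{
    /* Illustrative shapes: a 1024 x 1024 integer dataset stored in 64 x 64 chunks. */
    hsize_t dims[2]  = {1024, 1024};
    hsize_t chunk[2] = {64, 64};
    static int data[1024][1024];   /* zero-initialized sample data */

    hid_t file  = H5Fcreate("compressed.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space = H5Screate_simple(2, dims, NULL);

    /* Compression requires chunked storage: set the chunked layout and the
       gzip (deflate) filter on the dataset creation property list. */
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 2, chunk);
    H5Pset_deflate(dcpl, 6);       /* compression level 6 */

    hid_t dset = H5Dcreate2(file, "/dset", H5T_NATIVE_INT, space,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT);
    H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

    H5Dclose(dset);
    H5Pclose(dcpl);
    H5Sclose(space);
    H5Fclose(file);
    return 0;
}

Each chunk is compressed independently as it is written, which is also what makes partial access to compressed data efficient: a read selection only touches the chunks it intersects.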

Scot Martin, Harvard University, HDF Guest Blogger. HDF5 storage is really interesting. To me, its format has no fixed structure, but instead is based on introspection and discovery. Seems great to me; Mathematica has its origins in artificial intelligence, so we ought to be able to do something here. Approaching twenty-two years with Mathematica and almost a “Hello, World!” ability in C, I decided to jump right in. Enter The HDF Group's P/Invoke for my salvation. Here’s how we make use of it in Mathematica: LoadNETAssembly["HDF.PInvoke.dll"] Bang! Ready to go in Mathematica. Here’s a proof of concept for how it works: Module[ (* The three symbols should have initial values so that there is *) (* memory allocation when Mathematica interfaces with P/Invoke. *) {major=0,minor=0,revision=0,return}, CompoundExpression[ (* access...

Christian Hoene, Symonics GmbH; and Piotr Majdak, Acoustics Research Institute; HDF Guest Bloggers. Spatial audio - 3D sound. Back in the ‘70s, “dummy head” microphones were used to create spatial audio recordings. With headphones, one was able to listen to those recordings and marvel at the impressive spatial distribution of sounds – just like in real life. (Figure: the difference between listening to a real source and listening to realistic virtual sounds via headphones.) Nowadays, we have a much better understanding of human binaural perception, and we can even simulate spatial audio signals with the help of computers. Indeed, a modern virtual reality (VR) headset such as the Oculus Rift or Samsung Gear utilizes 3D audio to allow...

The HDF Server allows producers of complex datasets to share their results with a wide audience. We used it to develop the Global Fire Emissions Database (GFED) Analysis Tool, a website that guides the user through our dataset. A simple web map interface allows users to select an area of interest and produce data visualization charts. ...