
The HDF 2015 Workshop at the ESIP Summer Meeting

Lindsay Powers, The HDF Group

The 2015 HDF workshop held during the ESIP Summer Meeting was a great success, thanks to the more than 40 participants across its four sessions.  The workshop was an excellent opportunity for us to interact with HDF community members, better understand their needs, and introduce them to new technologies.  The slide presentations from the workshop are available on the HDF-EOS website.

From my perspective, the highlight of the workshop was the Vendors and Tools Session, where we heard Ellen Johnson (MathWorks), Christine White (Esri), Brian Tisdale (NASA), and Gerd Heber (The HDF Group) talk about new and improved applications of HDF technologies.  For example:

  • MathWorks is increasing its support for HDF5 in MATLAB, including the ability to read HDF5 data with dynamically loaded compression filters, more options for handling big data, and RESTful web service access.  Ellen Johnson gave an exciting demo that used the new HDF Server API to pull data from the CoRTAD database into MATLAB for a time-series visualization of coral reef temperature anomalies.
  • Esri has added functionality to ArcGIS, which now reads HDF directly as a raster layer, allowing the creation of multidimensional mosaic datasets.
  • Brian Tisdale from NASA's Atmospheric Science Data Center (ASDC) discussed how NASA is implementing the new ArcGIS raster capabilities, among others in the GDAL library, to improve access to raster geospatial data (a minimal GDAL sketch follows this list).
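
To make the GDAL route concrete, here is a minimal Python sketch of opening an HDF5 dataset as a raster through GDAL's HDF5 driver.  The file name example.h5 and the dataset path /group/temperature are placeholders for illustration, not data shown at the workshop.

    # Minimal sketch: reading an HDF5 subdataset as a raster with GDAL's
    # Python bindings.  File name and dataset path are illustrative only.
    from osgeo import gdal

    # List the subdatasets GDAL discovers inside the HDF5 file.
    container = gdal.Open("example.h5")
    for name, description in container.GetSubDatasets():
        print(name, "->", description)

    # Open one subdataset via GDAL's HDF5 naming convention
    # (HDF5:"<file>"://<dataset-path>) and read it into a NumPy array.
    subds = gdal.Open('HDF5:"example.h5"://group/temperature')
    data = subds.GetRasterBand(1).ReadAsArray()
    print(subds.RasterXSize, subds.RasterYSize, data.shape)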

Equally exciting were the presentations on new tools and capabilities by members of The HDF Group.

  • Gerd Heber talked about Spark as a highly versatile and fast processing engine for interfacing between HDF5 and many downstream tools (read Gerd’s blog post); a rough PySpark sketch follows this list.
  • Aleksandar Jelenak presented a new tool that makes it simple to create interoperable, standards-compliant data products in a collaborative environment.  The tool runs on multiple computing platforms without requiring a full suite of development tools and libraries, and it can import and export multiple file types.
  • Joel Plutchak discussed the recent and future advances by the HDF Community to develop indexing technologies for HDF files.
  • John Readey presented the new HDF Server (read John’s blog article), a REST-based API that provides read/write access, full data type support, chunking and compression, and hyperslab/point selection (a sketch of REST access also follows this list).
  • Gerd also presented on two new projects that will facilitate the use of HDF technologies by easing the entry point for transforming data from other conventions into HDF formats.
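
Gerd’s Spark-plus-HDF5 pattern is described in his blog post.  As a rough illustration, and not his exact code, a PySpark job might fan HDF5 file reads out to the executors with h5py and pull back small summaries; the file names and the dataset path /data/values below are placeholders.

    # Illustrative PySpark sketch: each executor opens its own HDF5 file
    # locally with h5py and returns a small summary to the driver.
    # File names and the dataset path are placeholders, not workshop data.
    from pyspark import SparkContext
    import h5py

    def summarize(path):
        # No HDF5 state is shared between tasks; each task reads one file.
        with h5py.File(path, "r") as f:
            values = f["/data/values"][:]
            return path, float(values.mean())

    sc = SparkContext(appName="hdf5-summary")
    files = ["part-000.h5", "part-001.h5", "part-002.h5"]
    results = sc.parallelize(files).map(summarize).collect()
    for path, mean in results:
        print(path, mean)
    sc.stop()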
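
To give a feel for the HDF Server interface, here is a hedged sketch of REST access using Python’s requests library.  The endpoint URL, domain name, and dataset UUID are stand-ins rather than values from John’s talk; the exact routes and query parameters are documented with HDF Server.

    # Hypothetical HDF Server session: the endpoint, domain, and dataset
    # UUID below are placeholders for illustration.
    import requests

    endpoint = "http://127.0.0.1:5000"
    domain = "tall.data.hdfgroup.org"  # hypothetical served domain

    # Ask the server for the domain's root group.
    info = requests.get(endpoint + "/", params={"host": domain}).json()
    print("root group id:", info["root"])

    # Read a hyperslab (rows 0-3, columns 0-7) from a dataset by its UUID.
    dset_id = "00000000-0000-0000-0000-000000000000"  # placeholder UUID
    value = requests.get(
        endpoint + "/datasets/" + dset_id + "/value",
        params={"host": domain, "select": "[0:4,0:8]"},
    ).json()
    print(value["value"])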

The workshop was a wonderful opportunity for us to share our current work but, more importantly, to interact with the HDF Earth Science community.

We are very interested in developing the HDF user community so that knowledge and resources are shared across the diverse range of existing and future users. The HDF Group is developing new technologies that integrate community standards and conventions seamlessly into data workflows, because we believe that building standards into tools promotes their broad adoption and long-term sustainability.

We hope that these versatile new tools meet the evolving needs of diverse user communities and provide new ways of sharing information. The workshop gives us an opportunity to better understand the value of our work to the community and to respond to community needs.  Many thanks to those who participated, and we hope to see even more of you next year.

The slide presentations from the workshop may be found on the HDF-EOS website.  Our speakers welcome your questions and ideas – comments on this blog will be directed to them.
