HDF Server Tag

The HDF Server allows producers of complex datasets to share their results with a wide audience. We used it to develop the Global Fire Emissions Database (GFED) Analysis Tool, a website that guides the user through our dataset. A simple webmap interface allows users to select an area of interest and produce data visualization charts. ...

The HDF Group’s HDF Server has been nominated for Best Use of HPC in the Cloud and Best HPC Software Product or Technology in HPCwire’s 2016 Readers’ Choice Awards. HDF Server is a Python-based web service that enables full read/write web access to HDF data: it can be used to send and receive HDF5 data over an HTTP-based REST interface. While HDF5 provides powerful scalability and speed for complex datasets of all sizes, many HDF5 datasets used in HPC environments are extremely large and cannot easily be downloaded or moved across the internet, yet users often need to access only a small subset of the data. Using HDF Server, data can be kept in one...

John Readey, The HDF Group

Editor’s Note: Since this post was written in 2015, The HDF Group has developed the Highly Scalable Data Service (HSDS) which addresses the challenges of adapting large scale array-based computing to the cloud and object storage while intelligently handling the full data management life cycle. Learn more about HSDS and The HDF Group’s related services.

HDF Server is a new product from The HDF Group that enables HDF5 resources to be accessed and modified using the Hypertext Transfer Protocol (HTTP).

HDF Server [1], released in February 2015, was first developed as a proof of concept enabling remote access to HDF5 content through a RESTful API. Version 0.1.0 was not yet intended for production use, since it did not initially provide security features and access controls. Following its successful debut, The HDF Group incorporated the additional planned features, and the newest version of HDF Server provides exciting capabilities for accessing HDF5 data in an easy and secure way.
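To make the request/response flow concrete, here is a minimal sketch of reading HDF5 metadata over the REST interface from Python. The host name is a placeholder, and the endpoint paths follow the HDF5 REST API's documented layout; treat this as an illustration rather than a canonical client.

```python
import requests

# Placeholder host: substitute the domain where your HDF Server instance runs.
endpoint = "http://mydata.hdfgroup.example:5000"
headers = {"accept": "application/json"}

# GET / describes the domain, including the UUID of its root group.
info = requests.get(endpoint + "/", headers=headers).json()
root_uuid = info["root"]

# List the links (members) of the root group.
links = requests.get(f"{endpoint}/groups/{root_uuid}/links", headers=headers).json()
for link in links["links"]:
    print(link["title"])
```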

John Readey, The HDF Group

We’re pleased to announce that The HDF Group is now a member of the Open Commons Consortium (formerly Open Cloud Consortium), a not-for-profit that manages and operates cloud computing and data commons infrastructure to support scientific, medical, health care, and environmental research.

The HDF Group will participate in the NOAA Data Alliance Working Group (WG), serving on the committee that will determine which datasets are hosted in the NOAA data commons and which tools are used in the computational ecosystem surrounding it.


“The Open Commons Consortium (OCC) is a truly innovative concept for supporting scientific computing,” said Mike Folk, The HDF Group’s President. “Their cloud computing and data commons infrastructure supports a wide range of research, and OCC’s membership spans government, academia, and the private sector. This is a good opportunity for us to learn about how we can best serve these communities.”

The HDF Group will also participate in the Open Science Data Cloud working group and receive resource allocations on the OSDC Griffin resource. The HDF Group’s John Readey is working with the OCC and others to investigate ways to use Griffin effectively. Readey says, “Griffin is a great testbed for cloud-based systems. With access to object storage (using the AWS S3 API) and the ability to programmatically create VMs, we will explore new methods for the analysis of scientific datasets.”
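Because Griffin exposes an S3-compatible API, a standard client such as boto3 can talk to it simply by overriding the endpoint URL. The endpoint, bucket, and object names below are hypothetical, and credentials are assumed to come from the environment; a minimal sketch:

```python
import boto3

# Hypothetical endpoint for Griffin's S3-compatible object store;
# access keys are picked up from the environment or ~/.aws/credentials.
s3 = boto3.client("s3", endpoint_url="https://griffin.example.org")

# List the HDF5 files in a (hypothetical) bucket.
for obj in s3.list_objects_v2(Bucket="hdf5-data").get("Contents", []):
    print(obj["Key"], obj["Size"])

# Fetch one file for local analysis.
s3.download_file("hdf5-data", "results/run42.h5", "run42.h5")
```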

Lindsay Powers, The HDF Group

The 2015 HDF workshop held during the ESIP Summer Meeting was a great success thanks to more than 40 participants throughout the four sessions. The workshop was an excellent opportunity for us to interact with HDF community members to better understand their needs and introduce them to new technologies. You can view the slide presentations from the workshop here.

From my perspective, the highlight of the workshop was the Vendors and Tools Session, where we heard Ellen Johnson (MathWorks), Christine White (Esri), Brian Tisdale (NASA), and Gerd Heber (The HDF Group) talk about new and improved applications of HDF technologies. For example: ...

The HDF Group is hosting a one-day workshop at the upcoming Federation for Earth Science Information Partners (ESIP) Summer Meeting in Asilomar, CA on July 14th. Please join us to learn about new HDF tools, projects, and perspectives. There will also be an HDF Town Hall meeting on Wednesday afternoon, July 15th...

* With Python’s Help

Gerd Heber, The HDF Group

Before the recent release of our PyHexad Excel add-in for HDF5 [1], the title might have sounded like the slogan of a global coffee and baked goods chain. That was then. Today, it is an expression of hope for the spreadsheet users who run this country and who either felt neglected by the HDF5 community or who might suffer from a medical condition known as data-bulging workbook stress disorder. In this article, I would like to give you a quick overview of the novel PyHexad therapy and invite you to get involved (after consulting with your doctor).

Access to the data in HDF5 files from Excel is a frontrunner among the all-time top-10 most frequently requested features. A spreadsheet tool might be a convenient window into, and user interface for, certain data stored in HDF5 files. Such a tool could help overcome Excel storage and performance limitations, and allow data to be freely “shuttled” between worksheets and HDF5 data containers. PyHexad ([4],[5],[6],[7]) is an attempt to further explore this concept.
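PyHexad itself runs inside Excel, but the round trip it automates can be pictured with plain h5py. The file and dataset names below are made up for illustration; this is a sketch of the concept, not PyHexad's API:

```python
import h5py
import numpy as np

# Values that might otherwise live in a worksheet.
table = np.random.rand(1000, 4)

# Write the block into an HDF5 dataset (intermediate groups are created)...
with h5py.File("workbook.h5", "w") as f:
    f.create_dataset("/sheet1/results", data=table)

# ...and read back only a small slice of it.
with h5py.File("workbook.h5", "r") as f:
    block = f["/sheet1/results"][:10, :]

print(block.shape)  # (10, 4)
```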

The HDF Group has just announced “HDF Server,” a freely available service that enables remote access to HDF5 content using a RESTful API. In our scenario, using HDF Server, we upload our Monopoly simulation results to the server, and interested parties can then request any desired content from it: no file size issues, no downloading entire files...
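For instance, a consumer could pull just a slice of one results dataset rather than the whole file. A minimal sketch, assuming a hypothetical host and a dataset UUID discovered by walking the group links (as in the earlier metadata example); the select parameter follows the HDF5 REST API's subsetting syntax:

```python
import requests

endpoint = "http://monopoly.example.org:5000"   # hypothetical HDF Server host
dset_uuid = "<dataset-uuid>"                    # found by listing group links

# Ask the server for only the first 100 elements of the dataset.
rsp = requests.get(
    f"{endpoint}/datasets/{dset_uuid}/value",
    params={"select": "[0:100]"},
    headers={"accept": "application/json"},
)
values = rsp.json()["value"]
print(len(values))
```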