The HDF Group is an Equal Opportunity Employer encouraging applications from veterans, minorities, females, and individuals with disabilities.

If you are interested in working at The HDF Group but do not see a posted job that fits you, please submit a resume, cover letter, and references. In your email, please note that you are not applying for a specific open position.

Software Engineer

Job summary: The Software Engineer will join a team that develops and supports Hierarchical Data Format (HDF) technologies and will work on the project "Hermes: Extending the HDF Library to Support Intelligent I/O Buffering for Deep Memory and Storage Hierarchy Systems," performed in collaboration with IIT. The successful applicant will also participate in the development and maintenance of the HDF5 software. Applicants should have an interest in HPC, deep memory hierarchies, storage, data workflows, API design and implementation, performance optimization, and problem solving, and must be comfortable working with other team members and collaborating with scientists and application developers who use HPC systems. Experience with C and MPI is required; experience with all aspects of the software life cycle is preferred. Travel to client sites, workshops, and conferences may be required.

High Performance Computing Architect

Job summary: As the HPC Architect, you will play a critical role in the development of open source and commercial applications, as well as in consulting engagements with clients who rely on our distinct mix of expertise and experience. High Performance Computing is a large segment of our business; you will provide the vision and direction that allow us to evolve in this segment and drive innovation to meet its market needs.

High Performance Computing (HPC) Software Developer

Job summary: The HPC Software Developer will develop software for the Hierarchical Data Format v5 (HDF5) library and tool suite. Responsibilities include enhancing the HDF5 library with additional features, such as sophisticated caching techniques, asynchronous file I/O, self-tuning storage optimizations, advanced multi-thread/multi-process/multi-client file access techniques, and cluster and parallel file system interaction optimizations, to deliver the highest possible performance to users of HDF5. Parallel and distributed I/O in high performance computing environments using MPI and MPI-IO will be the primary focus of this position. Interest in and experience with project management are preferred. Some travel to client sites, workshops, and conferences may be required.