High Performance Computing Architect

Why The HDF Group?

We are a leader in the High Performance Computing and Engineering markets with a 30-year legacy of providing solutions to mission-critical problems. Our software is used by a diverse, global, and passionate group of users who want us to succeed and grow. The HDF Group offers the rare opportunity to work with a unique and impressive group of people who really care about making an impact.

The Job:

As the HPC Architect, you will play a critical role in the development of open-source and commercial applications, as well as in consulting engagements with clients who rely on our distinct mix of expertise and experience. You will provide the vision and direction that allow us to evolve in the High Performance Computing segment, which represents a large share of our business, and you will drive innovation to meet its needs.

Key Responsibilities:

  • Own the strategy, roadmap, vision, and feature definition of HDF5 to meet the needs of the HPC community.
  • Communicate that vision to all stakeholders.
  • Implement the vision with a highly skilled group of engineers.
  • Stay actively involved in the HPC community.
  • Develop proposals and respond to RFPs relating to High Performance Computing.
  • Provide High Performance Computing expertise to the Sales and Marketing team.

Key Requirements:

  • Deep knowledge of HDF5; it will be difficult to do this job with only a casual familiarity with the technology.
  • Advanced knowledge of High Performance Computing.
  • Knowledge of big data technologies and workflows.
  • Experience with HPC library specifications such as MPI (see the sketch after this list).
  • Strong written and verbal communication skills; expect frequent communication and coordination with stakeholders.
  • 5-10 years of experience in High Performance Computing.
  • Experience with research projects, ideally ones that have resulted in publications.
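
For a concrete sense of the depth expected, here is a minimal sketch of the kind of parallel I/O work that combines HDF5 and MPI: each MPI rank writes its own slice of a shared dataset through HDF5's MPI-IO driver using a collective write. The file name, dataset name, and sizes are arbitrary placeholders, and the sketch assumes an HDF5 installation built with parallel support.

    #include <hdf5.h>
    #include <mpi.h>
    #include <stdlib.h>

    /* Each MPI rank writes its own contiguous slice of a shared
     * 1-D dataset via HDF5's MPI-IO driver and a collective write. */
    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);
        int rank, nprocs;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        const hsize_t per_rank = 1024;            /* elements per rank (arbitrary) */
        hsize_t dims[1]  = { per_rank * nprocs }; /* global dataset size */
        hsize_t start[1] = { per_rank * rank };   /* this rank's offset */
        hsize_t count[1] = { per_rank };

        /* Open the file through the MPI-IO virtual file driver. */
        hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
        H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
        hid_t file = H5Fcreate("example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

        /* One shared dataset; each rank selects its own hyperslab. */
        hid_t filespace = H5Screate_simple(1, dims, NULL);
        hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_DOUBLE, filespace,
                                H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
        H5Sselect_hyperslab(filespace, H5S_SELECT_SET, start, NULL, count, NULL);
        hid_t memspace = H5Screate_simple(1, count, NULL);

        /* Fill a local buffer with rank-identifiable values. */
        double *buf = malloc(per_rank * sizeof *buf);
        for (hsize_t i = 0; i < per_rank; i++)
            buf[i] = (double)rank;

        /* Collective write: all ranks participate in one I/O operation. */
        hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
        H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, memspace, filespace, dxpl, buf);

        free(buf);
        H5Pclose(dxpl); H5Sclose(memspace); H5Sclose(filespace);
        H5Dclose(dset); H5Pclose(fapl); H5Fclose(file);
        MPI_Finalize();
        return 0;
    }

Compiled with the parallel HDF5 wrapper (h5pcc) and launched under mpiexec, every rank participates in the single collective H5Dwrite call, which is what lets the MPI-IO layer optimize the underlying file access pattern.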

The HDF Story:

The Hierarchical Data Format, or HDF, was originally developed in 1987 at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, with the goal of addressing the growing need to effectively manage very large and complex scientific and engineering data. Organizations in commercial industry, government, and academia adopted HDF for many applications demanding high performance data management software. HDF supports all types of digital data, regardless of origin, size, or complexity, from remote sensing data collected by satellites and computational results from nuclear testing models, to high-resolution MRI brain scans and financial time series data.

To ensure full consideration, please submit a resume, cover letter, salary history, and references to DE@hdfgroup.org.

The HDF Group is an Equal Opportunity Employer by choice. We encourage applications from veterans, minorities, women, and individuals with disabilities.