(Job Description PDF)
The Software Engineer will be part of a team that develops and supports Hierarchical Data Format (HDF) technologies and will work on the project “Hermes: Extending the HDF Library to Support Intelligent I/O Buffering for Deep Memory and Storage Hierarchy Systems,” performed in collaboration with IIT. The applicant for this position will work on the design and development of a data streaming library for Hermes. The applicant should have an interest in HPC, deep memory hierarchies, data streaming technologies, storage systems, data workflows, API design and implementation, performance optimization, and problem solving, and must be comfortable working with other team members and collaborating with scientists and application developers who use HPC systems. Experience with C and MPI is required. Experience with all aspects of the software life cycle is preferred. Travel to client sites, workshops, and conferences may be required.
Essential job functions and key responsibilities
- Participate in design and development of Hermes, a new, intelligent, heterogeneous-aware, multi-tiered, dynamic, and distributed I/O buffering system.
- Design and develop a data streaming library for use on top of Hermes.
- Explore optimizations for Hermes related to the new library.
- Fix software bugs in the HDF5 library relevant to the Hermes work.
- Write technical documentation.
- Experiment with new technologies relevant to the area of development; recommend improvements to techniques, procedures, or other aspects of technical development.
- Provide input on preventing future problems as well as on solutions to current concerns.
- Attend technical conferences as requested.
- Work on assigned projects under the supervision of a senior staff member.
Education and experience required
- A Master’s degree, preferably in computer science or software engineering, is required (a doctoral degree is preferred); equivalent experience and/or training may be considered, depending on its nature and depth as it relates to current technologies.
- Excellent knowledge of C and C++11.
- Experience developing production-quality software.
- Good understanding of MPI concepts.
- Experience with running applications on HPC systems.
Knowledge, Skills, and Abilities required
- Strong theoretical background in data structures, computer architectures, compilers and algorithms.
- Ability to learn new concepts and techniques quickly.
- Ability to communicate clearly with all types of audiences, from inexperienced users to highly technical ones.
- Strong organizational skills.
- Strong oral and written communication skills.
- Self-motivation and creativity.
- Problem solving and analytical skills necessary to carry out essential job functions and key responsibilities.
Knowledge, Skills, and Abilities preferred
- Experience with software development and maintenance.
- Experience with “agile” software development.
- Experience with file system design.
- Experience with deep memory and storage hierarchy systems.
- Experience with Data Streaming Frameworks such as Apache Flink or Apache Kafka.
- Experience with MPI and MPI I/O.
- Experience with software performance evaluation and enhancement.
- Experience working with HDF5 software.
- Experience using GNU autotools and CMake build systems.
The HDF Group is an Equal Opportunity Employer and has a strong commitment to diversity. In keeping with that commitment, individuals with disabilities, minorities, females, and veterans are encouraged to apply.
To ensure full consideration, please submit a resume, cover letter, salary history and references to HPC@hdfgroup.org.