High Performance Computing Architect
Location: South Bend, IN
Who We Are:
Data Realty’s world-class data centers are designed to enable analytics on the data they house. Through its data management platform and unique partnership model, Data Realty gives mid-sized organizations unprecedented access to the advanced computing infrastructure, collaborative intellectual environments, and highly skilled technical talent usually reserved for their larger competitors.
This position directly supports clients and new business as a solution architect on existing infrastructure projects as well as on opportunity capture efforts. The solution architect designs and implements the data management platforms that integrate multiple hardware, software, networking, and database technologies to meet client business, mission, and security requirements. The solution architect applies all applicable IT architecture principles, standards, and guidelines in the data, integration, application, infrastructure, solutions, security, and technology domains.
Essential Job Functions:
Oversee all aspects of solution development, architecture, and management processes from concept ideation through development, launch, and maintenance. Success in this role requires teaming with delivery teams and account and executive management to develop and grow our portfolio of solutions and offerings and to increase company productivity and profits. The ideal candidate will also:
- Participate in the pre- and post-sales process, helping both the sales and delivery teams interpret client requirements and analyze the state of the client’s infrastructure and data operations. This includes participating in kickoff and business-requirements-gathering meetings with the client.
- Act as the subject matter expert for Big Data-related technology, addressing application integration and infrastructure framework questions. Ensure rapid response to client questions and potential project blockers.
- Collaborate with Project Managers to estimate the cost of a proposed solution design and to develop the overall solution implementation plan.
- Write and produce functional and technical specifications and/or reference documents describing distributed architectures, configurations, and workflows for internal and/or client use.
- Follow knowledge management practices in capturing design notes, deliverables, methodologies, and solution development activities for traceability and troubleshooting purposes.
- Make recommendations for continuous improvement of design quality.
- Additional duties as assigned to develop scalable distributed computing environments and ensure client and company success.
Required Qualifications:
- Bachelor of Science degree in Computer Science, Computer Engineering, Mathematics, or a related field. An advanced degree is highly desirable.
- Hands-on experience building and managing large-scale distributed data processing systems, including systems handling very large datasets on clustered environments with big data architectures such as OpenStack, Hadoop, or other big data frameworks.
- Extensive experience developing analytical applications and software environments, including a strong background in data analytics, distributed computing, and the management and analysis of large datasets. Relevant areas may include MapReduce algorithms, NoSQL technologies, NoSQL data warehouse technologies, and deep graph analytics.
- Ability to learn quickly in a collaborative, fast-paced environment with a frequently demanding schedule.
- Excellent verbal and written communication skills.
Graham Allen Partners is a private company focused on the incubation of early-stage, high-growth technology businesses.
Located in South...