Hadoop DBA
Logistics Integration Solutions - Albuquerque, NM


Our organization is looking for an experienced Hadoop DBA for a 12+ month contract position in Albuquerque, NM. If interested, please submit a current resume. Candidates must be U.S. citizens.

1. Job Title: Hadoop DBA - Candidates must be U.S. citizens

2. Location: Albuquerque, NM

3. Job Duration: 12+ months

4. Assignment Type: 1099

5. Pay Rate: Negotiable

6. Special Requirements: Hadoop, clusters, Linux, Nagios, Ganglia, SAN, RAID, traveling to customer sites

Job Description:

We are seeking a Hadoop Cluster Software Specialist to configure Hadoop cluster software for big data processing. Hadoop cluster solutions will be implemented throughout the U.S., but mostly in the western U.S. You will be responsible for integrating Hadoop software components and for the testing and operation of Hadoop clusters, including software compatibility and regression testing of configuration changes. The candidate should have strong knowledge of Hadoop cluster architectures and the ability to solve complex system configuration problems. This is a full-time position. The candidate can be located anywhere in the U.S. but must be willing to travel as necessary to support customers and deliver projects.

You will be part of a team of software specialists, focusing on the implementation of supercomputer and big data processing systems.

Qualifications:

- At least one year of experience implementing Hadoop clusters (HDFS, YARN, MapReduce, etc.)
- Strong understanding of Hadoop cluster architectures and concepts
- Strong technical knowledge of Linux operating systems and computer networking
- Experience configuring Hadoop cluster functions: NameNode, JobTracker, TaskTracker, DataNode, etc.
- Experience with monitoring tools such as Nagios and Ganglia
- Experience with SAN and RAID storage configurations
- Must have effective verbal and written communication skills
- Must be able to work effectively in small teams, as well as independently
- Must be a U.S. Citizen and capable of holding a U.S. Government security clearance.

- BS in Computer Science, Computer Engineering or equivalent. Graduate degree preferred.

- Experience with Cloudera product suite
- Experience with Hadoop Management and Packaging Frameworks (Chef, Cobbler, Yum, Puppet, etc.)
- Experience with any or all of the following technologies: HBase, Hive, Pig, Autonomy IDOL
- Software Development skills in two or more of the following programming/scripting languages: Java, C++/C, Ruby

Must be a road warrior willing to travel, as there will be multiple sites, mostly in the western U.S.
