We are looking for an experienced Big Data Admin to work for our Fortune 500 client in San Ramon, CA. The ideal candidate will have experience working on Hadoop clusters and good knowledge of Big Data NoSQL databases.
Some of the requirements for the ideal candidate for this position are:
* Experience with managing and monitoring large Hadoop clusters
* 3+ years of experience writing software/shell scripts in scripting languages such as Perl, Python, or Ruby
* 3+ years of experience with Hadoop Distributed File System (HDFS)
* Hands-on experience with Apache Hadoop ecosystem components such as
Pig, Hive, Sqoop, Flume, and MapReduce/YARN
* Good understanding of core database concepts
* Experience with distributed, scalable databases such as Cassandra
* Hands-on experience with one or more ETL/ELT tools
* Established experience with automated, elastic scaling of cloud
services, automated deployment, and remediation
* High level of ownership and accountability
* Any Pivotal Hadoop/Greenplum/GemFire experience is a plus.
Please apply with your resume, and mention your best rate, availability, and any other terms/constraints.