Our client, an innovative big-data analytics platform company, needs a contract Hadoop Engineer to collaborate with their Technical Writer on four (4) tutorials about Hadoop.
You’d work entirely offsite, on a very flexible schedule, supplementing the efforts of the client’s own technical marketing manager and Hadoop software engineer (SE). You’d serve as the Technical Writer’s primary resource for defining the tutorials’ goals, use cases, and the selected details that demonstrate the functionality of Hive, Pig, Mahout, and HBase, among others.
The tutorials are intended to be customer-run demos, to give prospective users ideas about what’s possible using Hadoop and encourage them to learn more. These tutorials will sync with an existing canned demo of the platform, highlighting selected functionality. The deadline for the first two tutorials is mid-March, and the second two are due at the end of March.
The ideal candidate for this role is a Java-literate engineer who has installed and configured at least one Hadoop distribution and is organized, articulate, and comfortable experimenting with virtual machines, data analytics tools, and distributed networked file systems.
This will be part-time, offsite work lasting approximately two weeks. You’d be paid on a 1099 basis, unless you prefer W2.
Requirements:
- Hands-on experience implementing one or more Hadoop distributions
- Strong understanding of relational databases
- Solid working knowledge of big-data technology, including Hadoop, MapReduce, Hive, Pig, Mahout, HBase, and related components
- Reachable references
- Ability to identify relevant features, formulate use cases, and communicate accurately and clearly

To apply:
Please visit contentrules.com/apply, citing “CHE-335” in the Job Code field.
Note: Content Rules has many content-development opportunities with SF Bay Area technology companies. Visit www.contentrules.com/jobs for details.