Data Warehouse Engineer
Rafter - San Mateo, CA


Rafter is looking for a Data Warehouse Engineer to extend and maintain our highly valued data warehouse and reporting environment. We are a fast-paced technology company aiming to disrupt established markets in the higher education space. Our core engineering group is a Ruby shop that also uses a myriad of new technologies. The DW engineer, as part of the engineering team, has the opportunity to leverage core engineering tools as well as the latest innovations in the Big Data space, such as MPP databases (Greenplum/Redshift) and Hadoop clusters. We are always evaluating new platforms for building a more scalable and stable DW platform, which allows the DW engineer to experience the latest in the DW and BI space.

To be successful in this role, you should have a keen interest in big data and web semantics and the drive to create new solutions. The ability to wear multiple hats and learn new tools and technologies is the norm and is expected. This is an ideal role to leverage your interest in knowledge manipulation, classification schemes, and quantification, and to build systems that no current commercial product can deliver.

Primary Responsibilities:
  • Technical design, implementation and maintenance of the conformed dimensions and conformed facts in the data warehouse
  • Continually ensure completeness and compatibility of the technical infrastructure and DW implementation to support system performance, availability and architecture requirements
  • Monitor and maintain data quality, including implementing a change data capture mechanism, a strategy for correcting data errors, managing an audit dimension, etc.
  • Integrate OLTP sources (MySQL) into the analysis environment
  • Implement a streaming orientation from OLTP and clickstream sources to reduce data latency
  • Merge call center, operations, CRM and other data sources into the analysis environment
  • Develop and maintain Pentaho Kettle transforms and jobs
  • Develop and maintain bulk imports into the Greenplum analytic database
  • Develop automation for continuous validation of data quality
  • Maintain alerting and triage system for data quality failures

Minimum Requirements:
  • 5 years' experience working extensively with SQL, including the ability to write, analyze, and debug queries and to execute advanced OLAP functions
  • Proficiency in any/all of the following scripting languages: Ruby, Unix shell, Python, Perl
  • Experience with any/all of the following technologies: PostgreSQL, Greenplum, Vertica, Netezza
  • Strong technical understanding of data modeling, design and architecture principles and techniques across master data, transaction data and derived/analytic data; strong technical expertise in data storage/management technologies for both OLTP and DW solutions
  • A Bachelor’s degree in computer science or a related field. Specialization in data management preferred.
  • 2+ years of experience with enterprise data warehouse architecture principles, methods, techniques, and technologies, and three years' experience with the data/information management components of large, complex data solutions

Read more about our success:

Rafter Experiences 1,400 Percent Annual Revenue Growth in On-Campus Commerce

To learn more about Rafter, visit us on the web.

Rafter is proud to be an equal opportunity workplace and is an affirmative action employer.

Veterans of the United States Armed Forces are encouraged to apply. Thank you for your service.