We're looking for someone who enjoys the challenges of big data to join our growing data engineering team, focused on building scalable solutions for our high-volume data. With 25 million users today, you'll work hands-on with plenty of exciting scale challenges as we grow to support millions more, across hundreds of servers with petabytes of storage in a hybrid dedicated/cloud environment.
Our cloud is the backbone of our service, providing constant support to mobile clients in the field and enabling them to receive real-time threat protection, back up and restore their data, find and protect their lost devices, and stay connected 24 hours a day. The vast majority of our users connect to the service daily. The performance and reliability of our cloud are paramount to our success.
This is a unique opportunity to join our data engineering team and help architect and manage an array of solutions for our ever-growing big data.
Design, architect, and develop data engineering solutions for our big data challenges
Work closely with data analysts to construct creative solutions for their analytic tasks, partnering with them to define and implement metrics for our data-driven product and business teams
Lead end-to-end efforts to design, develop, and implement data engineering, data warehousing, and business intelligence solutions
Engineer solutions within our growing ETL infrastructure
Develop reusable tools for the visualization, management, and manipulation of large data sets
Analyze and improve efficiency, scalability, and stability of data collection, storage, and retrieval processes
Optimize our infrastructure at both the software and hardware level
Support end users on ad hoc data usage and serve as a subject matter expert on the functional side of the business
Solid engineering experience developing big data solutions for RDBMS-based data warehousing and NoSQL environments
Experience with data warehousing architecture and data modeling best practices
In-depth understanding of Entity-Relationship and Dimensional modeling techniques
Expert understanding of ETL techniques and best practices to handle large volumes of data
Expert knowledge of SQL (MySQL preferred)
Strong development expertise in Ruby, Java, Python, or similar language
Strong knowledge of managing, extracting, and loading data from source databases such as MySQL, PostgreSQL, Oracle, or SQL Server
Solid experience working with data warehouse-oriented systems such as Vertica, Netezza, Greenplum, Teradata, Exadata, etc.
Expertise in database programming and performance tuning techniques
Experience with Hadoop, Hive, and Pig a big plus
Experience with MongoDB, Cassandra, CouchDB, or related NoSQL technologies
Knowledge of BI reporting solutions highly desirable