At Pivotal, our mission is to enable customers to build a new class of applications, leveraging big and fast data, and doing all of this with the power of cloud independence. Uniting technology, people and programs from EMC and VMware, the following leading products and services are now part of Pivotal: Greenplum, Cloud Foundry, Spring, GemFire and other products from the VMware vFabric Suite, Cetas and Pivotal Labs.
Are you passionate about building great software products? Are you looking to work on state-of-the-art technology and big data?
The Pivotal Hadoop engineering team is looking for world-class, fun-loving engineers to join our growing team.
You will be responsible for the design and development of Pivotal's industry-leading Big Data product, built around the Apache Hadoop ecosystem. You are expected not only to gain a deep understanding of specific areas of the Hadoop stack (such as HDFS, Hive, and HBase), but also to understand the challenges and intricacies of deploying, monitoring, managing, and optimizing very large-scale distributed data systems such as Hadoop, with the goal of making them ready for a range of enterprise environments, whether on bare metal or in the cloud. You will translate requirements from end customers, sales, and internal Pivotal teams into highly scalable, robust software modules. You may also have the opportunity to contribute to the open source community.
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
- MS in CS or equivalent required
- 5-10 years of industry software development experience
- Strong development experience with core Java in a Linux environment
- Deep experience with at least one scripting language (Perl, Python, Ruby, etc.)
- Experience developing large-scale system software
- Deep understanding of big data, cloud computing, and scalable, high-performance environments
- Familiarity with the entire product life cycle: design, implementation, testing, deployment and maintenance, especially in an enterprise context
- Experience with Hadoop, distributed systems, or HPC a big plus
- Good verbal and written communication skills
- Experience working with globally distributed teams desirable