As a Technical Support Engineer, you will help our customers diagnose, reproduce, and fix Hadoop-related issues. You will troubleshoot the Hortonworks Data Platform in a variety of environments and take ownership of problem isolation, resolution, and bug reporting. To be successful in this role, you must be a motivated self-starter, be committed to ongoing self-education, possess strong customer service skills, and have excellent technical problem-solving skills.
Resolve customer problems via telephone, email, or remote access
Maintain customer loyalty through integrity and accountability
Research customer issues in a timely manner and follow up directly with the customer with recommendations and action plans
Escalate cases to management when customer satisfaction comes into question
Escalate cases to the engineering team when the problem is beyond the scope of technical support or falls outside the support team's expertise
Maintain control and management of the overall resolution for any escalated case, even when cross-functional groups are involved
Leverage internal technical expertise, including development engineers, the knowledge base, and other internal tools, to provide the most effective solutions to customer issues
Create knowledge base content to capture new learnings for reuse throughout the company and user base
Participate in technical communications within the team to share best practices and learn about new technologies and other ecosystem applications
Participate in the on-call rotation with other Technical Support Engineers
Actively participate in the Hadoop community to assist with generic support issues
Learn as much about Hadoop as you can!
A strong and enthusiastic commitment to resolving customer problems in a high-quality and timely manner
Support/troubleshooting experience in one or more of the following areas:
o Networking, Hadoop core, HBase, Hive
o Linux and/or Unix environments
o Command-line scripting for Linux
o Enterprise storage, databases, or high-end servers
o Virtualized environments such as ESX, Xen, KVM, or AWS/EC2
o NAS and/or SAN
Ability to compile and install Linux applications from source
Coding/Scripting experience in Python, Perl, and/or Java is a plus
Distributed file system experience
Bachelor's degree or equivalent experience
Enthusiastic about Big Data and the Hadoop ecosystem.
Good written and verbal communication skills, with a strong aptitude for learning new technologies and understanding how to apply them in a customer-facing environment
Excellent interpersonal skills with the ability to manage customer relationships under all circumstances. Grace under pressure: must be able to handle difficult customer situations with professionalism
High energy, high integrity, and a modest demeanor with customers are a must
Hortonworks is a leading commercial vendor of Apache Hadoop, the preeminent open source platform for storing, managing, and analyzing big data. Our distribution, Hortonworks Data Platform, powered by Apache Hadoop, provides an open and stable foundation for enterprises and a growing ecosystem to build and deploy big data solutions. Hortonworks is the trusted source for information on Hadoop, and together with the Apache community, Hortonworks is making Hadoop more robust and easier to install, manage, and use. Hortonworks provides unmatched technical support, training, and certification programs for enterprises, systems integrators, and technology vendors. For more information, visit www.hortonworks.com.
Hortonworks - 2 years ago