
Principal DBA - Hadoop Infrastructure

Date Posted: Jul 1, 2018

Location: Cary, NC, US, 27513

Company: MetLife


Role Value Proposition: 

MetLife is seeking a Principal Hadoop Administrator to assist in managing a large-scale 24x7x365 Hadoop cluster environment. The incumbent will manage multiple projects and report on them to senior management, represent the Big Data team with regard to new technology, and document and recommend new directions. This individual will work closely with our application development teams to design, develop, and implement new enhancements and database applications, and will provide support in development and production environments. The ideal candidate will be technically savvy, have excellent communication skills, and be able to build relationships and collaborate effectively with business and technology partners. If you have a passion for technology and a proven track record of implementing new technology, apply today.

    

Key Responsibilities:

  • The candidate will be responsible for ensuring the continuous availability of several Hadoop and NoSQL clusters, evaluating and operationalizing new technologies, automating processes, and innovating new ways to manage database infrastructure.
  • Manage and maintain servers across multiple environments.
  • Advise development teams on clustering, indexing and other performance and architecture issues
  • Monitor deployments for capacity and performance
  • Define and implement backup strategies per data retention requirements
  • Administer NoSQL databases to achieve 100% monthly availability
  • Solve production problems as needed, 24x7 (serve in an on-call rotation)
  • Develop and document best practices for data migration

 

Essential Business Experience and Technical Skills:

Required:

  • 8+ years of related experience managing large-scale production databases using one or more of the following: Hadoop, MongoDB, HBase, SOLR
  • Unix/database administration experience on a Hadoop big data platform (BigInsights or Hortonworks)
  • Experience with Hadoop software installs and upgrades
  • Experience with SOLR and HBase
  • Experience working on the Linux platform
  • Ability to set up high availability (HA)
  • Good knowledge of shell scripting and BigSQL is a must
  • Strong troubleshooting skills for performance issues and the ability to implement solutions
  • Must be a good team player and able to work independently

Preferred:

  • Bachelor's degree in Information Technology, Engineering, Computer Science, or a related field
  • MongoDB, Splunk

 

Business Category

Global Infrastructure Technology Operations

 

Number of Openings

1

 

At MetLife, we’re leading the global transformation of an industry we’ve long defined. United in purpose, diverse in perspective, we’re dedicated to making a difference in the lives of our customers.

 

 

MetLife is a proud equal opportunity/affirmative action employer committed to attracting, retaining, and maximizing the performance of a diverse and inclusive workforce. It is MetLife's policy to ensure equal employment opportunity without discrimination or harassment based on race, color, religion, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity or expression, age, disability, national origin, marital or domestic/civil partnership status, genetic information, citizenship status, uniformed service member or veteran status, or any other characteristic protected by law.

MetLife maintains a drug-free workplace.

For immediate consideration, click the Apply Now button. You will be directed to complete an online profile. Upon completion, you will receive an automated confirmation email verifying that you have successfully applied to the job.

Requisition #: 98266 


Nearest Major Market: Raleigh