Data Management Director (Product Owner)

Date Posted: Aug 13, 2022

Location(s): Capital Federal, AR

Company: MetLife

With more than 150 years in the market and a presence in more than 60 countries, at MetLife we lead the global transformation of the insurance industry. We are a collaborative community of more than 50,000 employees worldwide, united by purpose and dedicated to making a difference in the lives of our customers.

 

We are looking for a Data Management Director (IT Product Owner) to join our Center of Excellence in Buenos Aires.

 

At LATAM Data Hub (LDH), our mission is to build MetLife's next-generation, real-time data servicing platform and to help deploy it across various LATAM countries. We have developed a world-class, cloud-native platform that enables digital channels such as online, mobile, and call center, as well as emerging channels such as intelligent assistants, to get data with low latency. The platform leverages cutting-edge open-source and proprietary technologies to create a highly configurable system that can be adapted to individual market needs quickly and at low cost. The platform runs in a fully containerized, elastic cloud environment and is designed to scale to serve millions of users.

 

Driven by passion and purpose, we are looking for you, a high-performing data and analytics professional, to drive and support the ideation, development, and deployment of actionable, high-impact data and analytics solutions in support of MetLife’s enterprise functions and lines of business across markets.

The portfolio of work delivers data-driven solutions across key business functions such as customer acquisition, targeting, engagement & retention, investments, audit, risk management, and claims service & operations, while tackling hard, open-ended problems. The position requires a strong understanding of data engineering: data processing, extraction, transformation, loading, and performance tuning in cloud or big data environments.

 

You will work and collaborate with a nimble, autonomous, cross-functional team of makers, breakers, doers, and disruptors who love to solve real problems and meet real customer needs. You will use cutting-edge technologies and frameworks to process data, create data pipelines, and collaborate with the data science team to build end-to-end machine learning & AI solutions. You are passionate about learning new technologies on the fly and shipping to production.

 

More than just a job: we hire people who love what they do!

Responsibilities

 

  • Provide delivery leadership to the cross-functional team and deliver through the use of new, innovative technologies aligned to the strategic architecture
  • Work with business stakeholders in various countries, gather requirements, and work with local as well as shared-resource teams to deliver the data hub and real-time API consumption needs
  • Drive design and delivery of the data flow for both storage and consumption layers; this is a hands-on role, ensuring the design of the analytical and online data stores and APIs is optimal
  • Drive the translation of complex functional and technical requirements into detailed designs
  • Drive the writing of high-performance, reliable, and maintainable code; perform code reviews and enable controls and checkpoints to ensure high quality
  • Define and implement efficient operational processes
  • Own and drive key interactions with business and technology leadership: communicate plans and progress, provide strategy and planning, develop solutions, and innovate
  • Identify, document, communicate, and mitigate both delivery and enterprise risks
  • Train, coach, mentor, and guide the team in a safe and highly motivating environment

        

Essential Business Experience and Technical Skills:

 

  • 10+ years of solutions development and delivery experience
  • 7+ years of technical leadership experience in delivery of enterprise scale technology programs
  • Significant experience in building and establishing (not just maintaining) data lakes, warehouses, operational data stores, analytical and reporting data marts, transactional warehouses, Data-as-a-Service, and ETL/ELT
  • Experience driving the establishment of data lakes on cloud or on-prem Hadoop-like clusters is a must
  • Technical skills
    • Hands-on expertise in cloud data stores on the Azure platform using ADLS/ADF/Databricks/Delta Lake and Cosmos DB, or experience implementing analytical data stores on other cloud infrastructure such as Snowflake, AWS Redshift, GCP BigQuery, Google Cloud Storage, etc.
    • Building and implementing data ingestion and curation processes using cloud data tools such as Spark (Scala/Python), Databricks, Delta Lake, or GCP BigQuery
    • Working knowledge of Kubernetes on Azure, Google Cloud Platform, or a similar platform is critical
    • Ingesting huge volumes of data from various platforms for reporting, analytics, data supply, and transactional (operational data store and API) needs
    • Working knowledge of deploying APIs to a Kubernetes engine, setting up an ingress controller, and managing the API service registry; publishing the APIs to an API gateway
    • Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance
    • Code versioning experience using Bitbucket/Azure DevOps (AzDo); working knowledge of AzDo pipelines would be a big plus
    • Good scripting experience, primarily in shell/bash/PowerShell, is highly desirable
    • Good experience in a cloud platform, specifically Azure or GCP
    • Broad knowledge across multiple technologies, including big data solutions, microservices, containers, cloud-based solutions, and integration methodologies
    • Monitoring performance and advising on any necessary infrastructure changes
    • Expertise in graph databases (JanusGraph/Neo4j) and search indexing tools (Apache Solr/Elasticsearch), implementing different search/sort/optimization techniques for API or data delivery performance
  • Agile delivery experience managing multiple pods and multiple upstream and downstream dependencies in standard frameworks such as SAFe is critical; work closely with Scrum Masters and other Agile coaches to drive planning and execution
  • Eagerness to learn new technologies on the fly and ship to production
  • Expert in technical program delivery across cross-functional/LOB teams
  • Expert in driving delivery through collaboration in a highly complex, matrixed environment
  • Possesses strong leadership and negotiation skills
  • Excellent communication skills, both written and verbal
  • Ability to interact with senior leadership teams in IT and the business

 

Preferred

  • Insurance domain expertise especially with Life
  • Spanish fluency is critical
  • Working knowledge of English

 

At MetLife, we’re leading the global transformation of an industry we’ve long defined. United in purpose, diverse in perspective, we’re dedicated to making a difference in the lives of our customers.

At MetLife we are committed to promoting diversity among our collaborators through non-discriminatory treatment regardless of race, gender identity/expression, sexual orientation, religion, age, nationality, marital status, disability, or physical or economic condition. There are no HIV or pregnancy tests as a requirement for entry, permanence, or promotion, and there are equal job opportunities.