Lead Data Engineer

Posting Date: 28/07/2022

Locations: Capital Federal, AR

Company: MetLife

At LATAM Data Hub (LDH), our mission is to build the next-generation data lakehouse for MetLife and to help deploy it across various LATAM countries. We have developed a world-class, cloud-native platform that enables reporting, analytics, data supply pipelines, and real-time delivery of data to digital and non-digital channels. The platform leverages cutting-edge open-source and proprietary technologies to create a highly configurable system that can be adapted to individual market needs quickly and at low cost. The platform runs in a fully containerized, elastic cloud environment and is designed to scale to serve millions of users.

We are looking for a Lead Data Engineer with a track record of designing and implementing large, complex technology projects at global scale. The ideal candidate has a solid foundation in hands-on ETL and analytical warehouse development, understands the complexities of managing end-to-end data pipelines, and has in-depth knowledge of data governance and data management concepts. Success in this role requires a balance of product-centric technical expertise and the ability to navigate complex deployments involving multiple systems and teams. The role includes interaction with technical staff and senior business and IT partners around the world, engagement with multiple concurrent teams and projects, and oversight and leadership of technical staff.

This position is also responsible for ensuring operational readiness by incorporating configuration management, exception handling, logging, and end-to-end operationalization of the batch and real-time data pipelines that ingest, manage, and process data into the hub.

Key Responsibilities:

  • Research, evaluate, document, and maintain standards, best practices, and design patterns around usage, testing, security, performance, and other aspects of existing and emerging ETL technologies on cloud and big data platforms
  • Lead and drive operational delivery, changes, and maintenance of the existing application
  • Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes
  • Create frameworks and developer tools, design training materials, and conduct developer training; evangelize new ideas, standards, best practices, and solutions with the developer and tech community
  • Develop quality code with well-considered performance optimizations in place at the development stage
  • Learn new technologies and be ready to work with new, cutting-edge cloud technologies
  • Ingest large volumes of data from various platforms as needed, writing high-performance, reliable, and maintainable ETL code
  • Interact with business analysts and functional analysts to gather requirements and implement ETL solutions
  • Analyze local business needs, local country IT systems, and local market needs (including locale and data regulatory requirements), and present proposals covering a high-level technical solution, an estimate, and a project plan
  • Provide technical support to project teams throughout the project lifecycle on technology platforms, solution design, security, debugging, profiling, performance tuning, etc.
  • Provide governance over project teams to ensure standards and best practices are being followed.

Essential Business Experience and Technical Skills:

Required:

  • 8+ years of ETL and data warehousing development experience
  • 3+ years of experience designing ETL and data lakes on cloud or big data platforms
  • Demonstrated experience with designing, implementing, and deploying scalable and performant data hubs at global scale
  • Demonstrated experience in cutting-edge database technologies and cloud services such as Azure, GCP, Databricks, or Snowflake; deep experience in technologies such as Spark (Scala/Python/Java), ADLS, Kafka, SQL, Synapse SQL, Cosmos DB, and graph databases
  • Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API)
  • Strong analytic skills related to working with unstructured datasets
  • Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.
  • Eagerness to learn new technologies on the fly and ship to production
  • Demonstrated experience in configuration management and DevOps automation
  • Excellent communication skills: Demonstrated ability to explain complex technical content to both technical and non-technical audiences
  • Experience working on complex, large-scale, multi-team enterprise programs using agile development methodologies
  • Experience in solution design, data modeling, and performance testing and tuning; ADLS, Synapse SQL, or GCP BigQuery and GCS management and performance tuning (partitioning/bucketing)
  • Bachelor’s degree in computer science or a related field

Preferred:

  • Working knowledge of English
  • Experience designing multi-tenant SaaS systems
  • Expertise in Python and experience writing Azure Functions using Python/Node.js
  • Experience using Event Hub for data integrations.
  • Able to define and prioritize stories

At MetLife, we’re leading the global transformation of an industry we’ve long defined. United in purpose, diverse in perspective, we’re dedicated to making a difference in the lives of our customers.