Sr Data Engineer

Posting Date: 7/02/2023

Locations: Capital Federal, AR

Company: MetLife

At LATAM Data Hub (LDH), our mission is to build the next-generation data lakehouse for MetLife and to help deploy it across various LATAM countries. We have developed a world-class, cloud-native platform to enable reporting, analytics, data supply pipelines, and real-time supply of data to various digital and non-digital channels. The platform leverages cutting-edge open-source and proprietary technologies to create a highly configurable system that can be adapted to individual market needs quickly and at low cost. The platform runs in a fully containerized, elastic cloud environment and is designed to scale to serve millions of users.

We are looking for a Senior Data Engineer with a track record of designing and implementing large, complex technology projects at a global scale. The ideal candidate has a solid foundation in hands-on ETL and analytical warehouse development, understands the complexities of managing end-to-end data pipelines, and has in-depth knowledge of data governance and data management concepts. To be successful in this role, the candidate needs a balance of product-centric technical expertise and the ability to navigate complex deployments involving multiple systems and teams. This role requires interaction with technical staff and senior business and IT partners around the world. This position is also responsible for ensuring operational readiness by incorporating configuration management, exception handling, logging, and the operationalization of end-to-end batch and real-time data pipelines for ingesting, managing, and processing data into the hub.

Key Responsibilities:


  • Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes
  • Create frameworks and developer tools, design training materials, and conduct developer training. Evangelize new ideas, standards, best practices, and solutions within the developer and tech community.
  • Develop quality code with well-thought-out performance optimizations in place from the development stage.
  • Show an appetite for learning new technologies and a readiness to work with cutting-edge cloud technologies.
  • Ingest huge volumes of data from various platforms and write high-performance, reliable, and maintainable ETL code.
  • Interact with business analysts and functional analysts to gather requirements and implement ETL solutions.
  • Provide technical support to project teams throughout the project lifecycle on technology platforms, solution design, security, debugging, profiling, performance tuning, etc.


 Essential Business Experience and Technical Skills:


  • 5+ years of ETL and data warehousing development experience
  • 3+ years of experience designing ETL pipelines and data lakes on cloud or big data platforms
  • Demonstrated experience implementing and deploying scalable, performant data hubs at global scale
  • Demonstrated experience with cutting-edge database technologies and cloud services such as Azure, GCP, Databricks, or Snowflake; deep experience with technologies such as Spark (Scala/Python/Java), ADLS, Kafka, SQL, Synapse SQL, Cosmos DB, and graph databases
  • Hands-on expertise implementing analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API)
  • Strong analytic skills related to working with unstructured datasets
  • Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.
  • Eagerness to learn new technologies on the fly and ship to production
  • Demonstrated experience in Configuration Management, DevOps automation
  • Excellent communication skills: Demonstrated ability to explain complex technical content to both technical and non-technical audiences
  • Experience working on complex, large-scale, multi-team enterprise programs using agile development methodologies
  • Experience in solution implementation, performance testing, and tuning; management and performance tuning of ADLS, Synapse SQL, or GCP BigQuery and GCS, including partitioning and bucketing
  • Bachelor’s degree in computer science or related field
  • Working knowledge of English

At MetLife, we’re leading the global transformation of an industry we’ve long defined. United in purpose, diverse in perspective, we’re dedicated to making a difference in the lives of our customers.