
Sr API Developer

Posting Date: 31/01/2023

Locations: Capital Federal, AR

Company: MetLife

With more than 150 years in business and a presence in more than 40 countries, MetLife is leading the global transformation of the insurance industry. United in purpose and with diverse perspectives, we are a collaborative community of more than 40,000 colleagues worldwide, committed to building a more confident future for all of our stakeholders — employees, customers, shareholders, and the communities we serve.

What you will do in this role…

  • Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes 
  • Create frameworks and developer tools, design training materials, and conduct developer training; evangelize new ideas, standards, best practices, and solutions within the developer and tech community 
  • Develop quality code with well-considered performance optimizations in place from the development stage 
  • Learn new technologies eagerly and be ready to work with cutting-edge cloud technologies 
  • Ingest huge volumes of data from various platforms as needed, writing high-performance, reliable, and maintainable ETL code 
  • Interact with business analysts and functional analysts to gather requirements and implement ETL solutions 
  • Provide technical support to project teams throughout the project lifecycle around technology platforms, solution design, security, debugging, profiling, performance tuning, etc. 


To help you succeed, you need to have…

  • 5+ years of ETL and data warehousing development experience  
  • 3+ years of experience designing ETL and data lakes on cloud or big data platforms  
  • Demonstrated experience implementing and deploying scalable, performant data hubs at global scale 
  • Demonstrated experience with cutting-edge database technologies and cloud services such as Azure, GCP, Databricks, or Snowflake; deep experience with technologies such as Spark (Scala/Python/Java), ADLS, Kafka, SQL, Synapse SQL, Cosmos DB, and graph DBs (in Azure, use Spark) - Mandatory 
  • Hands-on expertise implementing analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API) 
  • Strong analytic skills related to working with unstructured datasets 
  • Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance. 
  • Eagerness to learn new technologies on the fly and ship to production 
  • Demonstrated experience in configuration management and DevOps automation 
  • Excellent communication skills: Demonstrated ability to explain complex technical content to both technical and non-technical audiences 
  • Experience working on complex, large-scale, multi-team enterprise programs using agile development methodologies 
  • Experience in solution implementation, performance testing, and tuning; management and performance tuning (partitioning/bucketing) of ADLS, Synapse SQL database, or GCP BigQuery and GCS 
  • Bachelor’s degree in computer science or related field 


And it will be a plus if you…

  • Have a working knowledge of English  


The benefits we offer…

  • Hybrid work mode
  • Learning and development programs
  • Discounts at universities
  • Health insurance for the family group
  • In-company gym
  • Child Care Reimbursement
  • Connectivity Reimbursement
  • Day off for birthdays
  • Cultural Heritage Day off
  • Flex Fridays
  • Healthy breakfast with seasonal fruits
  • Garage
  • Agreement with SportClub for the employee and direct relatives.
  • Agreement with "Club de Beneficios" for purchases of grocery products.


Are you passionate about bringing your unique background and skills to deliver data-driven decisions across critical business functions alongside a team of talented, diverse people? If you are looking for personal growth and a chance to make an impact, we invite you to apply to this role.


Join MetLife and let’s find out what we can build together!