Sr Data Ops Engineer - Montevideo, Uruguay - dLocal

dLocal
Verified company
Montevideo, Uruguay

1 week ago

Posted by: Saúl de la Cruz, beBee Recruiter

Description

Why should you join dLocal?
dLocal enables the biggest companies in the world to collect payments in 40 emerging-market countries. Global brands rely on us to increase conversion rates and simplify payment expansion.

As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world's fastest-growing emerging markets.


By joining us you will be a part of an amazing global team that makes it all happen, in a flexible, remote-first dynamic culture with travel, health, and learning benefits, among others.

Being a part of dLocal means working with 600+ teammates from 25+ different nationalities and developing an international career that impacts millions of people's daily lives.

We are builders: we never run from a challenge, and we are customer-centric. If this sounds like you, we know you will thrive on our team.


What's the opportunity?


We are looking for a Data Ops Engineer with working experience in fast-paced environments, a high tolerance for ambiguity, and a passion for constant learning.

We're looking for motivated, adaptive people who enjoy the challenge of working in a team to develop solutions.


They must be able to interact with customers and not only perform technical work but also understand needs, extract requirements, and design and propose solutions.


What will I be doing?


  • Developing Python components, Terraform modules and pipelines to integrate and automate various processes and tools.
  • Promoting the use of APIs from distinct systems to extract and update data, trigger and monitor processes, and help tie infrastructure together.
  • Deploying, maintaining and overseeing cloud infrastructure to ensure it runs with the reliability and performance our customers expect.
  • Helping to create Data Models, best practices, and technical documents for our users.
  • Developing best practices, policies, and processes regarding DevOps and DataOps.
  • Helping to identify opportunities, generate innovative solutions, and improve existing product features.

What skills do I need?


  • Be strong in Python and/or Bash, and speak SQL as a second language.
  • Be very familiar with databases, data warehouses and data lakes.
  • Be familiar with development tools (Terraform, GitHub, Docker, dbt).
  • Be knowledgeable of AWS environments (EC2, Lambda, IAM, SQS, RDS, Kinesis, Glue, EKS, etc.)
  • Be adept at data pipelines and process improvement.
  • Work collaboratively but also be able to own a project with little guidance.
  • Thrive in an environment with constant and quick iterations.
  • Seek out creative solutions to challenging problems.
  • Have strong attention to detail.
  • Like to work in a diverse environment and be comfortable with nonhierarchical organizations.
  • AWS Associate- or Professional-level certifications (Solutions Architect, Developer, DevOps Engineer) are a plus.

What happens after you apply?
Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal.
