
Job has expired

This job post has expired and is no longer accepting new applicants.


DevOps Engineer


Location: United States
Company: Pagos

Job Description

About Us

At Pagos, we’re passionate about empowering businesses to take control of their payments stack and solve the puzzles standing between them and optimized growth. Our global platform provides developers, product teams, and payments leaders with both a deeper understanding of their payments data and access to new payments technology through user-friendly tools that are easy to implement. To succeed in this, we need creative thinkers who are willing to roll up their sleeves and start building alongside us.

About the Role

As a DevOps Engineer, you’ll bridge the gap between operations and development and ensure the seamless integration of new projects and tools. You’ll drive the scaling and efficiency of our data infrastructure to support our growth. Doing so will require collaboration with backend engineers, data analysts, and others; it will be your responsibility to ensure these teams can work efficiently and have the infrastructure necessary to deliver products to our customers today and in the future. Growing fast is great, but it also means our systems need to support constantly increasing load and volume.

We’re seeking an action-oriented and collaborative problem solver who thrives in ambiguity and can take on new challenges with optimism in a fast-moving environment. We value team members who are not only skilled in their area of expertise but are also perpetual learners who are committed to growth and contributing to our collective success.

In this role, you will:

  • Maintain and improve our infrastructure for scale, availability, and performance

  • Keep us safe and secure against cybersecurity threats and compliant with industry requirements

  • Serve as a key resource to our engineers and analysts for projects, including building and implementing necessary infrastructure

  • Help ensure our software engineers, machine learning engineers, and analysts can work efficiently and improve the development and release processes

  • Test and examine systems written by others and analyze the results

  • Play a key role in integrating and setting up connections with data providers and internal teams

  • Drive projects from start to finish with a high level of ownership and autonomy

What We’re Looking For

We’re looking for someone with:

  • 5+ years of DevOps experience working with infrastructure for big data workloads

  • Deep understanding of AWS products and services

  • Experience with Kubernetes, Docker, orchestration tools, ETL/ELT frameworks, and common data transfer protocols

  • Experience maintaining and setting up big data pipelines for time series data

  • Proficiency with Terraform, Git, and GitHub workflows

  • Working knowledge of databases and SQL (e.g., Redshift, Delta Lake, Postgres, Elasticsearch, and Redis)

  • A bias for action and a problem-solving attitude

Nice to haves:

  • Experience supporting the training and management of machine learning models

  • Comfort and/or past experience working with Airflow, dbt, Apache Spark, Apache Kafka, Redshift, and Grafana

  • Experience working in high-growth, venture-backed startup(s)

  • Experience setting up and maintaining PagerDuty to ensure operational uptime

Pagos does not accept unsolicited resumes from third-party recruiting agencies. All interested candidates are encouraged to apply directly.

Advice from our career coach

As a DevOps Engineer at Pagos, you will play a crucial role in maintaining and improving infrastructure for scale, availability, and performance. To stand out as an applicant for this role, you should have 5+ years of DevOps experience with big data workloads, a deep understanding of AWS services, proficiency with tools like Kubernetes, Docker, and Terraform, and experience setting up big data pipelines. Here are some specific tips to help you stand out:

  • Highlight your experience working with big data infrastructures and your understanding of AWS services.
  • Showcase your proficiency with Kubernetes, Docker, and Terraform in managing infrastructure.
  • Demonstrate your experience in setting up big data pipelines, working with ETL/ELT frameworks, and common data transfer protocols.
  • Emphasize your knowledge of databases and SQL, as well as your experience with Git and GitHub workflows.
  • Mention any experience you have supporting machine learning models, working with Airflow, Apache Spark, or Apache Kafka, and setting up PagerDuty for operational uptime.


Please let Pagos know you found this job with RemoteJobs.org. This helps us grow!

About the job

Apr 17, 2024

Full-time

United States
