Marigold helps brands foster customer relationships through the science and art of connection. Marigold Relationship Marketing is a suite of world-class martech solutions that help marketers create long-term customer love and loyalty. Marigold’s products span Messaging, Loyalty and Experiences marketing offerings, serving a customer base across three segments: Enterprise, Professional and Commercial. Marigold provides the most comprehensive set of use cases for marketers at any level. Headquartered in Nashville, TN, Marigold has offices globally across the United States, Europe, Australia, New Zealand, Malaysia, India, South America and Central America, as well as in Japan.
The Role
We're looking for a skilled Senior Software Engineer with polyglot experience building software at scale across projects. As you’ll be joining a Data Engineering team, we’re hoping to see a breadth of experience in Python. You should have a solid grasp of software testing principles and particularly strong database skills. Some familiarity with frontend technologies and containerisation (Docker, Kubernetes) is a plus. This role involves collaborating with product, design and other teams, owning the stack, and participating in an on-call rotation.
While working a standard NZ day, you’ll enjoy the benefits of working with an international team that follows the sun. You'll also get to collaborate with a Principal and Product Owners in the US. The Data Engineering team is dedicated to empowering customers to make informed decisions about their marketing strategies through intuitive and reliable access to data. As a Senior Software Engineer at Sailthru, you'll be pivotal in steering data-driven decision-making processes across various departments and shaping a modern data architecture. Responsibilities include analyzing extensive datasets, crafting data models, implementing ETL and ELT flows, and working on capabilities that deliver insights to stakeholders.
Responsibilities:
Develop and maintain Java and Python services in a distributed architecture.
Collaborate with teams to design, implement, and deploy highly scalable solutions.
Write and maintain comprehensive unit and integration tests for the software you produce.
Develop and maintain solutions across the data stack, applying strong Java and database skills with technologies such as MongoDB (must have), Databricks, and Postgres.
Debug flows confidently in a complex environment, including troubleshooting full-stack or eventing issues across technologies such as React and Kafka.
Debug and tune database queries and systems.
Be part of our regular on-call rotation with the other team members.
Requirements:
Whilst we do not expect you to have all of the following, we’re looking for an exceptional engineer who would tick most of these boxes:
Bachelor's degree in Computer Science or relevant experience.
5+ years of proficiency in Python and at least one other programming language.
Experience with object-oriented and functional programming patterns.
Experience in working with Agile or Lean teams.
Experience with Git, software testing principles, and continuous integration.
Experience or familiarity with Docker and Kubernetes.
Experience developing software that integrates with both NoSQL and relational databases.
Experience with, or desire to learn, Airflow and Databricks technologies to implement Spark jobs.
Exposure to working with event streaming or publish/subscribe technologies such as Kafka.
Excellent problem-solving and debugging skills, with the ability to solve hard problems in a collaborative environment.
Deep experience with AWS is desirable (EC2, S3, Lambda, Redshift).
EKS or other Kubernetes experience.
Experience with observability tools (e.g. ELK/OpenSearch, Grafana, Datadog, CloudWatch).
Familiarity with machine learning techniques and pipelines.
Strong knowledge of SQL.
If you're enthusiastic about building scalable and resilient software solutions, come be a part of our pioneering Data Engineering team!