Data Engineer

Company: CREATEQ
Location: Romania

Job Description

You will join our agile, distributed team of software engineers, responsible for setting the strategy for data engineering and developing guidelines across our streaming and data warehouse technologies.

The vision: making zero-carbon happen!

Your key responsibilities will include:

  • Engaging with the data engineering community and with engineers in product teams, providing advice, guidance, and tool recommendations
  • Building and maintaining integration pipelines that power our automated journeys
  • Transforming streaming data to meet target schemas (see the sketch after this list)
  • Being part of an agile engineering team where you will have the opportunity to influence technology selection
  • Establishing good data engineering practices: using infrastructure as code, contributing to automated testing strategies, setting up monitoring and alerting tools, and employing CI/CD best practices to deploy to production regularly
  • Working with key stakeholders to understand their data needs and delivering solutions that provide them with high-quality data, allowing teams to realise their objectives.
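
For illustration only, not part of the role description: a minimal sketch of the kind of streaming transformation mentioned above, written with Kafka Streams in Java (one of the tools and languages named in the requirements). The topic names, the plain-string values, and the trivial cleanup transform are assumptions invented for this example; a real pipeline would map records into a concrete target schema (e.g. Avro or Protobuf) instead.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    import java.util.Properties;

    public class ReadingsTransform {
        public static void main(String[] args) {
            // Basic Streams configuration; the application id and broker address are placeholders.
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "readings-transform");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Read raw events, drop empty records, normalise the values,
            // and write the result to the topic downstream consumers read from.
            KStream<String, String> raw = builder.stream("raw-readings");
            raw.filter((key, value) -> value != null && !value.isBlank())
               .mapValues(String::trim)
               .to("clean-readings");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }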

Requirements

Who we are looking for:

Our future colleague should be a passionate data engineer and a trustworthy professional, with a sense of ownership, a commitment to deliver, a desire to make an impact, and a collaborative mindset. We welcome teammates who enjoy experimenting with data engineering technologies.

The following key skills and experience are required:

  • 4+ years of experience in designing, building, monitoring and managing large-scale data products, pipelines, tooling and platforms
  • A proven track record as a Data Engineer, setting strategy and defining ways of working
  • Experience working on streaming ETL solutions using stream-processing tools (e.g. Kafka Streams, Kinesis, or Spark)
  • Experience developing cloud-based solutions on GCP (preferred), AWS, or Azure using infrastructure-as-code tools such as Terraform
  • Excellent knowledge of at least one programming language, e.g. Scala, Python, TypeScript, Java, or Kotlin
  • Experience designing and building data pipelines using Dataform, Data Fusion, or similar tools
  • An understanding that building quality software is essential, and an appreciation of automation and continuous delivery
  • A love for building scalable, resilient solutions and an enthusiasm for influencing the team’s technology selection and architectural direction
  • Comfort working in an agile software development environment, with experience of CI/CD and deployment strategies
  • Adaptability and flexibility when dealing with change and ambiguity
  • Ability to interpret and communicate information clearly and concisely to both technical and non-technical team members
  • An open and friendly personality, excellent interpersonal and teamwork skills, a problem-solving mindset, and a drive for self-improvement
  • Excellent verbal and written communication skills in English
  • A bachelor’s or higher degree in computer science (or equivalent).

Benefits

  • Challenging projects in a highly professional yet collaborative and supportive environment
  • Working in small, highly skilled teams
  • Opportunity for long-term professional growth within our development center
  • Competitive compensation depending on experience and skills
  • Respect and support for your professional, family and personal goals.

Advice from our career coach

As someone who has been in the industry for quite some time, I can say that this role for a data engineer with a focus on zero-carbon initiatives is incredibly exciting and impactful. To stand out as an applicant, here are some key tips:

  • Highlight your experience in designing, building, monitoring, and managing large-scale data products and pipelines
  • Showcase your proven track record as a data engineer, setting strategy and defining ways of working
  • Emphasize your experience with streaming ETL solutions and tools like Kafka Streams, Kinesis, or Spark
  • Demonstrate your proficiency in cloud-based solutions and infrastructure-as-code tools like Terraform
  • Discuss your expertise in programming languages like Scala, Python, TypeScript, Java, or Kotlin
  • Illustrate your understanding of data engineering best practices, automation, and continuous delivery
  • Highlight your ability to work in Agile environments, experience with CI/CD, and adaptability to change
  • Showcase your excellent communication skills and ability to work collaboratively in teams
  • Be sure to mention your academic background in computer science or a related field

Apply for this job

Please let CREATEQ know you found this job with RemoteJobs.org. This helps us grow!
