
Job has expired

This job post has expired and is no longer accepting new applicants.


Senior Data Engineer


Company: Voyc
Location: AO, BF + 58 more (full country list under About the job below)

Job Description

Working At Voyc

Voyc, an award-winning leader in contact centre AI software, is seeking a Data Engineer with expertise in stream processing pipelines and a deep understanding of Elasticsearch to join our innovative and dynamic team. Our mission is to help financial services companies enhance customer service and ensure compliance by revolutionising the contact centre quality assurance process through cutting-edge AI technology. By transcribing and analysing customer interactions, our AI solution identifies potential problems, enabling us to handle customer complaints and regulatory compliance breaches efficiently. Our solution empowers financial institutions to monitor 100% of customer interactions, a critical need in an industry where compliance and customer service are paramount.

Responsibilities

As a Data Engineer at Voyc, specialising in analytics pipelines and Elasticsearch, you will play a pivotal role in advancing our data infrastructure and analytics capabilities. Your responsibilities will include:

  • Designing, implementing, and maintaining robust data pipelines, ensuring the efficient and reliable flow of data across our systems.
  • Developing and maintaining Elasticsearch clusters, fine-tuning them for high performance and scalability.
  • Collaborating with cross-functional teams to extract, transform, and load (ETL) data into Elasticsearch for advanced analytics and search capabilities.
  • Troubleshooting data pipeline and Elasticsearch issues, ensuring the integrity and availability of data for analytics and reporting.
  • Participating in the design and development of data models and schemas to support business requirements (an illustrative mapping sketch follows this list).
  • Continuously monitoring and optimising data pipeline and Elasticsearch performance to meet growing data demands.
  • Collaborating with data scientists and analysts to enable efficient data access and query performance.
  • Contributing to the evaluation and implementation of new technologies and tools that enhance data engineering capabilities.
  • Demonstrating strong analytical, problem-solving, and troubleshooting skills to address data-related challenges.
  • Collaborating effectively with team members and stakeholders to ensure data infrastructure aligns with business needs.
  • Embodying the company values of playing to win, putting people over everything, driving results, pursuing knowledge, and working together.
  • Implementing standards, conventions and best practices.
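
To make the schema-design responsibility above a little more concrete, here is a minimal, hypothetical sketch of creating an Elasticsearch index with an explicit mapping for transcribed call segments, using the Python Elasticsearch client (8.x-style API). The index name and field names are illustrative assumptions, not Voyc's actual schema.

    # Illustrative sketch only: create a hypothetical index for transcribed
    # call segments with an explicit mapping. Names are assumptions.
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    es.indices.create(
        index="call-transcripts",              # hypothetical index name
        settings={"number_of_shards": 3, "number_of_replicas": 1},
        mappings={
            "properties": {
                "interaction_id": {"type": "keyword"},   # exact-match identifier
                "agent_id":       {"type": "keyword"},
                "started_at":     {"type": "date"},
                "transcript":     {"type": "text"},      # analysed for full-text search
                "risk_flags":     {"type": "keyword"},   # e.g. compliance labels
                "sentiment":      {"type": "float"},
            }
        },
    )

An explicit mapping like this keeps free-text fields analysed for search while identifiers and labels remain exact-match keyword fields, which is usually what downstream analytics and reporting queries need.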

Our Stack

As a Data Engineer with a focus on Kafka pipelines and Elasticsearch, you will work with the following technologies (a brief sketch of how these pieces fit together follows the lists below):

Data Pipelines:

  • Kafka / ksqlDB
  • Python
  • Redis

Data Storage and Analysis:

  • Elasticsearch (cluster management and optimisation)
  • AWS S3
  • PostgreSQL

DevOps:

  • AWS
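
To illustrate how these pieces commonly fit together, here is a minimal, hypothetical sketch of a Python service that consumes transcript events from Kafka and bulk-indexes them into Elasticsearch. The topic name, consumer group, index name, and document fields are assumptions for illustration, not Voyc's actual configuration.

    # Illustrative sketch only: a minimal Kafka -> Elasticsearch indexing loop.
    # Topic, group, index, and field names are hypothetical.
    import json

    from kafka import KafkaConsumer              # kafka-python
    from elasticsearch import Elasticsearch, helpers

    ES_INDEX = "call-transcripts"                # hypothetical index name

    consumer = KafkaConsumer(
        "voyc.transcripts",                      # hypothetical topic name
        bootstrap_servers="localhost:9092",
        group_id="transcript-indexer",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        enable_auto_commit=False,                # commit only after a successful write
    )
    es = Elasticsearch("http://localhost:9200")

    def to_action(record):
        """Map a consumed Kafka record to an Elasticsearch bulk action."""
        doc = record.value
        return {
            "_index": ES_INDEX,
            "_id": doc["interaction_id"],        # hypothetical field; stable id keeps indexing idempotent
            "_source": doc,
        }

    while True:
        # Poll a batch of messages, bulk-index them, then commit offsets.
        batch = consumer.poll(timeout_ms=1000, max_records=500)
        actions = [to_action(r) for records in batch.values() for r in records]
        if actions:
            helpers.bulk(es, actions)
            consumer.commit()

Committing offsets only after the bulk write succeeds, together with a stable document _id, gives at-least-once delivery with effectively idempotent indexing if messages are ever replayed.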

Requirements

Skills and Requirements

To excel in this role, you should possess the following qualifications and skills:

  • Proven experience in designing and implementing data pipelines.
  • Experience with end-to-end testing of analytics pipelines.
  • Expertise in managing and optimising Elasticsearch clusters, including performance tuning and scalability.
  • Strong proficiency with data extraction, transformation, and loading (ETL) processes.
  • Familiarity with data modeling and schema design for efficient data storage and retrieval.
  • Good programming and scripting skills using languages like Python, Scala, or Java.
  • Knowledge of DevOps and automation practices related to data engineering.
  • Excellent communication and collaboration skills to work effectively with cross-functional teams.
  • Strong analytical and problem-solving abilities, with a keen attention to detail.
  • A commitment to staying up-to-date with the latest developments in data engineering and technology.
  • Alignment with our company values and a dedication to driving positive change through data.

Bonus Points

These are not required, but they will count in your favour:

  • Experience with data engineering in an agile / scrum environment.
  • Familiarity with ksqlDB / Kafka or other stream processing frameworks.
  • Familiarity with data lakes and how to query them.
  • Experience with integrating machine learning models into data pipelines.
  • Familiarity with other data-related technologies and tools.

Benefits

What’s in it for You?

  • Flexible remote work options
  • Competitive compensation package
  • Impactful work that directly benefits customers and companies
  • Inclusive and diverse work environment
  • Opportunity to make a lasting difference while advancing your career
  • Highly regarded company culture
  • Investment in upskilling and professional development

If you're a passionate Data Engineer with expertise in Kafka pipelines and a thorough understanding of Elasticsearch, looking to contribute to cutting-edge technology and make a difference in the financial services industry, we invite you to join our motivated and purpose-driven team at Voyc. Apply now to be part of our journey in transforming customer interactions and compliance monitoring through data innovation.

Advice from our career coach

To excel as a Data Engineer at Voyc, specializing in analytics pipelines and Elasticsearch, you should know the following:

  • Proven experience with data pipelines and end-to-end testing of analytics pipelines.
  • Expertise in managing and optimizing Elasticsearch clusters, including performance tuning.
  • Strong proficiency in ETL processes, data modeling, and schema design.
  • Programming skills in Python, Scala, or Java and knowledge of DevOps practices.
  • Excellent communication and collaboration skills to work effectively across teams.

Stand out as an applicant by highlighting the following:

  • Showcase your experience with Kafka pipelines and Elasticsearch in your resume and cover letter.
  • Demonstrate your problem-solving skills with examples in your past projects or roles.
  • Highlight any experience with data engineering in an agile/scrum environment or familiarity with stream processing frameworks.
  • Emphasize your commitment to staying updated with the latest developments in data engineering and technology.
  • Connect your values with the company's values and showcase how you can drive positive change through data.


Please let Voyc know you found this job with RemoteJobs.org. This helps us grow!

About the job

Sep 14, 2024

Full-time

Locations:

  1. AO Angola
  2. BF Burkina Faso
  3. BI Burundi
  4. BJ Benin
  5. BW Botswana
  6. CD Congo - Kinshasa
  7. CF Central African Republic
  8. CG Congo - Brazzaville
  9. CI Côte d’Ivoire
  10. CM Cameroon
  11. CV Cape Verde
  12. DJ Djibouti
  13. DZ Algeria
  14. EG Egypt
  15. EH Western Sahara
  16. ER Eritrea
  17. ET Ethiopia
  18. GA Gabon
  19. GH Ghana
  20. GM Gambia
  21. GN Guinea
  22. GQ Equatorial Guinea
  23. GW Guinea-Bissau
  24. IO British Indian Ocean Territory
  25. KE Kenya
  26. KM Comoros
  27. LR Liberia
  28. LS Lesotho
  29. LY Libya
  30. MA Morocco
  31. MG Madagascar
  32. ML Mali
  33. MR Mauritania
  34. MU Mauritius
  35. MW Malawi
  36. MZ Mozambique
  37. NA Namibia
  38. NE Niger
  39. NG Nigeria
  40. RE Réunion
  41. RW Rwanda
  42. SC Seychelles
  43. SD Sudan
  44. SH St. Helena
  45. SL Sierra Leone
  46. SN Senegal
  47. SO Somalia
  48. SS South Sudan
  49. ST São Tomé & Príncipe
  50. SZ Eswatini
  51. TD Chad
  52. TF French Southern Territories
  53. TG Togo
  54. TN Tunisia
  55. TZ Tanzania
  56. UG Uganda
  57. YT Mayotte
  58. ZA South Africa
  59. ZM Zambia
  60. ZW Zimbabwe