
Kafka Developer & Administrator

Gradient Cyber LLC

Location: India

Job Description

Overview:

We are seeking a talented Kafka Developer & Administrator to join our team. In this role, you will be responsible for developing, deploying, and maintaining Apache Kafka infrastructure and components. You will play a key role in designing and implementing Kafka-based solutions to support our data processing, streaming, and messaging needs.

About Gradient Cyber:

Gradient Cyber is a leading player in the cyber security industry, dedicated to delivering innovative solutions that transform businesses. We focus on providing 24/7 protection with our Managed Extended Detection and Response (MXDR) services, specifically designed for the mid-market. Our customer-centric approach and innovative product roadmap set us apart in the industry. At Gradient Cyber, we are committed to fostering an environment that promotes individual growth and success. We recruit exceptional individuals who are eager to excel, contribute innovative ideas, and advance within a company brimming with possibilities. Join us and play a pivotal role in shaping the future of our product portfolio and customer experience.

Key Responsibilities:

  • Kafka Development: Design, develop, and deploy Kafka-based solutions for real-time data processing, streaming, and messaging applications.
  • Component Maintenance: Administer and maintain Kafka clusters, brokers, topics, partitions, and other components to ensure optimal performance, reliability, and scalability.
  • Performance Tuning: Proactively monitor Kafka clusters and fine-tune configurations to optimize performance, throughput, and latency.
  • High Availability and Fault Tolerance: Implement and manage Kafka replication, mirroring, and clustering to ensure high availability (HA) and fault tolerance.
  • Data Integration: Integrate Kafka with other data systems, databases, and applications to facilitate seamless data ingestion, transformation, and consumption.
  • Security: Configure and enforce security policies, authentication, authorization, and encryption mechanisms to protect data in transit and at rest within Kafka clusters.
  • Monitoring and Alerting: Set up monitoring tools and alerts to detect and respond to issues, anomalies, and performance bottlenecks in Kafka infrastructure.
  • Backup and Recovery: Develop and maintain backup and recovery procedures to safeguard data integrity and recover from failures or disasters.
  • Documentation: Document Kafka configurations, deployment procedures, best practices, and troubleshooting guides for knowledge sharing and reference.
  • Collaboration: Collaborate with development teams, data engineers, system administrators, and other stakeholders to understand requirements, design solutions, and troubleshoot issues.
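Much of the monitoring work described above reduces to simple offset arithmetic: a consumer group's lag on a partition is the broker's log-end offset minus the group's last committed offset. As a purely illustrative sketch (plain Python, no broker or client library required; all names and data are hypothetical):

```python
def consumer_lag(end_offsets, committed_offsets):
    """Compute per-partition consumer lag.

    end_offsets: {partition: log-end offset} as reported by the broker
    committed_offsets: {partition: last committed consumer offset}
    A partition with no committed offset is treated as fully lagging.
    """
    return {
        p: end - committed_offsets.get(p, 0)
        for p, end in end_offsets.items()
    }

# Hypothetical snapshot: partition 0 is caught up, partition 1 is 50 behind.
lag = consumer_lag({0: 100, 1: 250}, {0: 100, 1: 200})
```

In practice these offsets would come from the Kafka admin/consumer APIs or a tool such as `kafka-consumer-groups.sh`; the arithmetic is the same either way.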

Requirements:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
  • Proven experience as a Kafka Developer or Administrator in a production environment.
  • Strong understanding of Kafka architecture, messaging paradigms, and distributed systems concepts.
  • Proficiency in Kafka configuration, deployment, and performance tuning.
  • Experience with Kafka APIs, producers, consumers, Kafka Connect, and Kafka Streams for data integration and processing.
  • Familiarity with Kafka security mechanisms (e.g., SSL/TLS, SASL, ACLs) and best practices.
  • Hands-on experience with Kafka monitoring and management tools (e.g., Confluent Control Center, Kafka Manager, Prometheus, Grafana).
  • Excellent troubleshooting and problem-solving skills, with the ability to analyze complex issues in distributed environments.
  • Strong scripting and automation skills (e.g., Bash, Python) for managing Kafka infrastructure and deployments.
  • Good communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
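The scripting and automation skills listed above typically go toward health checks like the ones a Kafka administrator runs daily; one common check flags under-replicated partitions by comparing each partition's replica set with its in-sync replica (ISR) set. A minimal sketch, with the data shape loosely modeled on `kafka-topics.sh --describe` output (all values hypothetical):

```python
def under_replicated(partitions):
    """Return ids of partitions whose ISR is a proper subset of the replicas.

    partitions: list of dicts like
      {"partition": 0, "replicas": [1, 2, 3], "isr": [1, 2]}
    """
    return [
        p["partition"]
        for p in partitions
        if set(p["isr"]) < set(p["replicas"])
    ]

topic_state = [
    {"partition": 0, "replicas": [1, 2, 3], "isr": [1, 2, 3]},  # healthy
    {"partition": 1, "replicas": [1, 2, 3], "isr": [1]},        # under-replicated
]
```

A check like this would normally be wired into the monitoring stack (e.g., Prometheus alerts) rather than run by hand.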

Preferred Qualifications:

  • Kafka certification (e.g., Confluent Certified Developer).
  • Experience with Apache ZooKeeper, a distributed coordination service used by Kafka.
  • Knowledge of cloud platforms and services (e.g., AWS, Azure, Google Cloud) and their integration with Kafka.
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools.
  • Experience with stream processing frameworks (e.g., Apache Flink, Apache Spark Streaming).

Equal Opportunity Statement:

Gradient Cyber is an equal-opportunity employer. We are committed to creating a diverse and inclusive workplace where all employees feel valued and respected. We encourage applications from individuals of all backgrounds and experiences.


Please let Gradient Cyber LLC know you found this job with RemoteJobs.org. This helps us grow!

About the job

Posted: May 25, 2024
Full-time
Location: India
