We are looking for a skilled Kafka Engineer with hands-on experience in Confluent Platform to design, build, and maintain high-performance, real-time data streaming pipelines. The ideal candidate will have expertise in Kafka ecosystem components and be well-versed in deploying and scaling Kafka clusters, both on-premises and in the cloud.
Key Responsibilities
Design, build, and manage Kafka and Confluent Platform infrastructure for high-throughput streaming applications
Implement Kafka Connect, Schema Registry, ksqlDB (formerly KSQL), and Flink SQL
Ensure high availability of Kafka clusters through performance tuning and monitoring
Collaborate with data engineers and backend teams to integrate streaming pipelines with data lakes and data warehouses
Set up and enforce Kafka security mechanisms (SASL, SSL, ACLs, RBAC); a brief sketch follows this list
Create and manage CI/CD pipelines for Kafka components using tools such as Git and Jenkins
Monitor and troubleshoot issues using tools such as Confluent Control Center and Datadog
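To give a concrete flavor of the security work above, here is a minimal Java sketch of granting a topic-level ACL through Kafka's Admin API. The bootstrap address, SASL credentials, principal, and topic name are hypothetical placeholders, and the sketch assumes a cluster that already exposes a SASL_SSL listener.

```java
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class GrantTopicAcl {
    public static void main(String[] args) throws Exception {
        // Hypothetical SASL_SSL client configuration; values depend on the actual cluster.
        Properties props = new Properties();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "broker-1.internal:9093");
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"admin\" password=\"admin-secret\";");

        try (AdminClient admin = AdminClient.create(props)) {
            // Allow a hypothetical application principal to read the "orders" topic from any host.
            AclBinding readOrders = new AclBinding(
                    new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL),
                    new AccessControlEntry("User:analytics-app", "*",
                            AclOperation.READ, AclPermissionType.ALLOW));

            // Apply the binding and wait for the brokers to acknowledge it.
            admin.createAcls(List.of(readOrders)).all().get();
        }
    }
}
```

In practice this kind of ACL management is usually scripted and run from the CI/CD pipelines mentioned above rather than applied by hand.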
Required Skills
Experience in Apache Kafka and Confluent Platform
Experience with Kafka Connect, Kafka Streams, Schema Registry, ksqlDB, and Flink SQL
Strong background in Java or Python
Good understanding of stream processing and event-driven architecture (a short Kafka Streams sketch follows)
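As an illustration of the stream-processing and Java skills listed above, the following is a minimal Kafka Streams topology, a sketch assuming string-encoded order events; the topic names, application id, and filter condition are illustrative only.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PaidOrdersApp {
    public static void main(String[] args) {
        // Hypothetical application and broker settings; string serdes for both keys and values.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "paid-orders-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Read order events, keep only those marked as paid, and forward them downstream.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");
        orders.filter((key, value) -> value != null && value.contains("\"status\":\"PAID\""))
              .to("paid-orders");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The same filter could equally be expressed as a ksqlDB or Flink SQL statement; Kafka Streams is shown here simply because it stays in plain Java.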