Job Description
IT Specialist - Oracle CDC, Kafka Connectors & Docker
Job Overview
We are seeking a skilled IT Specialist with expertise in Oracle Change Data Capture (CDC), Kafka topics, event streams, and running Kafka connectors on Docker containers. The ideal candidate will design, implement, and maintain robust data integration solutions to support our real-time data processing needs.
Key Responsibilities
- Configure and manage Oracle CDC to capture and process real-time data changes.
- Design and maintain Kafka topics and event streams for efficient data flow.
- Deploy and operate Kafka connectors within Docker containers for seamless integration.
- Monitor and optimize performance of data pipelines and streaming processes.
- Collaborate with cross-functional teams to ensure data integrity and system scalability.
- Troubleshoot and resolve issues related to data streaming and containerized environments.
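As an illustration of the CDC-to-Kafka work described above, the sketch below builds a source-connector configuration of the kind a candidate would register with a Kafka Connect worker. It assumes Debezium's Oracle connector (a common open-source choice for Oracle CDC via LogMiner); all hostnames, credentials, and table names are placeholders.

```python
import json

# Hypothetical Debezium Oracle source-connector config (LogMiner capture mode).
# Every connection detail below is a placeholder, not a real endpoint.
connector_config = {
    "name": "oracle-cdc-source",
    "config": {
        # Debezium's Oracle source connector class.
        "connector.class": "io.debezium.connector.oracle.OracleConnector",
        "database.hostname": "oracle-host",            # placeholder host
        "database.port": "1521",
        "database.user": "c##dbzuser",                 # placeholder CDC user
        "database.password": "changeme",               # placeholder
        "database.dbname": "ORCLCDB",                  # placeholder CDB name
        "topic.prefix": "oracle",                      # prefix for emitted Kafka topics
        # LogMiner is Debezium's default adapter for Oracle; "xstream" is the alternative.
        "database.connection.adapter": "logminer",
        "table.include.list": "INVENTORY.CUSTOMERS",   # placeholder table
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-changes.oracle",
    },
}

# This JSON payload would be POSTed to the Connect worker's REST API,
# e.g. POST http://connect:8083/connectors
payload = json.dumps(connector_config, indent=2)
print(payload)
```

Registering the connector is then a single REST call against the worker; the worker creates the CDC tasks and begins streaming change events to topics named under the `topic.prefix`.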
Required Skills and Qualifications
- Bachelor's degree in Computer Science, IT, or a related field (or equivalent experience).
- 5+ years of experience with Oracle CDC for real-time data capture.
- Strong knowledge of Oracle databases, specifically CDC capabilities such as LogMiner and the XStream API.
- Strong knowledge of Apache Kafka, including topic management and event streaming.
- Proficiency in deploying and managing Kafka connectors in Docker containers.
- Ability to deploy Java monitoring for Kafka connectors through JMX.
- Familiarity with container orchestration tools (e.g., Kubernetes) is a plus.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
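For the Docker and JMX requirements above, a typical setup looks like the docker-compose excerpt below. It follows the environment-variable conventions of Confluent's `cp-kafka-connect` image; the image tag, hostnames, and port choices are assumptions to verify against the image actually in use, and required worker settings (group ID, converter and storage-topic config) are omitted for brevity.

```yaml
# Excerpt: Kafka Connect worker container with JMX exposed for monitoring.
connect:
  image: confluentinc/cp-kafka-connect:7.5.0   # assumed image/tag
  ports:
    - "8083:8083"   # Connect REST API
    - "9999:9999"   # JMX
  environment:
    CONNECT_BOOTSTRAP_SERVERS: kafka:9092
    CONNECT_REST_PORT: 8083
    KAFKA_JMX_PORT: 9999
    KAFKA_JMX_HOSTNAME: connect
    # Point jconsole or a JMX exporter at connect:9999 to read
    # connector and task metrics (e.g. source-record-poll-rate).
```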
Preferred Qualifications
- Experience with cloud platforms (e.g., AWS).
- Knowledge of additional streaming technologies or data integration tools.
- Strong scripting skills (e.g., Python, Bash) for automation.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Database Administrator
Employment Type: Full time
Contact Details:
Company: Zensar
Location(s): Kolkata
Keyskills:
Computer science
Automation
Orchestration
JMX
Data processing
Data integrity
Oracle
Apache Kafka
Python
Scripting