Job Description
Design and deliver consumer-centric, high-performance systems. You will work with huge volumes of data arriving through batch and streaming platforms, and you will be responsible for building and delivering data pipelines that process, transform, integrate, and enrich data to meet business demands.
- Mentor the team on infrastructure, networking, data migration, monitoring, and troubleshooting
- Focus on automation using Infrastructure as Code (IaC), Jenkins, and DevOps practices
- Design, build, test and deploy streaming pipelines for data processing in real time and at scale
- Experience with stream-processing systems such as Storm, Spark Streaming, or Flink
- Experience with object-oriented/functional languages such as Scala or Java
- Develop software systems using test-driven development and CI/CD practices
- Partner with other engineers and team members to develop software that meets business needs
- Follow Agile methodology for software development and technical documentation
- Banking/finance domain knowledge is a plus
- Strong written and oral communication, presentation, and interpersonal skills
- Exceptional analytical, conceptual, and problem-solving abilities
- Able to prioritize and execute tasks in a high-pressure environment
- Experience working in a team-oriented, collaborative environment
Employment Category:
Employment Type: Full time
Industry: KPO
Functional Area: IT
Role Category: Software Developer
Role/Responsibilities: Big Data Developer
Contact Details:
Company: Change Leaders
Location(s): Delhi, NCR