Design, develop, test, deploy, and maintain scalable data pipelines using AWS services such as Glue, Step Functions, and Lambda.
Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
Develop Python scripts to automate tasks and workflows within the data pipeline.
Troubleshoot issues related to data processing and ensure high availability of the system.
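For illustration, the Python automation described above might take the shape of a small Lambda handler invoked within the pipeline. This is a minimal sketch only; the event payload shape (`records`) and field handling are assumptions, and a real handler would match its actual trigger (an S3 event, a Step Functions task input, etc.):

```python
import json


def lambda_handler(event, context):
    """Normalize incoming records and report how many were processed.

    The `records` key is a hypothetical payload shape used for
    illustration, not a fixed AWS event format.
    """
    records = event.get("records", [])
    # Normalize each record: trim and lowercase the keys so that
    # downstream steps see a consistent schema.
    cleaned = [
        {key.strip().lower(): value for key, value in record.items()}
        for record in records
    ]
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(cleaned)}),
    }
```

A handler like this would typically sit between a Glue job and a Step Functions state that routes records onward.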
Job Requirements:
5-10 years of experience in Cloud Engineering with expertise in the AWS ecosystem (Glue, Step Functions).
Strong understanding of DynamoDB and its applications in building scalable databases.
Proficiency in writing Python code for automating tasks and workflows.
Experience working on big data processing projects involving large datasets.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: IT & Information Security
Role Category: IT & Information Security - Other
Role: IT & Information Security - Other
Employment Type: Full time