Programming & Scripting: Strong programming skills in Python and Linux Bash for automation and data workflows.
Data Warehousing: Hands-on experience with Snowflake.
AWS Cloud Services: In-depth knowledge of AWS EC2, S3, RDS, and EMR for deploying and managing data solutions.
Framework Proficiency: Hands-on experience with Luigi for orchestrating complex data workflows.
Data Processing & Storage: Expertise in Hadoop ecosystem tools and managing SQL databases for data storage and query optimization.
Security Practices: Understanding of data security practices, data governance, and compliance for secure data processing.
Automation & CI/CD: Familiarity with CI/CD tools to support automation of deployment and testing.
Big Data Technologies: Knowledge of big data processing tools such as Spark, Hive, or related AWS services.
Communication: Strong verbal and written communication skills.
Regards
Rajan
You can also WhatsApp your CV to 9270558***.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time