5+ years of experience; notice period immediate to 30 days. (Work from office all 5 days, 9 hours per day.)
Experience with any modern ETL tool (PySpark, EMR, Glue, or similar).
Experience in AWS; programming knowledge in Python, Java, and Snowflake.
Experience in DBT and StreamSets (or similar tools such as Informatica or Talend), with prior data migration work.
Agile experience is required, with expertise in Version One or Jira.
Provide hands-on solutions to business challenges and translate them into process/technical solutions. Good knowledge of CI/CD and DevOps principles.
Experience in data technologies: Hadoop, PySpark, or Scala (any one).
Contact Person: Sheena Rakesh
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time