Applications are open to immediate joiners only.
Role & responsibilities
Strong programming skills in Python and SQL.
Experience with data processing using Python and Spark.
In-depth understanding of the Kafka platform for real-time ingestion and processing of high-volume data.
Ability to design and architect data flows and data management in cloud environments that are scalable and repeatable and that eliminate time-consuming manual steps.
Proficiency with version control systems such as Git, and in writing technical documentation (e.g., in a wiki).
Experience with infrastructure as code for robust deployment, replication, and uniform management.
Effective collaboration and communication skills to engage with cross-functional teams.
A proactive mindset with the ability to drive tasks to completion.
Proficiency in Agile/Scrum/Kanban methodologies for efficient product delivery and management.
Familiarity with AWS cloud services such as Glue and Athena.
Qualifications:
Bachelor's or Master's degree in Computer Science or Information Systems.
Proven experience in leading and managing teams.
9+ years of experience building data flows and data management on a modern big data tech stack.
Strong experience using ETL and orchestration frameworks (e.g., Airflow, Jenkins) to build and deploy production-quality ETL pipelines.
Knowledge of data structures.
Openness to learning and implementing new technologies.
Ability to approach and execute data engineering workstreams with a product mindset.
Saturam is a global Deep Tech company operated by experts in ML-led DataOps & Digital Transformation. We enable enterprises to automate and assemble continuous AI-Ready data for various ML and IoT use cases. REAL-TIME: Be it Predictive Maintenance, Hyper-Local or Personalization, we will archit...