Role & responsibilities
Design, build, and maintain scalable data pipelines using DBT and Airflow (a minimal orchestration sketch follows this list).
Develop and optimize SQL queries and data models in Snowflake.
Implement ETL/ELT workflows, ensuring data quality, performance, and reliability.
Work with Python for data processing, automation, and integration tasks.
Handle JSON data structures for data ingestion, transformation, and APIs.
Leverage AWS services (e.g., S3, Lambda, Glue, Redshift) for cloud-based data solutions.
Collaborate with data analysts, engineers, and business teams to deliver high-quality data products.
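
To make the orchestration responsibility concrete, here is a minimal sketch of an Airflow DAG that schedules a daily DBT run followed by DBT tests. It assumes Airflow 2.4+ (for the schedule parameter), that the dbt CLI is installed on the worker, and that the DAG name and project path (/opt/dbt) are hypothetical placeholders, not a prescribed layout.

    # Minimal Airflow 2.4+ DAG sketch: daily dbt run + test against Snowflake.
    # dag_id and /opt/dbt paths are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_pipeline",        # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
        )
        dbt_run >> dbt_test  # run models first, then validate with tests

Running tests as a downstream task keeps data-quality checks inside the same scheduled workflow, which is one common way to meet the reliability expectation above.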
Preferred candidate profile
Strong expertise in SQL, Snowflake, and DBT for data modeling and transformation.
Proficiency in Python and Airflow for workflow automation.
Experience working with AWS cloud services.
Ability to handle JSON data formats and integrate APIs (see the ingestion sketch after this list).
Strong problem-solving skills and experience in optimizing data pipelines.
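
As an illustration of the JSON/API and AWS skills listed above, here is a short Python sketch that pulls JSON from an API and stages it in S3 as newline-delimited JSON, a format Snowflake's COPY INTO can ingest directly. The endpoint URL, bucket name, and object key are hypothetical stand-ins; it also assumes AWS credentials are configured for boto3.

    # Sketch: pull JSON from a (hypothetical) API and stage it to S3
    # for downstream Snowflake ingestion.
    import json

    import boto3
    import requests

    API_URL = "https://api.example.com/orders"  # hypothetical endpoint
    BUCKET = "my-data-lake"                     # hypothetical bucket

    def stage_orders() -> None:
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        records = resp.json()
        # Newline-delimited JSON: one object per line, easy for COPY INTO to parse.
        body = "\n".join(json.dumps(record) for record in records)
        boto3.client("s3").put_object(
            Bucket=BUCKET,
            Key="raw/orders/orders.json",       # hypothetical object key
            Body=body.encode("utf-8"),
        )

    if __name__ == "__main__":
        stage_orders()

This is a sketch of the pattern, not a prescribed implementation; in practice the same staging step would typically be wrapped in an Airflow task and parameterized per source.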