We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting is an added advantage.
Key Responsibilities:
Design, develop, and optimize data pipelines and ETL processes for data warehousing projects.
Work extensively with Snowflake, ensuring efficient data modeling and query optimization.
Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration.
Implement data transformations, testing, and documentation using dbt.
Collaborate with cross-functional teams to ensure data accuracy, consistency, and security.
Troubleshoot and resolve data-related issues across pipelines and the warehouse.
(Optional) Utilize Python for scripting, automation, and data processing tasks.
Required Skills & Qualifications:
Experience in Data Warehousing with a strong understanding of best practices.
Hands-on experience with Snowflake (Data Modeling, Query Optimization).
Proficiency in Azure Data Factory (ADF) for data pipeline development.
Strong working knowledge of dbt (Data Build Tool) for data transformations.
(Optional) Experience in Python scripting for automation and data manipulation.
Good understanding of SQL and query optimization techniques.
Experience in cloud-based data solutions (Azure).
Strong problem-solving skills and ability to work in a fast-paced environment.
Experience with CI/CD pipelines for data engineering.
Why Join Us:
Opportunity to work on cutting-edge data engineering projects.
Work with a highly skilled and collaborative team.
Exposure to modern cloud-based data solutions.
Experience Required: Developer / Software Engineer: one to three years. Snowflake: one to three years.