Experience: 6-10 years
Salary Range: 15-30 LPA
Locations: Chennai, Kochi
We are looking for an experienced Data Engineer with strong expertise in AWS, Snowflake, and DBT to develop, optimize, and maintain cloud-based data pipelines. This role requires proficiency in building scalable data architectures, ensuring efficient data processing, and integrating modern data transformation practices to deliver high-quality data solutions.
Responsibilities:
Design and build end-to-end ETL/ELT data pipelines using AWS services, Snowflake, and DBT.
Integrate diverse data sources (e.g., relational, unstructured) into the Snowflake data warehouse.
Ensure high performance, scalability, and reliability of data processing workflows.
Design and implement Snowflake data models, tables, views, and schemas.
Leverage Snowflake features such as Streams, Tasks, Time Travel, and Zero-Copy Cloning for efficient data handling.
Optimize Snowflake performance (queries, storage, and compute resources) and monitor for cost-efficiency.
Develop data transformation models using DBT (Data Build Tool) to create clean, reliable datasets for analytics.
Implement version control, testing, and documentation best practices with DBT.
Collaborate with data analysts and scientists to ensure seamless data access and consistency.
Work with AWS services like S3, Redshift, Lambda, and Glue to facilitate data ingestion, processing, and orchestration.
Automate data workflows and ensure data pipelines are reliable, scalable, and error-free.
Implement data monitoring, logging, and alerting to ensure pipeline reliability.
Collaborate with cross-functional teams to understand data requirements and design solutions.
Document data models, transformation logic, and technical specifications for internal use.
Troubleshoot issues and provide performance optimization recommendations.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
6-10 years of experience in data engineering with strong hands-on experience in AWS, Snowflake, and DBT.
Solid understanding of Snowflake architecture and performance optimization.
Proficiency in DBT for building, testing, and maintaining data transformation workflows.
Strong experience with AWS services like S3, Lambda, Redshift, and Glue.
Advanced SQL skills and experience in query optimization.
Programming knowledge in Python or Java.
Certifications: AWS Certified Solutions Architect, Snowflake SnowPro, or DBT Fundamentals.
Experience with CI/CD for data pipelines (using tools like Git, Jenkins, Terraform).
Familiarity with data visualization tools (e.g., Power BI, Tableau).
Knowledge of data governance and security practices in cloud environments.
Key Skills: Snowflake, data engineering, DBT, data models, AWS
Hucon Solutions India Pvt. Ltd. is an integrated HR service provider for corporates across India. Backed by a robust ERP system and extensive experience in HR and related activities, the company has helped generate career opportunities for more than a million individuals in India.