You are a skilled and detail-oriented Data Engineer with 5-10 years of experience, responsible for designing, developing, and maintaining scalable data solutions. Your expertise lies in Snowflake, Azure Data Factory (ADF), SSIS, SQL, and visualization tools such as Power BI or Tableau. Your primary responsibilities include building robust data pipelines, integrating data sources, and supporting analytics initiatives.

Responsibilities:
- Design and implement scalable ETL/ELT pipelines using Azure Data Factory, SSIS, and other integration tools.
- Develop and optimize complex SQL queries and data models on Snowflake and other RDBMS platforms.
- Perform data ingestion, transformation, and loading into Snowflake from various structured and unstructured data sources.
- Collaborate with business analysts and data scientists to understand requirements and deliver data solutions for reporting and analytics.
- Build and publish interactive dashboards using Power BI or Tableau.
- Monitor, troubleshoot, and improve the performance of existing data pipelines and jobs.
- Implement data quality, validation, and governance processes.
- Participate in data architecture and design discussions to support scalability.

Requirements:
- 5-10 years of experience in Data Engineering, ETL development, and Data Warehousing.
- Strong hands-on experience with Snowflake, including performance tuning, data modeling, and optimization.
- Proficiency in Azure Data Factory (ADF) and SSIS for building and scheduling data pipelines.
- Expertise in writing complex and efficient SQL queries.
- Experience in data visualization using Power BI or Tableau.
- Solid understanding of data modeling concepts (dimensional and relational).
- Familiarity with cloud data platforms and data lake architecture.
- Strong problem-solving skills and the ability to work both independently and in a team.

Preferred Qualifications:
- Microsoft Azure or Snowflake certifications.
- Experience with CI/CD pipelines and version control (e.g., Git).
- Knowledge of Python or Spark for data processing (nice to have).
- Exposure to Agile/Scrum development methodologies.

A Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field is required.
Employment Category:
Employment Type: Full Time
Industry: IT Services & Consulting
Role Category: Not Specified
Functional Area: Not Specified
Role/Responsibilities: Data Engineer
Contact Details:
Company: First Career Centre
Location(s): Karnataka