
Data Engineer @ Haptiq Lab


 Data Engineer

Job Description


  • This position is for a Cloud Data Engineer with a background in Python, DBT, SQL, and data warehousing for enterprise-level systems.

  • Major Responsibilities:

  • Adhere to standard coding principles and practices.
  • Build and optimize data pipelines for efficient data ingestion, transformation, and loading from various sources while ensuring data quality and integrity.
  • Design, develop, and deploy Python scripts and ETL processes in an ADF (Azure Data Factory) environment to process and analyze varying volumes of data (a minimal PySpark sketch follows this list).
  • Experience with DWH, data integration, cloud, design, and data modeling.
  • Proficiency in developing programs in Python and SQL.
  • Experience with data warehouse dimensional data modeling.
  • Work with event-based/streaming technologies to ingest and process data.
  • Work with structured, semi-structured, and unstructured data.
  • Optimize ETL jobs for performance and scalability to handle big data workloads.
  • Monitor and troubleshoot ADF jobs; identify and resolve issues or bottlenecks.
  • Implement best practices for data management, security, and governance within the Databricks environment.
  • Experience designing and developing Enterprise Data Warehouse solutions.
  • Proficiency in writing SQL queries and programs, including stored procedures, and in reverse engineering existing processes.
  • Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.
  • Check in, check out, peer review, and merge PRs into the Git repo.
  • Knowledge of deploying packages and migrating code to stage and prod environments via CI/CD pipelines.
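The posting itself contains no code; purely as an illustration of the pipeline work described above, here is a minimal PySpark ETL sketch of the ingest-clean-load pattern. The paths, table names, and columns are hypothetical and not taken from the job description.

    # Minimal ETL sketch, assuming PySpark on Databricks; all paths, tables,
    # and columns below are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: ingest raw, semi-structured source data from a landing zone.
    raw = spark.read.json("/mnt/landing/orders/")

    # Transform: basic data-quality rules (drop rows missing keys, deduplicate),
    # normalize types, and stamp the load date.
    clean = (
        raw.dropna(subset=["order_id", "customer_id"])
           .dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .withColumn("load_date", F.current_date())
    )

    # Load: append to a warehouse table, partitioned for downstream queries.
    (clean.write
          .mode("append")
          .partitionBy("load_date")
          .saveAsTable("analytics.stg_orders"))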

  • Skills:
  • 3+ years of Python coding experience.
  • 5+ years of SQL Server-based development on large datasets.
  • 5+ years of experience developing and deploying ETL pipelines using Databricks PySpark.
  • Experience with any cloud data warehouse, such as Synapse, ADF, Redshift, or Snowflake.
  • Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling (see the sketch after this list).
  • Previous experience leading an enterprise-wide cloud data platform migration, with strong architectural and design skills.
  • Experience with cloud-based data architectures, messaging, and analytics.
  • Cloud certification(s).
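As a rough illustration of the dimensional modeling (facts and dimensions) listed above, the sketch below runs an OLAP-style rollup over a hypothetical star schema in PySpark; the table and column names are assumptions, not part of the posting.

    # Hypothetical star-schema rollup: join a fact table to its dimensions and
    # aggregate, the classic OLAP query shape over a dimensional model.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("star_schema_rollup").getOrCreate()

    fact_orders  = spark.table("analytics.fact_orders")    # grain: one row per order line
    dim_customer = spark.table("analytics.dim_customer")   # surrogate key: customer_sk
    dim_date     = spark.table("analytics.dim_date")       # surrogate key: date_sk

    monthly_revenue_by_region = (
        fact_orders
        .join(dim_customer, "customer_sk")
        .join(dim_date, "date_sk")
        .groupBy("region", "year", "month")
        .agg(F.sum("amount").alias("revenue"),
             F.countDistinct("order_id").alias("orders"))
    )

    monthly_revenue_by_region.show()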

  • Add-ons:
  • Any experience with Airflow, AWS Lambda, AWS Glue, and Step Functions is a plus (a minimal Airflow sketch follows).
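Since Airflow is named as a plus, a minimal DAG sketch follows; the DAG id, schedule, and task bodies are hypothetical and only show how a daily ETL run might be orchestrated (assumes Airflow 2.4+).

    # Minimal Airflow DAG sketch: three placeholder tasks chained into a daily
    # extract -> transform -> load run. All names here are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from the source systems")

    def transform():
        print("clean and model the data")

    def load():
        print("load the modeled data into the warehouse")

    with DAG(
        dag_id="daily_orders_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",        # `schedule` argument assumes Airflow 2.4+
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load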

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Data Science & Analytics
Role Category: Data Science & Analytics - Other
Role: Data Science & Analytics - Other
Employment Type: Full time

Contact Details:

Company: Haptiq Lab
Location(s): Pune



Keyskills: Data Engineering, ETL, Databricks, Python, SQL, PySpark, Redshift, Snowflake, Synapse Analytics, Data Warehousing, Data Modeling, ELT


Salary: ₹ Not Disclosed


Haptiq Lab

Honeybee Tech Solutions is a global talent solutions provider with a mission and passion to connect world-class talent with marquee global companies in ways that make work incredibly meaningful, valuable, and beneficial for everyone. At Honeybee Tech Solutions, we are at the intersection of tech...