
GCP Data Engineer @ TechnoGen


Job Description

Job Title: Data Engineer
Location: Hyderabad, India
Client: TechnoGen's Client, a Top Global Capability Center (GCC)
Contact: Interested candidates can share profiles at bh*******g@te***********a.com

About the Team

Join a dynamic and fast-paced environment with TechnoGen's Client, a top-tier Global Capability Center (GCC), where the Enterprise Data and Analytics (ED&A) Delivery Team is driving the organization's transformation into a data-driven powerhouse.

Key focus areas include:

  • Centralizing data from enterprise systems like ERP, E-Commerce, CRM, Order Management, etc., into a cloud-based data warehouse.
  • Designing robust ETL/ELT pipelines using cutting-edge tools and frameworks.
  • Delivering curated, business-ready datasets for strategic decision-making and self-service analytics.
  • Enforcing enterprise-wide data quality, governance, and testing standards.
  • Orchestrating workflows with tools like Airflow/Cloud Composer.
  • Collaborating with analysts, product owners, and BI developers to meet business goals through data.
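To make the focus areas above concrete, the step of turning raw system extracts into curated, business-ready datasets can be sketched in miniature in Python. This is a hypothetical illustration only: the field names, formats, and business rules are assumptions, not details of the client's actual ERP/CRM/E-Commerce systems.

```python
from datetime import datetime

def curate_orders(raw_orders):
    """Transform raw order records into an analytics-ready dataset.

    Illustrative sketch: field names, types, and cleanup rules are
    hypothetical, not taken from the actual source systems.
    """
    curated = []
    for rec in raw_orders:
        # Drop records that fail basic quality gates (missing key or amount).
        if not rec.get("order_id") or rec.get("amount") is None:
            continue
        curated.append({
            "order_id": str(rec["order_id"]).strip(),
            "order_date": datetime.strptime(rec["order_date"], "%Y-%m-%d").date(),
            "amount": round(float(rec["amount"]), 2),
            "channel": rec.get("channel", "unknown").lower(),
        })
    return curated

raw = [
    {"order_id": " 1001 ", "order_date": "2024-05-01", "amount": "49.990", "channel": "WEB"},
    {"order_id": None, "order_date": "2024-05-02", "amount": "10"},  # dropped: no id
]
print(curate_orders(raw))
```

In practice this kind of logic would live in DBT models or pipeline tasks rather than a standalone function, but the shape of the work is the same: validate, normalize, and emit a consistent schema for downstream reporting.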

Opportunity Overview

We are hiring a Data Engineer / ETL Developer to join our client's Technology & Innovation Center in Hyderabad, India. This role involves designing, developing, and maintaining scalable data pipelines that support enterprise analytics and decision-making. You will work with modern data tools like Google BigQuery, Python, DBT, SQL, and Cloud Composer (Airflow) to integrate diverse data sources into a centralized cloud data warehouse.

Key Responsibilities

  • Design and develop scalable data integration pipelines for structured/semi-structured data from systems like ERP, CRM, E-Commerce, and Order Management.
  • Build analytics-ready pipelines transforming raw data into curated datasets for reporting and insights.
  • Implement modular and reusable DBT-based transformation logic aligned with business needs.
  • Optimize BigQuery performance using best practices (partitioning, clustering, query tuning).
  • Automate workflows with Cloud Composer (Airflow) for reliable scheduling and dependency handling.
  • Write efficient Python and SQL code for ingestion, transformation, validation, and tuning.
  • Develop and maintain strong data quality checks and validation mechanisms.
  • Collaborate cross-functionally with analysts, BI developers, and product teams.
  • Utilize modern ETL platforms like Ascend.io, Databricks, Dataflow, or Fivetran.
  • Contribute to CI/CD processes, monitoring, and technical documentation.
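The data quality checks mentioned in the responsibilities above can be as simple as batch-level assertions on completeness, null rates, and key uniqueness. A minimal, framework-free Python sketch follows; the column names and the null-fraction threshold are assumptions for illustration, and a real pipeline would typically express such checks as DBT tests or Airflow task validations.

```python
def run_quality_checks(rows, required_columns, max_null_fraction=0.05):
    """Run simple validation checks over a batch of row dicts.

    Returns a dict mapping check name -> pass/fail. The threshold and
    column names are illustrative assumptions, not production values.
    """
    results = {}
    # Completeness: the batch must not be empty.
    results["non_empty"] = len(rows) > 0
    # Null-rate check per required column.
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        frac = nulls / len(rows) if rows else 1.0
        results[f"{col}_null_fraction_ok"] = frac <= max_null_fraction
    # Uniqueness check on the first required column, treated as the key.
    key = required_columns[0]
    keys = [r.get(key) for r in rows if r.get(key) is not None]
    results[f"{key}_unique"] = len(keys) == len(set(keys))
    return results

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": 2, "amount": 5.0},  # duplicate key
]
print(run_quality_checks(batch, ["order_id", "amount"]))
```

Wiring such checks into orchestration (e.g. failing an Airflow task when a check returns False) is what turns them from ad-hoc scripts into enforced pipeline gates.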

Desired Profile

  • Bachelor's or Master's in Computer Science, Data Engineering, Information Systems, or related fields.
  • 4+ years of hands-on experience in data engineering and analytics pipeline development.
  • Expertise in:
    • Google BigQuery
    • Python for scripting and integration
    • SQL for complex transformations and optimization
    • DBT for modular transformation pipelines
    • Airflow / Cloud Composer for orchestration
  • Solid understanding of ETL/ELT development, data architecture, and governance frameworks.

Preferred (Nice to Have)

  • Experience with Ascend.io, Databricks, Fivetran, or Dataflow.
  • Familiarity with tools like Collibra for data governance.
  • Exposure to CI/CD pipelines, Git-based workflows, and infrastructure automation.
  • Experience with event-driven streaming (Pub/Sub, Kafka).
  • Agile methodology experience (Scrum/Kanban).
  • Excellent communication and problem-solving skills.
  • Ability to troubleshoot architecture and design issues hands-on.
  • Quick learner with an innovative mindset.

Apply Now:
Interested candidates, please send your updated profile to bh*******g@te***********a.com

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: TechnoGen
Location(s): Hyderabad



Keyskills: Data Ingestion, GCP, Python, Real-time Streaming


Salary: Not Disclosed


TechnoGen

About TechnoGen India Pvt Ltd: Founded in 2003, headquartered in Chantilly, Virginia. CMMI Level 3 Global Certified IT Services and Consulting Company. We are Great Place to Work Certified. Visit us @ https://technogeninc.com/ TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ...