
Opportunity | GCP Data Engineer | Tavant India @ Tavant Technologies


Job Description

Dear candidate,


We found your profile suitable for our current opening. Please go through the job description below for a better understanding of the role.


Role: Technical Lead

Experience: 5-8 years

Mode of work: Hybrid (3 days work from office)

Work Location: Hyderabad / Bangalore / Noida / Pune / Kolkata

Role Overview:

We are seeking a highly experienced and motivated Senior Data Engineer / DataOps Engineer to join our growing team. This role is critical for designing, building, managing, and optimizing our data infrastructure and pipelines, primarily leveraging the Google Cloud Platform (GCP). The ideal candidate will have a deep understanding of GCP data services, extensive hands-on experience in managing complex data environments, and a passion for implementing robust, scalable, and efficient data solutions. You will play a key role in ensuring the reliability, performance, and operational excellence of our data platform.

Key Responsibilities:

  • Design, develop, deploy, and maintain scalable, resilient, and secure data pipelines (batch and streaming) on GCP using services like Cloud Dataflow, Cloud Composer, and Cloud Functions.
  • Manage and operate core data infrastructure components on GCP, including BigQuery, Bigtable, Google Kubernetes Engine (GKE) for data workloads, and related networking/security configurations.
  • Implement and champion DataOps principles, including CI/CD for data pipelines (using tools like Jenkins), automated testing, robust monitoring (Cloud Monitoring, Cloud Logging), and alerting strategies.
  • Optimize data storage, processing, and query performance across the GCP data stack, focusing on efficiency and cost management.
  • Troubleshoot and resolve complex issues related to data infrastructure, pipeline failures, and data quality.
  • Collaborate closely with data scientists, analysts, software engineers, and product managers to understand data requirements and deliver effective solutions.
  • Ensure data governance, data quality, and security best practices are implemented and adhered to within the GCP environment.
  • Stay current with the latest advancements in GCP data services and broader data engineering/DataOps trends.
  • Evaluate and leverage GCP's AI/ML capabilities where applicable, potentially contributing to intelligent monitoring or operational improvements.
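To give candidates a concrete feel for the batch pipeline work described above, here is a minimal, purely illustrative sketch of extract-transform-load logic in plain Python. It is not part of the posting: in practice such logic would run as a Cloud Dataflow (Apache Beam) job, and the field names, the in-memory source, and the sink are all hypothetical.

```python
# Hypothetical sketch of batch ETL logic of the kind a GCP data pipeline
# (e.g. a Cloud Dataflow job) would run. Field names and the in-memory
# source/sink are illustrative assumptions, not part of the posting.

def extract(records):
    """Extract: keep only well-formed rows (required fields present)."""
    return [r for r in records if "user_id" in r and "amount" in r]

def transform(records):
    """Transform: normalise currency amounts to integer cents."""
    return [
        {"user_id": r["user_id"], "amount_cents": int(round(r["amount"] * 100))}
        for r in records
    ]

def load(records, sink):
    """Load: append transformed rows to a sink (stand-in for BigQuery)."""
    sink.extend(records)
    return len(records)

if __name__ == "__main__":
    raw = [
        {"user_id": "u1", "amount": 12.5},
        {"user_id": "u2"},                 # malformed row: no amount, dropped
        {"user_id": "u3", "amount": 0.99},
    ]
    warehouse = []
    loaded = load(transform(extract(raw)), warehouse)
    print(loaded)  # 2 rows survive extraction
```

In a real Dataflow job the same three stages would become Beam transforms over a PCollection, with BigQuery as the sink; the control flow above is only the conceptual shape.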

Required Qualifications & Skills:

  • Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
  • 4-7 years of professional experience in data engineering, software engineering with a data focus, or a related infrastructure role.
  • Minimum of 4 years of significant hands-on experience designing, building, and operating data solutions specifically on Google Cloud Platform (GCP).
  • Proven, in-depth experience with core GCP data services:
    • Data Processing: Cloud Dataflow, Cloud Functions
    • Orchestration: Cloud Composer (or managed Airflow)
    • Storage/Warehousing: BigQuery, Bigtable, Google Cloud Storage (GCS)
    • Operations/Monitoring: Cloud Logging, Cloud Monitoring, Cloud Trace
    • Containerization (for data workloads): Google Kubernetes Engine (GKE)
  • Strong experience in managing data infrastructure, including provisioning, configuration, performance tuning, and cost optimization.
  • Proficiency in programming languages commonly used in data engineering, particularly Python and SQL.
  • Solid understanding of ETL/ELT processes, data modeling, data warehousing concepts, and database design.
  • Experience implementing monitoring, logging, and alerting for data pipelines and infrastructure.
  • Familiarity with infrastructure-as-code (IaC) tools (e.g., Terraform).
  • Experience with CI/CD tools and practices (e.g., Cloud Build, Jenkins, GitLab CI).
  • Understanding of containerization (Docker) and orchestration (Kubernetes).
  • Awareness of GCP's AI/ML offerings (e.g., Vertex AI, BigQuery ML).
  • Excellent problem-solving and troubleshooting skills.
  • Strong communication and collaboration abilities.
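The monitoring, alerting, and data-quality expectations listed above boil down to automated checks over pipeline output. As an illustration only, here is a self-contained Python sketch of a null-fraction quality gate; the threshold, field names, and alert format are assumptions, and a real deployment would emit the metric to Cloud Monitoring instead of returning a string.

```python
# Hypothetical data-quality check of the kind a DataOps engineer would wire
# into a pipeline's monitoring. Threshold and field names are illustrative
# assumptions, not part of the posting.

def null_fraction(rows, field):
    """Fraction of rows where `field` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(field) is None)
    return missing / len(rows)

def check_quality(rows, field, max_null_fraction=0.05):
    """Return (ok, message). An alert fires when too many nulls appear."""
    frac = null_fraction(rows, field)
    if frac > max_null_fraction:
        return False, f"ALERT: {field} null fraction {frac:.2%} exceeds {max_null_fraction:.2%}"
    return True, "ok"

if __name__ == "__main__":
    batch = [{"amount": 10}, {"amount": None}, {"amount": 7}, {"amount": 3}]
    ok, msg = check_quality(batch, "amount", max_null_fraction=0.10)
    print(ok, msg)  # 25% nulls exceeds the 10% threshold, so ok is False
```

Checks like this are typically run per batch inside the orchestrator (e.g. as a Cloud Composer task) so that a failing quality gate halts downstream loads.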

Please check the link below for organisation details: https://www.tavant.com/

If interested, please send your resume to da*********i@ta***t.com

Regards

Dasari Krishna Gowri

Associate Manager - HR

www.tavant.com

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: Tavant Technologies
Location(s): Noida, Gurugram



Key skills: GCP, BigQuery, Dataflow, Cloud Storage, Airflow, Kafka


Salary: Not Disclosed


Tavant Technologies

Tavant is a digital products and solutions company that provides impactful results to its customers across a wide range of industries such as Consumer Lending, Aftermarket, Media & Entertainment, and Retail in North America, Europe, and Asia-Pacific. Our solutions, powere...