Onix is Hiring GCP Data Warehousing Engineer @ Datametica

Job Description

We are seeking a highly skilled GCP Data Warehouse Engineer to join our data team. You will be responsible for designing, developing, and maintaining scalable and efficient data warehouse solutions on Google Cloud Platform (GCP). Your work will support analytics, reporting, and data science initiatives across the company.

Key Responsibilities:

  • Design, build, and maintain data warehouse solutions using BigQuery.
  • Develop robust and scalable ETL/ELT pipelines using Dataflow, Cloud Composer, or Cloud Functions (a minimal Composer DAG sketch follows this list).
  • Implement data modeling strategies (star schema, snowflake, etc.) to support reporting and analytics.
  • Ensure data quality, integrity, and security across all pipelines and storage.
  • Optimize BigQuery queries for performance and cost-efficiency through partitioning, clustering, and materialized views (see the second sketch below).
  • Collaborate with data scientists, analysts, and other engineers to deliver high-quality datasets and insights.
  • Monitor pipeline performance and troubleshoot issues using Cloud Monitoring, Logging, and alerting tools.
  • Automate deployment and infrastructure using Terraform, Cloud Build, and CI/CD pipelines.
  • Stay up to date with GCP's evolving services and suggest improvements to our data infrastructure.
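
To make the pipeline responsibility concrete, here is a minimal sketch of the kind of Cloud Composer (Airflow) DAG this role would own: a single daily ELT step that appends staged rows into the warehouse. It assumes an Airflow 2.x Composer environment with the Google provider installed; the DAG ID, schedule, and every project/dataset/table name are hypothetical.

    # Minimal Cloud Composer (Airflow) DAG sketch: one daily ELT step in BigQuery.
    # Assumes Airflow 2.x with apache-airflow-providers-google installed;
    # all project, dataset, and table names below are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_orders_elt",      # hypothetical DAG ID
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Append the previous day's rows from staging into the fact table;
        # {{ ds }} is Airflow's templated execution date (YYYY-MM-DD).
        load_orders = BigQueryInsertJobOperator(
            task_id="load_orders",
            configuration={
                "query": {
                    "query": """
                        INSERT INTO `my-project.analytics.fct_orders`
                        SELECT * FROM `my-project.staging.orders`
                        WHERE order_date = DATE '{{ ds }}'
                    """,
                    "useLegacySql": False,
                }
            },
        )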
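
And a second sketch for the query-optimization bullet: creating a date-partitioned, clustered star-schema fact table (the same hypothetical table the DAG above loads) plus a materialized view for a common aggregate, via the google-cloud-bigquery Python client. Again, all names are made up for illustration.

    # Sketch: star-schema fact table partitioned by date and clustered on
    # dimension keys, plus a materialized view for a daily revenue rollup.
    # Assumes google-cloud-bigquery and Application Default Credentials;
    # all names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project ID

    # Partitioning prunes scanned bytes by date; clustering co-locates rows
    # sharing the keys most often used in filters and joins.
    ddl_fact = """
    CREATE TABLE IF NOT EXISTS `my-project.analytics.fct_orders` (
      order_id     STRING,
      customer_key INT64,
      product_key  INT64,
      order_date   DATE,
      amount       NUMERIC
    )
    PARTITION BY order_date
    CLUSTER BY customer_key, product_key
    """

    # BigQuery keeps the materialized view incrementally up to date, so
    # dashboards read the aggregate without rescanning the fact table.
    ddl_mv = """
    CREATE MATERIALIZED VIEW IF NOT EXISTS `my-project.analytics.mv_daily_revenue` AS
    SELECT order_date, SUM(amount) AS revenue
    FROM `my-project.analytics.fct_orders`
    GROUP BY order_date
    """

    for ddl in (ddl_fact, ddl_mv):
        client.query(ddl).result()  # block until each DDL job completes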

Required Skills & Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience).
  • 3+ years of experience in data engineering or data warehousing roles.
  • Hands-on experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
  • Proficiency in SQL and Python (or Java/Scala).
  • Strong understanding of data modeling, data warehousing concepts, and distributed systems.
  • Experience with Cloud Composer (Airflow), version control (Git), and agile development.
  • Familiarity with IAM, VPC Service Controls, and other GCP security best practices.

Preferred Qualifications:

  • Google Cloud Professional Data Engineer certification.
  • Experience with Looker, Dataform, or similar BI/data modeling tools.
  • Experience working with real-time data pipelines or streaming data.
  • Knowledge of DevOps practices and infrastructure-as-code.

Why Join Us?

  • Work on cutting-edge cloud data architecture at scale.
  • Join a collaborative and fast-paced engineering culture.
  • Competitive salary, flexible work options, and career growth opportunities.
  • Access to learning resources, GCP credits, and certifications.

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: Datametica
Location(s): Hyderabad



Keyskills: GCP, Data Warehousing, SQL, DW, BigQuery, Dataflow, Dataproc

Salary: ₹ Not Disclosed

Datametica

DataMetica is a leader in Big Data architecture, Advanced Analytics, and Big Data Operations, focused on serving large global companies. We provide fast and reliable integration of Hadoop and related technologies into enterprise operations. Our team comprises highly experienced Hadoop, NoSQL...