
GCP Hadoop Developer/Lead @ Onix



Job Description

Job Summary

We are seeking a highly skilled Hadoop Developer / Lead Data Engineer to join our data engineering team based in Bangalore or Pune. The ideal candidate will have extensive experience with Hadoop ecosystem technologies and cloud-based big data platforms, particularly on Google Cloud Platform (GCP). This role involves designing, developing, and maintaining scalable data ingestion, processing, and transformation frameworks to support enterprise data needs.


Minimum Qualifications

  • Bachelor's degree in Computer Science, Computer Information Systems, or a related technical field.
  • 5-10 years of experience in software engineering or data engineering, with a strong focus on big data technologies.
  • Proven experience applying software development life cycle (SDLC) practices in enterprise environments.

Technical Skills & Expertise

  • Big Data Technologies:
    • Expertise in Hadoop platform, Hive, and related ecosystem tools.
    • Strong experience with Apache Spark (using SQL, Scala, and/or Java).
    • Experience with real-time data streaming using Kafka.
  • Programming Languages & Frameworks:
    • Proficient in PySpark and SQL for data processing and transformation.
    • Strong coding skills in Python.
  • Cloud Technologies (Google Cloud Platform):
    • Experience with BigQuery for data warehousing and analytics.
    • Familiarity with Cloud Composer (Airflow) for workflow orchestration.
    • Hands-on experience with Dataproc for managed Spark and Hadoop clusters.
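As an illustration of the kind of Spark-style transformation this role involves, here is a minimal sketch in plain Python (no Spark cluster required): it mimics a filter/group/count pass over event records of the sort that might arrive from a Kafka topic. All field names and sample data are hypothetical, not taken from the posting.

```python
from collections import defaultdict

# Hypothetical raw events, as they might be consumed from a Kafka topic.
# The schema (user/action/ms) is illustrative only.
events = [
    {"user": "a", "action": "click", "ms": 120},
    {"user": "b", "action": "view",  "ms": 300},
    {"user": "a", "action": "click", "ms": 80},
]

def count_clicks_per_user(records):
    """Count click events per user -- the same shape of logic one would
    express in Spark as filter + groupBy + count."""
    counts = defaultdict(int)
    for rec in records:
        if rec["action"] == "click":
            counts[rec["user"]] += 1
    return dict(counts)

print(count_clicks_per_user(events))  # {'a': 2}
```

On a real Dataproc cluster the same logic would be a few lines of PySpark DataFrame code; the plain-Python version above just shows the transformation shape.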

Responsibilities

  • Design, develop, and implement scalable data ingestion and transformation pipelines using Hadoop and GCP services.
  • Build real-time and batch data processing solutions leveraging Spark, Kafka, and related technologies.
  • Ensure data quality, governance, and lineage by implementing automated validation and classification frameworks.
  • Collaborate with cross-functional teams to deploy and operationalize data analytics tools at enterprise scale.
  • Participate in production support and on-call rotations to maintain system reliability.
  • Follow established SDLC practices to deliver high-quality, maintainable solutions.
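The data-quality responsibility above can be sketched as a small per-record validation pass. The rule names, record shape, and helper below are hypothetical assumptions for illustration, not part of any framework named in the posting.

```python
def validate(record, required, non_null):
    """Return a list of rule violations for one record.
    `required`: fields that must be present in the record;
    `non_null`: fields that, when present, must also carry a value."""
    errors = []
    for field in required:
        if field not in record:
            errors.append(f"missing field: {field}")
        elif field in non_null and record[field] in (None, ""):
            errors.append(f"null value: {field}")
    return errors

# Hypothetical record checked before loading downstream (e.g. to BigQuery).
row = {"id": 1, "email": None}
print(validate(row, required=["id", "email", "ts"], non_null=["email"]))
# ['null value: email', 'missing field: ts']
```

A production framework would typically run such checks in bulk inside the pipeline and route failing records to a quarantine table rather than raising inline.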

Preferred Qualifications

  • Experience leading or mentoring data engineering teams.
  • Familiarity with CI/CD pipelines and DevOps best practices for big data environments.
  • Strong communication skills with an ability to collaborate across teams.

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Big Data Engineer
Employment Type: Full time

Contact Details:

Company: Onix
Location(s): Pune



Keyskills: PySpark, BigQuery, Hadoop, GCP, Python, Airflow, Kafka, Dataproc, SQL, Hive, Cloud Composer, Spark, Google Cloud Platform


Salary: Not Disclosed

