
Principal Consultant - Big Data Admin @ Genpact


Job Description

Please find below the detailed job description for the Big Data Administrator role.

Key Responsibilities:

  • Lead CDP platform upgrades and migrations, with strong hands-on execution and documentation from planning to go-live.
  • Administer and tune Hadoop ecosystem services:
    • Core: HDFS, YARN, Hive, Hue, Impala, Sqoop, Oozie
    • Streaming: Apache Kafka (broker/topic ops), Apache Flink (streaming jobs)
    • NoSQL/Query: HBase, Phoenix
    • Security: Kerberos, Ranger, LDAP, TLS
  • Manage Cribl Stream deployments: build, configure, secure, and optimize data routing pipelines.
  • Monitor and optimize platform performance using Cloudera Manager, New Relic, BigPanda, Prometheus, Grafana, or other observability tools.
  • Design and implement backup, recovery, HA, and DR strategies for critical data infrastructure.
  • Automate platform operations using Python, Bash/Shell, Scala, and CI/CD workflows.
  • Work cross-functionally with Data Engineers, DevOps, InfoSec, and Cloud Engineering teams to support data pipeline reliability and scalability.
  • Manage deployments using Docker, Kubernetes, Jenkins, Bitbucket, and optionally Ansible or GitOps practices.
  • Support and maintain cloud-native or hybrid deployments, especially in GCP (Anthos) environments.
  • Produce and maintain robust architecture documentation, runbooks, and operational SOPs.
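The automation responsibility above (Python, Bash/Shell, CI/CD) typically includes small operational utilities around standard tooling. A minimal sketch, assuming the plain-text output format of `hdfs dfsadmin -report` (the function name and threshold are illustrative, not from the posting):

```python
import re

def parse_dfs_usage(report: str) -> float:
    """Extract the cluster-wide 'DFS Used%' figure from
    `hdfs dfsadmin -report` text output, as a float percentage."""
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if match is None:
        raise ValueError("DFS Used% not found in report output")
    return float(match.group(1))

# Example report fragment in the format printed by Hadoop 3.x:
sample = """Configured Capacity: 1099511627776 (1 TB)
DFS Used: 219902325555 (204.8 GB)
DFS Used%: 20.00%
"""

# A capacity alert hook like this would normally feed a monitoring tool.
if parse_dfs_usage(sample) > 85.0:
    print("ALERT: HDFS capacity above threshold")
```

In practice such a check would run against live `hdfs dfsadmin -report` output on a schedule and page via the observability stack mentioned above.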

Required Qualifications:

  • 7+ years of experience in Big Data infrastructure, administration, and operations.
  • Proven Cloudera CDP (7.x) experience, including production-grade migrations (7.1.6 to 7.1.9+).
  • Deep expertise in:
    • Apache Spark job tuning, executor/resource optimization
    • Apache Kafka security (SASL_SSL, GSSAPI), scaling, topic lifecycle management
    • Apache Flink real-time stream processing in HA environments
    • Cribl Stream full-lifecycle management and observability integration
    • HBase & Phoenix schema evolution, read/write tuning, replication
  • Scripting & Automation: Proficient in Python, Shell (Bash), and optionally Scala
  • Security-first mindset: Working knowledge of Kerberos, Ranger policies, LDAP integration, and TLS configuration.
  • DevOps Experience: Hands-on with Docker, Kubernetes, Jenkins, Bitbucket, and monitoring tools like Grafana/Prometheus.
  • Comfortable supporting large-scale, multi-tenant environments and production on-call rotations.
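The Kafka security expertise listed above (SASL_SSL with GSSAPI) usually comes down to client properties along these lines; a sketch only, with placeholder truststore, keytab, and principal values:

```properties
# Kerberos-authenticated, TLS-encrypted Kafka client (illustrative values)
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
ssl.truststore.location=/opt/security/kafka.client.truststore.jks
ssl.truststore.password=changeit
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
    useKeyTab=true \
    keyTab="/etc/security/keytabs/app.keytab" \
    principal="app@EXAMPLE.COM";
```

The same properties apply to producers, consumers, and admin clients; Ranger policies then authorize the authenticated principal per topic.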

Preferred Qualifications:

  • Cloudera Certified Administrator (CCA) or equivalent industry certification.
  • Experience with Big Data on-prem, cloud, and hybrid data infrastructure, particularly Google Cloud Platform (GCP) and Anthos clusters.

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Big Data Engineer
Employment Type: Full time

Contact Details:

Company: Genpact
Location(s): Hyderabad



Keyskills: Hadoop Administration, Cribl, Big Data Administration


Salary: ₹ 15-27.5 Lacs P.A.

