GCP Architect - ETL @ Datametica

Job Description

Location: Hyderabad

Experience: 8+ Years

Immediate Joiners Preferred


We at Datametica Solutions Private Limited are looking for a GCP Data Architect with a passion for the cloud and hands-on working experience with the GCP platform. This role involves understanding business requirements, analyzing technical options, and providing end-to-end cloud-based ETL solutions.

Required Past Experience:

  • 10+ years of overall experience in architecting, developing, testing, and implementing Big Data projects using GCP components (e.g., BigQuery, Composer, Dataflow, Dataproc, DLP, Bigtable, Pub/Sub, Cloud Functions, etc.).
  • Experience with and understanding of ETL using Ab Initio.
  • Minimum 4+ years of experience with data management strategy formulation, architectural blueprinting, and effort estimation.
  • Cloud capacity planning and cost-based analysis.
  • Experience working with large datasets and solving difficult analytical problems.
  • Regulatory and compliance work in data management.
  • Ability to tackle design and architectural challenges such as performance, scalability, and reusability.
  • Advocacy of engineering and design best practices, including design patterns, code reviews, and automation (e.g., CI/CD, test automation).
  • End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
  • Working with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
  • Fundamentals of Kafka and Pub/Sub for handling real-time data feeds.
  • Good understanding of data pipeline design and data governance concepts.
  • Experience in code deployment from lower environments to production.
  • Good communication skills to understand business requirements.

Required Skills and Abilities:

  • Mandatory Skills - BigQuery, Composer, Python, GCP fundamentals.
  • Secondary Skills - Dataproc, Kubernetes, DLP, Pub/Sub, Dataflow, shell scripting, SQL, security (platform & data) concepts.
  • Expertise in data modeling.
  • Detailed knowledge of Data Lake and Enterprise Data Warehouse principles.
  • Expertise in ETL migration from on-premises to GCP Cloud.
  • Familiarity with Hadoop ecosystems, HBase, Hive, Spark, or emerging data mesh patterns.
  • Ability to communicate with customers, developers, and other stakeholders.
  • Good To Have - Certification in any of the following: GCP Professional Cloud Architect, GCP Professional Data Engineer.
  • Ability to mentor and guide team members.
  • Good presentation skills.
  • Strong team player.

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Data warehouse Architect / Consultant
Employment Type: Full time

Contact Details:

Company: Datametica
Location(s): Hyderabad


Keyskills: GCP, BigQuery, ETL, Ab Initio, Dataflow, Dataproc

Salary: ₹ Not Disclosed

Datametica

DataMetica is a leader in Big Data architecture, Advanced Analytics, and Big Data Operations, focused on serving large global companies. We provide fast and reliable integration of Hadoop and related technologies into enterprise operations. Our team comprises highly experienced Hadoop, noSQL...