
Data Lake Engineer, AVP @ Deutsche Bank


Job Description

Role Description

  • As a Data Engineer, you will work closely with the Data Lake Lead Engineer, the Data Lake Solution Architect, and the engineering team to design, develop, and maintain innovative solutions within a scalable GCP cloud environment.
  • You will design, build, and maintain the CSO Data Lake's data infrastructure, ensuring accurate and timely data is accessible for analysis and applications, often working closely with architects and business analysts. You are highly technical, understand cloud technologies, and understand the complexities of cloud ecosystems and integrations.

Your key responsibilities

  • Cloud Data Infrastructure Development:
    • Design, build, and maintain data storage systems.
    • Develop and implement efficient data pipelines for data ingestion, transformation, and loading (a minimal pipeline sketch follows this list).
    • Ensure data is clean, consistent, and accessible for various users and applications.
    • Collaborate with architecture teams to implement cloud-based solutions aligned with business objectives.
  • Data Quality and Reliability:
    • Implement methods to improve data reliability and quality.
    • Combine raw data from different sources into consistent and machine-readable formats.
    • Develop and test architectures that enable data extraction and transformation for predictive or prescriptive modelling.
    • Develop and maintain automation scripts for various environments, ensuring smooth and efficient deployment processes.
    • Use Python and other scripting languages to automate workflows, integrate APIs, and manage infrastructure as code (IaC).
  • Collaboration and Communication:
    • Work closely with data scientists, business analysts, and other stakeholders to understand their data needs.
    • Communicate technical information clearly and effectively to both technical and non-technical audiences.
  • Data Management and Optimization:
    • Manage and optimize data storage, retrieval, and processing.
    • Monitor data pipelines and systems for performance and identify areas for improvement.
    • Develop and implement solutions to address data quality issues and performance bottlenecks.
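
To give a concrete flavour of the ingestion, transformation, and loading work described above, the following is a minimal Python sketch of a single pipeline step on GCP: loading a curated CSV file from Cloud Storage into a BigQuery table using the google-cloud-bigquery client. The project, bucket, and table names are hypothetical placeholders, and a production pipeline would add schema management, data validation, and orchestration around a step like this.

    # Minimal illustrative sketch: load a curated CSV from Cloud Storage into BigQuery.
    # The project, bucket, and table identifiers below are hypothetical placeholders.
    from google.cloud import bigquery

    def load_csv_to_bigquery(
        project_id: str = "example-project",
        source_uri: str = "gs://example-bucket/curated/trades.csv",
        table_id: str = "example-project.cso_lake.trades",
    ) -> None:
        client = bigquery.Client(project=project_id)
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,                 # skip the CSV header row
            autodetect=True,                     # infer the schema for this sketch
            write_disposition="WRITE_TRUNCATE",  # replace the table on each run
        )
        # Start the load job and block until it completes.
        load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
        load_job.result()
        table = client.get_table(table_id)
        print(f"Loaded {table.num_rows} rows into {table_id}")

    if __name__ == "__main__":
        load_csv_to_bigquery()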

Change & Operations Management:

  • Contribute to change management processes in live environments, ensuring minimal disruption to service and compliance with the Bank's policies.
  • Automate routine tasks such as deployments, scaling, and monitoring to improve operational efficiency (see the automation sketch after this list).
  • Review and provide feedback on resource consumption, ensuring it remains within agreed budgets.
  • Participate in on-call rotations to provide timely response and resolution to critical incidents.
  • Take ownership of deliverables, troubleshoot, and resolve issues. Establish product support procedures and serve as the final L3 engineering escalation point.
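
As one illustration of automating a routine deployment task, the sketch below wraps a Terraform plan-and-apply cycle in a small Python script. The working directory and workspace name are hypothetical, and in a live environment the apply step would sit behind the change management and approval controls described above.

    # Illustrative automation sketch: drive a Terraform plan/apply cycle from Python.
    # The working directory and workspace name are hypothetical placeholders.
    import subprocess

    def run(cmd: list[str], workdir: str) -> None:
        """Run a command in the IaC working directory, failing fast on errors."""
        print("Running:", " ".join(cmd))
        subprocess.run(cmd, cwd=workdir, check=True)

    def deploy(workdir: str = "./infra/gcp-data-lake", workspace: str = "dev") -> None:
        run(["terraform", "init", "-input=false"], workdir)
        run(["terraform", "workspace", "select", workspace], workdir)
        run(["terraform", "plan", "-input=false", "-out=tfplan"], workdir)
        # In production this step would normally require a reviewed change record.
        run(["terraform", "apply", "-input=false", "tfplan"], workdir)

    if __name__ == "__main__":
        deploy()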

Your skills and experience

Technical Skills:

  • Proficiency in programming languages like Python, SQL, or Scala (see the query sketch after this list).
  • Experience with database technologies (e.g., relational databases, NoSQL databases).
  • Knowledge of data warehousing and data lake technologies.
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP).
  • Experience with data pipelines and ETL (Extract, Transform, Load) tools.
  • In-depth understanding of GCP services and capabilities, including virtual machines, storage, networking, security, and monitoring.
  • Strong expertise in infrastructure as code (IaC) and automation using tools such as Terraform, CloudFormation, or similar.
  • Familiarity with DevOps practices and tools such as GitLab or Jenkins.
  • Knowledge of security technologies, with a strong understanding of current security threats and the corresponding countermeasures.
  • Experience with business tools including Jira, Confluence, SharePoint, and Microsoft 365.
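
For the SQL and data-warehousing skills above, a typical day-to-day task looks like the parameterized BigQuery query sketched below, which standardizes raw staging records into a consistent, machine-readable shape. The dataset, table, and column names are invented for illustration only.

    # Illustrative sketch: a parameterized BigQuery query that standardizes raw records.
    # Dataset, table, and column names are invented for illustration.
    import datetime
    from google.cloud import bigquery

    QUERY = """
        SELECT
          UPPER(TRIM(source_system))       AS source_system,
          SAFE_CAST(event_ts AS TIMESTAMP) AS event_ts,
          SAFE_CAST(amount AS NUMERIC)     AS amount
        FROM `example-project.staging.raw_events`
        WHERE DATE(SAFE_CAST(event_ts AS TIMESTAMP)) = @run_date
    """

    def extract_clean_events(run_date: datetime.date = datetime.date(2024, 1, 1)) -> None:
        client = bigquery.Client()
        job_config = bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", run_date)]
        )
        for row in client.query(QUERY, job_config=job_config).result():
            print(row.source_system, row.event_ts, row.amount)

    if __name__ == "__main__":
        extract_clean_events()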

Soft Skills:

  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration skills.
  • Ability to work independently and as part of a team.

Education:

  • A bachelor's degree in computer science, information technology, or a related field is typically required.
  • Data engineering certifications, such as IBM Certified Data Engineer or a Google Cloud Professional certification, are a plus.

Job Classification

Industry: Investment Banking / Venture Capital / Private Equity
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: Deutsche Bank
Location(s): Pune



Keyskills: Data Engineering, Change Management, Azure, GCP, Scala, Data Management, AWS, Python, SQL, Operations Management


Salary: ₹ Not Disclosed
