
Module Lead - Data Fabric @ Idfy


 Module Lead - Data Fabric

Job Description

  • As a Module Lead in the Data Fabric POD, you would be responsible for producing and implementing functional software solutions
  • You will work with upper management to define software requirements and take the lead on operational and technical projects
  • You would be working with a data management and science platform that provides Data as a Service (DaaS) and Insights as a Service (IaaS) to internal employees and external stakeholders
  • You are eager to learn, technology-agnostic, and love working with data and drawing insights from it
  • You have excellent organization and problem-solving skills and are looking to build the tools of the future
  • You have exceptional communication and leadership skills and the ability to make quick decisions
Educational Qualifications: B.Tech/B.E. in Computers
 
Your job description

  1. Breaking down work and orchestrating the development of components for each sprint.
  2. Identifying risks and forming contingency plans to mitigate them.
  3. Liaising with team members, management, and clients to ensure projects are completed to standard.
  4. Inventing new approaches to detecting existing fraud. You will also stay ahead of the game by predicting future fraud techniques and building solutions to prevent them.
  5. Developing Zero Defect Software that is secure, instrumented, and resilient.
  6. Creating design artifacts before implementation.
  7. Developing Test Cases before or in parallel with implementation.
  8. Ensuring software developed passes static code analysis, performance, and load test.
  9. Developing various kinds of components (such as UI Components, APIs, Business Components, Image Processing, etc.) that define the IDfy Platforms which drive cutting-edge Fraud Detection and Analytics.
  10. Developing software using Agile Methodology and tools that support the same.
Skills Required: Airflow, ETL, ETL pipeline design, Spark, Hadoop, Hive, System Architecture
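
As a rough illustration of the ETL pipeline design and Spark/Hive skills listed above (this is a generic sketch; the Hive tables and columns below are hypothetical, not IDfy's actual data model), a minimal PySpark ETL job could look like this:

    # Minimal PySpark ETL sketch; illustrative only, table and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("verification-events-etl")
        .enableHiveSupport()  # allows reading and writing Hive tables
        .getOrCreate()
    )

    # Extract: read raw events from a Hive table
    raw = spark.table("raw.verification_events")

    # Transform: keep completed checks and aggregate per client per day
    daily = (
        raw.filter(F.col("status") == "COMPLETED")
           .groupBy("client_id", F.to_date("created_at").alias("event_date"))
           .agg(F.count("*").alias("checks_completed"))
    )

    # Load: write the aggregate to a curated Hive table, partitioned by date
    daily.write.mode("overwrite").partitionBy("event_date").saveAsTable("curated.daily_checks")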
 
Requirements:
  1. Know-how of Apache Beam, ClickHouse, Grafana, InfluxDB, Elixir, BigQuery, Logstash.
  2. An understanding of Product Development Methodologies.
  3. Strong understanding of relational databases, especially SQL, and hands-on experience with OLAP.
  4. Experience in creating data ingestion pipelines and ETL (Extract, Transform & Load) pipelines (Apache Beam or Apache Airflow experience is good to have; a minimal Airflow sketch follows this list).
  5. Strong design skills in defining API Data Contracts / OOAD / Microservices / Data Models.
  6. Experience with Time Series DBs (we use InfluxDB) and Alerting / Anomaly Detection Frameworks.
  7. Visualization layers: Metabase, Power BI, Tableau.
  8. Experience in developing software in the Cloud such as GCP / AWS.
  9. A passion for exploring new technologies and expressing yourself through technical blogs.
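
Requirement 4 above mentions Apache Beam or Apache Airflow experience for ETL pipelines. As a rough illustration of what an Airflow-orchestrated ETL job could look like (a generic sketch, not IDfy's actual pipeline; task names and logic are placeholders):

    # Minimal Airflow DAG sketch; illustrative only, task logic is a placeholder.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Pull raw records from the source system (placeholder)
        print("extracting raw records")


    def transform(**context):
        # Clean and reshape the extracted records (placeholder)
        print("transforming records")


    def load(**context):
        # Write the transformed records to the warehouse (placeholder)
        print("loading into the warehouse")


    with DAG(
        dag_id="daily_etl_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Run extract, then transform, then load
        t_extract >> t_transform >> t_load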

Job Classification

Industry: Recruitment / Staffing
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Technical Lead
Employment Type: Full time

Contact Details:

Company: Idfy
Location(s): Mumbai



Keyskills: System architecture, Data management, Image processing, OOAD, OLAP, Agile methodology, Test cases, Apache, Analytics, SQL


Salary: ₹ Not Disclosed

Similar positions

Tech Lead

  • Cognizant
  • 5 - 9 years
  • Hyderabad
  • 2 days ago
₹ Not Disclosed

React JS Developer -UI Developer (6month Contract)

  • Accenture
  • 5 - 8 years
  • Bengaluru
  • 2 days ago
₹ 9-12 Lacs P.A.

Application Developer-Cloud FullStack

  • IBM
  • 3 - 5 years
  • Pune
  • 2 days ago
₹ Not Disclosed

Lead Software Engineer - React, Node.js, Java

  • JPMorgan Chase Bank
  • 0 - 7 years
  • Bengaluru
  • 2 days ago
₹ Not Disclosed
