Responsibilities:
- Develop and deliver production-grade Talend Big Data jobs while ensuring alignment with IT architecture standards.
- Collaborate closely with data modelers and system analysts to define data interfaces and requirement specifications.
- Conduct technical sessions and clarifications with data modelers and sub-system teams before solution design, coding, and unit testing.
- Document and review the design of Talend Big Data jobs and interface specifications, implement error and exception handling, and import jobs into the Talend Data Catalog.
- Integrate Talend jobs with Autosys and restart failed jobs from Autosys.
- Perform technical impact assessments, source code releases, and deployment checklists, and validate conformance to required specifications.
- Self-learn and pick up application setup and support from vendors.
- Collaborate with peers and vendors to develop, set up, and support the IFRS17 application and data integration.
- Support System Integration Testing (SIT) and User Acceptance Testing (UAT) activities, including defect analysis, troubleshooting, and fixing.
- Coordinate and support Performance and Security Testing activities, including environment setup and test scope.
- Work closely with the infrastructure team on deployment and related activities, and provide enhancements and production support after project go-live.

Requirements:
- Minimum 4 years of experience in Data Warehousing.
- 3+ years of experience as a Hadoop (Hortonworks) developer focusing on Spark and Hive, especially Spark version 2.
- 3-5 years of experience deploying code to production in Talend Big Data version 7 or higher.
- At least 2 years of experience working with Talend Big Data version 7 on Hadoop.
- Experience designing Talend job orchestration through enterprise workload automation tools such as Control-M (Autosys preferred) is desired.
- Working knowledge of the Hortonworks Data Platform for data ingestion frameworks from various source systems (e.g., AS400, Oracle Finance, MS SQL).
- Development experience with Java, PL/SQL, SQL, Python, and Scala.
- Solid understanding of data models and data flows in dimensional and relational databases, including stored procedures, constraints, normalization, indexes, and security.
- Familiarity with the insurance and financial reporting domain is advantageous.
Employment Category:
Employment Type: Full time
Industry: IT Services & Consulting
Role Category: Software / General IT
Functional Area: Not Specified
Role/Responsibilities: Talend Big Data Developer
Contact Details:
Company: Helius Technologies
Location(s): All India