Desired Candidate Profile
The Information Management team is responsible for delivering data solutions that support all lines of business across the organization. This includes providing data integration services for all batch data movement; managing and enhancing the data warehouse, Data Lake and dependent data marts; and providing support for analytics and business intelligence consumers.
We're looking for a Principal Data Engineer with hands-on experience and intimate knowledge of OLAP concepts and big data. This key role will be based out of our Global Development Center in Bangalore, India, and will be responsible for the successful delivery of multiple data warehousing initiatives within one data platform. This is a unique career opportunity for a highly experienced engineer with broad-based data skills spanning data warehouses, data lakes, and real-time and batch data integrations. The successful candidate should have hands-on implementation experience with Big Data technologies, event processing frameworks and ETL tools.
Responsibilities:
- Design secure, robust, fault-tolerant and highly scalable data management systems and solutions
- Develop and unit test data solutions, data integrations, data services
- Use agile engineering practices and various data development technologies to rapidly develop creative and efficient data products
- Design highly automated processes and data flows
- Identify inefficiencies, optimize processes and data flows and make recommendations for improvements
- Align and integrate well with architects, data analysts, data modellers and other stakeholders
- Communicate with developers across teams, both for ad hoc problem solving and for check-ins and discussions with other initiatives
- Support non-technical team members in understanding the technical implications of design decisions
- Manage deliverables of junior developers
Qualifications:
- Core Technical Requirements of the role
- Minimum 10 years of data engineering experience, with knowledge of Agile development processes
- Experience developing large-scale MDM, data warehousing, data lake and data integration projects
- Experience in creating and maintaining ETL processes and architecting complex data pipelines
- Experience designing and developing complex multi-tiered applications with Service-Oriented Architecture (SOA)
- Hands-on implementation experience with Big Data technologies (e.g. Hadoop or equivalent)
- Hands-on experience with Kafka or other data streaming platforms
- Extensive knowledge of data warehouse principles, design and concepts
- Expert in SQL
- Adept in tracing and resolving data integrity issues
- Experience managing offshore development teams
- Experience in performance monitoring and performance tuning
- Experience in the financial domain is a plus
Keyskills:
big data
data modeling
data integration
business intelligence
data processing
data warehousing
Spark