Job Description
Join us as a skilled and highly motivated Data Engineer to design, build, and maintain scalable, efficient data pipelines while collaborating with cross-functional teams to deliver high-quality, actionable regulatory data solutions and insightful business analytics
You may be assessed on the key critical skills relevant for success in the role, such as strong hands-on experience with a modern data stack: Python, SQL, dbt, AWS (including S3 storage), and Databricks or Snowflake
To be successful as a Data Engineer, you should have experience with:
Building scalable and efficient ETL/ELT pipelines to ingest, transform, and load data for regulatory reporting and analytical use cases (a minimal illustrative sketch follows this list)
Tuning and optimizing data pipelines, SQL queries, and data models for performance and scalability
Proficiency in Python for data manipulation and integration, including experience with Pandas or PySpark
Strong SQL skills for writing complex queries, and experience with database platforms
Hands-on experience building and managing dbt models, using dbt Cloud or dbt Core
Experience building and managing object storage
Experience with AWS services for data engineering (e.g. S3, Glue, Lambda, Redshift)
Using Starburst, Databricks, or Snowflake for big data processing and advanced analytics
Partnering with project managers, data analysts, data scientists, and business stakeholders to deliver end-to-end data solutions
Developing monitoring and validation routines to ensure data accuracy, consistency, and security
Working closely with data architects and data modellers/analysts to design and implement data models that support business intelligence and analytics
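The items above are skill requirements rather than a specification, but as a rough illustration of the kind of pipeline work involved, here is a minimal, hypothetical ETL step in Python with Pandas. The file paths and column names (trade_id, trade_date, notional) are assumptions for illustration only, not part of the role description.

```python
# Minimal, illustrative ETL step: read raw data, apply basic cleansing and
# typing rules, and write a curated Parquet extract for downstream reporting.
# Paths and column names are hypothetical.
import pandas as pd

RAW_PATH = "raw/trades.csv"              # assumed input location
CURATED_PATH = "curated/trades.parquet"  # assumed output location

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Drop incomplete records, enforce types, and de-duplicate."""
    cleaned = (
        raw.dropna(subset=["trade_id"])                                  # remove rows missing the key
           .assign(trade_date=lambda d: pd.to_datetime(d["trade_date"]))  # parse dates
           .assign(notional=lambda d: d["notional"].astype("float64"))    # enforce numeric type
    )
    return cleaned.drop_duplicates(subset=["trade_id"])

if __name__ == "__main__":
    df = pd.read_csv(RAW_PATH)
    transform(df).to_parquet(CURATED_PATH, index=False)  # requires pyarrow or fastparquet
```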
Desirable Skillsets / Good To Have
Familiarity with Databricks or Snowflake for big data processing and analytics is a plus
Proficient with Git for version control and collaboration
Familiarity with tools like Airflow, Dagster, or similar workflow automation platforms (see the orchestration sketch after this list)
Experience in working with RESTful or GraphQL APIs for data integration
Strong problem-solving and analytical skills with a focus on delivering high-quality data solutions
Excellent communication and collaboration skills to work with cross-functional teams
Ability to adapt to a fast-paced and dynamic environment
Self-motivated, detail-oriented, and passionate about data engineering
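For the orchestration tools mentioned above, the following is a minimal, hypothetical Airflow DAG sketch showing how an ingest step could be sequenced before a transform step. The DAG name, schedule, and task bodies are placeholders, and Airflow 2.4+ is assumed; it is an illustration, not a prescribed implementation.

```python
# Minimal Airflow DAG sketch: run a daily ingest task followed by a transform task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    # Placeholder for an extract step (e.g. pulling files from S3)
    print("ingest source data")

def transform():
    # Placeholder for a transformation step (e.g. a dbt or Spark job)
    print("run transformations")

with DAG(
    dag_id="regulatory_reporting_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                       # assumes Airflow 2.4+ 'schedule' argument
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task            # ingest must finish before transform runs
```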
This role will be based out of Nirlon Knowledge Park / Altimus office, Mumbai
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure
Accountabilities
Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data
Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures
Development of processing and analysis algorithms fit for the intended data complexity and volumes
Collaboration with data scientists to build and deploy machine learning models
Assistant Vice President Expectations
To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness
Collaborate closely with other functions/ business divisions
Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function
Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes
If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard
The four LEAD behaviours are: L, Listen and be authentic; E, Energise and inspire; A, Align across the enterprise; D, Develop others
OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments
They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes
Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues
Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda
Take ownership for managing risk and strengthening controls in relation to the work done
Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function
Collaborate with other areas of work and with business-aligned support areas to keep up to speed with business activity and the business strategy
Engage in complex analysis of data from multiple internal and external sources, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively
Communicate complex information
'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience
Influence or convince stakeholders to achieve outcomes
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right
They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave
Job Classification
Industry: Financial Services
Functional Area / Department: Data Science & Analytics
Role Category: Data Science & Machine Learning
Role: Data Engineer
Employment Type: Full time
Contact Details:
Company: Barclays
Location(s): Mumbai
Keyskills:
pandas
Snowflake
Python
Git
AWS
SQL
Communication skills