Job Description
Principal AWS Data Engineer
Location: Bangalore
Experience: 9 - 12 years
Job Summary: In this key leadership role, you will lead the development of foundational components for a Lakehouse architecture on AWS and drive the migration of existing data processing workflows to the new Lakehouse solution. You will work across the Data Engineering organisation to design and implement scalable data infrastructure and processes using technologies such as Python, PySpark, EMR Serverless, Iceberg, Glue and Glue Data Catalog. The main goal of this position is to ensure successful migration and establish robust data quality governance across the new platform, enabling reliable and efficient data processing. Success in this role requires deep technical expertise, exceptional problem-solving skills, and the ability to lead and mentor within an agile team.
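For illustration only (not part of the role requirements), a minimal sketch of the kind of work described above: a PySpark session configured to use Apache Iceberg with the AWS Glue Data Catalog as its metastore, appending data to a lakehouse table. All bucket, database, and table names are hypothetical, and the Iceberg runtime jars are assumed to be available to the job (e.g., via EMR Serverless job configuration).

from pyspark.sql import SparkSession

# Minimal sketch: Spark session wired to Apache Iceberg, with the AWS Glue Data
# Catalog as the metastore. Bucket, database, and table names are hypothetical.
spark = (
    SparkSession.builder.appName("lakehouse-ingest")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue.warehouse", "s3://example-lakehouse/warehouse/")
    .config("spark.sql.catalog.glue.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    .getOrCreate()
)

# Read raw data, apply a simple filter, and append to an Iceberg table that is
# registered in the Glue Data Catalog (database "analytics", table "orders").
orders = spark.read.parquet("s3://example-raw-zone/orders/")
orders.filter("order_status = 'FILLED'").writeTo("glue.analytics.orders").append()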
Must Have Tech Skills:
Prior Principal Engineer experience: leading best practices in design, development, and implementation; mentoring team members; and fostering a culture of continuous learning and innovation.
Extensive experience in software architecture and solution design, including microservices, distributed systems, and cloud-native architectures.
Expert in Python and Spark, with a deep focus on ETL data processing and data engineering practices.
Deep technical knowledge of AWS data services and engineering practices, with demonstrable experience implementing data pipelines using tools such as EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena (see the sketch after this list).
Experience delivering Lakehouse solutions and architectures.
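As referenced above, an illustrative sketch (assumed names throughout, not an actual pipeline) of how one serverless step of such a pipeline might look: a Lambda handler that starts an Athena query against a Glue Data Catalog table and returns the execution id for a later Step Functions state to poll.

import boto3

athena = boto3.client("athena")

def handler(event, context):
    # Hypothetical Lambda step in a Step Functions workflow: kick off an Athena
    # query over a Glue Catalog table. Database, table, and output bucket names
    # are placeholders.
    response = athena.start_query_execution(
        QueryString=(
            "SELECT order_date, COUNT(*) AS order_count "
            "FROM analytics.orders GROUP BY order_date"
        ),
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    # A later state can poll get_query_execution with this id before proceeding.
    return {"query_execution_id": response["QueryExecutionId"]}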
Nice To Have Tech Skills:
Knowledge of additional programming languages and development tools, providing flexibility and adaptability across varied data engineering projects.
A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics) are advantageous.
Key Accountabilities:
- Lead complex projects autonomously, fostering an inclusive and open culture within development teams. Mentor team members and lead technical discussions.
- Provide strategic guidance on best practices in design, development, and implementation. Lead the development of high-quality, efficient code and build the tools and applications needed to address complex business needs.
- Collaborate closely with architects, Product Owners, and development team members to decompose solutions into Epics, leading the design and planning of these components.
- Drive the migration of existing data processing workflows to a Lakehouse architecture, leveraging Iceberg capabilities (an illustrative sketch follows this list).
- Serve as an internal subject-matter expert in software development, advising stakeholders on best practices in design, development, and implementation.
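As an illustration of the migration work referenced above (a sketch under assumed names, reusing the Glue-backed Iceberg catalog configured in the earlier snippet): materialising an existing Parquet dataset as a partitioned Iceberg table via Spark SQL.

# Sketch of one migration step: convert a legacy Parquet dataset into a
# partitioned Iceberg table through the Glue-backed catalog ("glue") from the
# earlier snippet. Paths and table names are hypothetical.
spark.read.parquet("s3://example-raw-zone/trades/").createOrReplaceTempView("legacy_trades")

spark.sql("""
    CREATE TABLE IF NOT EXISTS glue.analytics.trades
    USING iceberg
    PARTITIONED BY (trade_date)
    AS SELECT * FROM legacy_trades
""")

# Iceberg keeps snapshot history, so the cut-over can be validated (row counts,
# sample queries via Athena) and rolled back before the legacy workflow is retired.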
Key Skills:
- Deep technical knowledge of data engineering solutions and practices. Expertise in AWS services and cloud solutions, particularly as they apply to data engineering.
- Extensive experience in software architecture and solution design.
- Specialized expertise in Python and Spark.
- Ability to provide technical direction, set high standards for code quality, and optimize performance in data-intensive environments.
- Skilled in leveraging automation tools and Continuous Integration/Continuous Deployment (CI/CD) pipelines to streamline development, testing, and deployment.
- Exceptional communicator who can translate complex technical concepts for diverse stakeholders, including engineers, product managers, and senior executives.
- Proven thought leadership within the engineering team, setting high standards for quality, efficiency, and collaboration. Experienced in mentoring engineers, guiding them in advanced coding practices, architecture, and strategic problem-solving to enhance team capabilities.
Educational Background:
- Bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills:
- Financial Services expertise is preferred, particularly experience with Equity and Fixed Income asset classes and a working knowledge of Indices.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time
Contact Details:
Company: Cognizant
Location(s): Bengaluru
Keyskills:
continuous integration, aws iam, api gateway, glue, ci/cd, pyspark, data pipeline, emr, microservices, solution design, spark, cloud applications, etl, architecture, python, data services, data processing, serverless, data engineering, aws lambda, aws glue, athena, aws