Role & responsibilities
Mandatory Skills
5+ years of experience with Python and data engineering
5+ years of experience writing complex SQL or PL/SQL queries; experience with IBM DB2 is required
Hands-on experience with Snowflake and Python is a must
Hands-on experience with Spark (PySpark) for data loading and complex transformations is a must.
Experience with Airflow is a major plus.
Minimum Skills Required:
Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field required.
At least 5 years of experience in data development and solutions in highly complex data environments with large data volumes.
At least 5 years of SQL/PL/SQL experience, with the ability to write ad-hoc and complex queries to perform data analysis.
At least 5 years of experience developing complex stored procedures, triggers, MQTs (materialized query tables), and views on IBM DB2.
Experience performance-tuning DB2 tables, queries, and stored procedures.
An understanding of E-R data models (conceptual, logical, and physical).
Strong understanding of advanced data warehouse concepts (factless fact tables, temporal and bi-temporal models, etc.).
Experience developing data transformations with dbt a plus.
Experience with Snowflake a must.
Experience with Airflow a plus.
Experience using Spark (PySpark) for data loading and complex transformations a must.
Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
Strong communication skills, both verbal and written.