Responsibilities:
Design, develop, and maintain ETL workflows, reusable components, and data integration interfaces.
Write and optimize stored procedures and complex SQL queries in DB2 and Sybase.
Perform data modeling, database design, and performance tuning to ensure efficient data processing.
Develop and maintain scripts using Python, shell scripting, Perl, and Unix commands.
Collaborate with cross-functional teams to understand data requirements and deliver scalable data solutions.
Ensure data quality, consistency, and integrity across multiple database platforms.
Develop APIs to enable seamless integration across systems (hands-on experience preferred).
Assist in automation and process improvements to enhance operational efficiency.
Support Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems (good to have).
Mentor junior team members and promote continuous learning and best practices.
Required Skills:
Strong experience with ETL tools, especially Informatica, and with Python scripting.
Hands-on experience in DB2 and Sybase database platforms.
Expertise in data warehousing concepts, relational database design, and performance tuning.
Proficient in writing stored procedures, query optimization, and troubleshooting complex database issues.
Strong scripting skills in Python, shell, and Perl.
Good understanding of Unix/Linux environments.
Familiarity with PostgreSQL is a plus.
Experience in API development and integration is a significant advantage.
Exposure to Business Intelligence tools and Source-to-Pay applications like SAP Ariba is desirable.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
7+ years of relevant experience in data integration, ETL development, and database management.
Key Skills: ETL with Python