Responsibilities:
- Design and implement scalable solutions for ever-increasing data volumes, using big data/cloud technologies such as PySpark and Kafka.
- Collaborate with cross-functional teams to understand data requirements and provide effective solutions.
- Implement real-time data ingestion and processing solutions.
- Develop and maintain ETL/ELT processes to support data analytics and reporting.
- Implement best practices for data security, integrity, and quality.
- Optimize and troubleshoot data-related issues for seamless operations.
Requirements:
- Bachelor's degree in Engineering or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum of 6-8 years of experience in data engineering.
- Experience with databases and data warehousing concepts (PostgreSQL and Snowflake preferred).
- Strong SQL skills with experience writing complex queries.
- Working knowledge of data warehousing, data modelling, governance, and data architecture.
- Ability to handle large-scale structured and unstructured data from internal and third-party sources.
- Hands-on experience with Python, PySpark, and Kafka.
- Experience with data engineering tools/technologies in the GCP cloud environment.
- Proficiency in designing and maintaining scalable data architectures.
- Experience with CI/CD tools such as GitHub.
Key skills: Data Engineering, Data Ingestion, PostgreSQL, GCP, PySpark, Snowflake, Big Data, Kafka, Data Warehousing, ETL, SQL, Python
About TechStar Group: TechStar Group is a niche Product Development, Business Solutions, and global IT Services firm based in Dallas, TX. At TechStar, we help drive customer success by leveraging our expertise in IT solutions and business process improvements.