
Senior Data Engineer @ Jeavio



Job Description

We are seeking an experienced Senior Data Engineer to join our team. The ideal candidate will have a strong background in data engineering and AWS infrastructure, with hands-on experience building and maintaining data pipelines and the supporting infrastructure components. The role involves using a mix of data engineering tools and AWS services to design, build, and optimize data architecture.


Key Responsibilities:

  • Design, develop, and maintain data pipelines using Airflow and AWS services.
  • Implement and manage data warehousing solutions with Databricks and PostgreSQL.
  • Automate tasks using Git and Jenkins.
  • Develop and optimize ETL processes, leveraging AWS services like S3, Lambda, AppFlow, and DMS.
  • Create and maintain visual dashboards and reports using Looker.
  • Collaborate with cross-functional teams to ensure smooth integration of infrastructure components.
  • Ensure the scalability, reliability, and performance of data platforms.
  • Work with Jenkins for infrastructure automation.

Technical and functional areas of expertise:

  • Working as a senior individual contributor on a data-intensive project.
  • Strong experience in building high-performance, resilient, and secure data processing pipelines, preferably using a Python-based stack.
  • Extensive experience in building data-intensive applications, with a deep understanding of querying and modeling with relational databases, preferably on time-series data.
  • Intermediate proficiency in AWS services (S3, Airflow)
  • Proficiency in Python and PySpark
  • Proficiency with ThoughtSpot or Databricks.
  • Intermediate proficiency in database scripting (SQL)
  • Basic experience with Jenkins for task automation

Nice to Have:

  • Intermediate proficiency in data analytics tools (Power BI / Tableau / Looker / ThoughtSpot)
  • Experience working with AWS Lambda, Glue, AppFlow, and other AWS transfer services.
  • Exposure to PySpark and data automation tools like Jenkins or CircleCI.
  • Familiarity with Terraform for infrastructure-as-code.
  • Experience in data quality testing to ensure the accuracy and reliability of data pipelines.
  • Proven experience working directly with U.S. client stakeholders.
  • Ability to work independently and take the lead on tasks.

Education and experience:

  • Bachelor's or Master's degree in Computer Science or a related field.
  • 5+ years of experience

Stack/Skills needed:

  • Databricks
  • PostgreSQL
  • Python & PySpark
  • AWS Stack
  • Power BI / Tableau / Looker / ThoughtSpot
  • Familiarity with GIT and/or CI/CD tools

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Data Science & Analytics
Role Category: Data Science & Analytics - Other
Role: Data Science & Analytics - Other
Employment Type: Full time

Contact Details:

Company: Jeavio
Location(s): Vadodara



Keyskills: Airflow, PySpark, AWS, Databricks, Python, ETL Pipelines, Database Scripting, PostgreSQL, Looker, SQL


₹ Not Disclosed

Similar positions

Sr.Data Governance/Data Quality Consultant

  • Trigyn Technologies
  • 9 - 14 years
  • Mumbai
  • 17 hours ago
₹ Not Disclosed

Snowflake Data Engineer

  • Brillio
  • 5 - 9 years
  • Pune
  • 1 day ago
₹ Not Disclosed


