
Data Engineer - GCP @ Digyug HR


Job Description

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

JOB SUMMARY

This position develops batch and real-time data pipelines using various data analytics processing frameworks in support of Data Science and Machine Learning practices. It assists in the integration of data from internal and external sources, performs extract, transform, load (ETL) data conversions, and facilitates data cleansing and enrichment. It also performs full systems life cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software. The position assists in synthesizing disparate data sources to create reusable and reproducible data assets, and supports the Data Science community in analytical model feature tuning.

RESPONSIBILITIES

  • Contributes to data engineering projects and builds solutions by leveraging foundational knowledge in software/application development, the programming languages used for statistical modeling and analysis, data warehousing and Cloud solutions, and building data pipelines.
  • Collaborates effectively, produces data engineering documentation, gathers requirements, organizes data, and defines the scope of a project.
  • Performs data analysis and presents findings to stakeholders to support business needs.
  • Participates in the integration of data for data engineering projects.
  • Understands and utilizes analytic reporting tools and technologies.
  • Assists with data engineering maintenance and support.
  • Assists in defining the data interconnections between the organization's operational and business functions.
  • Assists in backup and recovery and utilizes technology solutions to perform proof-of-concept (POC) analysis.

QUALIFICATIONS

Requirements:

  • Understanding of database systems and data warehousing solutions.
  • Understanding of data life cycle stages: data collection, transformation, analysis, secure storage, and data accessibility.
  • Understanding of the data environment to ensure that it can scale for demands such as data throughput, increasing data pipeline throughput, analyzing large amounts of data, real-time predictions, insights and customer feedback, data security, and data regulations and compliance.
  • Contributes to building a data platform, ensuring data is secure in motion and at rest, automating data compliance and auditing, and building data warehousing solutions for scalable analytics.
  • Familiarity with analytics reporting technologies and environments (e.g., Power BI, Looker, Qlik).
  • Basic knowledge of algorithms and data structures to understand the big picture of the organization's overall data function.
  • Knowledge of data filtering and data optimization.
  • Familiarity with a Cloud services platform (e.g., GCP, Azure, or AWS) and all the data life cycle stages.
  • Understanding of ETL tool capabilities; ability to pull data from various sources and load the transformed data into a database or business intelligence platform.
  • Familiarity with machine learning algorithms that help data scientists make predictions based on current and historical data.
  • Knowledge of algorithms and data structures, with the ability to organize data for reporting, analytics, and data mining, and to perform data filtering and data optimization.
  • Ability to build data APIs that enable data scientists and business intelligence analysts to query the data.
  • Ability to code in a programming language used for statistical analysis and modeling, such as Python, Java, Scala, or C++.
  • Understanding of the basics of distributed systems.
  • A Bachelor's degree in MIS, mathematics, statistics, or computer science, an international equivalent, or equivalent job experience.

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Employment Category:

Employment Type: Full time
Industry: IT Services & Consulting
Role Category: Not Specified
Functional Area: Not Specified
Role/Responsibilities: Data Engineer - GCP

Contact Details:

Company: UPS India
Location(s): Chennai



Keyskills: Data Engineering, Data Analytics, Machine Learning, ETL, Data Warehousing, Statistical Modeling, Data Cleansing, Data Enrichment, Database Systems, Data Security, Compliance, Algorithms, Data Structures, ETL Tools, Programming Languages, Distributed Systems, Cloud Solutions


₹ Not Disclosed

Similar positions

Cloud & AI Solution Engineer - Azure

  • Junomoneta Finsol
  • 3 to 7 Yrs
  • Maharashtra
  • 1 day ago
₹ Not Disclosed

Software Test Engineer - Manual Testing

  • Jtsi Technologies
  • 4 to 8 Yrs
  • 16 hours ago
₹ Not Disclosed

Lead I - Software Testing (Manual Testing)

  • Aditya Birla Sun Life
  • 5 to 9 Yrs
  • 1 day ago
₹ Not Disclosed

Deployment Engineer Intern - Visakhapatnam

  • Tech Mahindra Ltd.
  • 0 to 4 Yrs
  • All India
  • 1 day ago
₹ Not Disclosed

Digyug HR

Digyug is one of the leading marketing agencies based out of Mumbai.