Data Architect @ Naukri

Job Description

Job Title: Data Architect

Work Location: Hyderabad (Opp. IKEA)

Job Type: Full-Time/Permanent

Work Mode: Hybrid


ROLES AND RESPONSIBILITIES


Develop and implement comprehensive data engineering solutions/accelerators and data architecture solutions aligned with our business goals and objectives.

Architect, create, maintain, manage, and optimize data pipelines as workloads move from development to production for specific use cases.

Lead the design, development, and deployment of data solutions

Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams

Apply a working understanding of analytics and machine learning concepts and tools.

Build and manage a high-performing team of data engineers and provide guidance and mentorship to the data team

Stay updated on the latest trends and developments in the data domain

Use modern tools, techniques, and architectures to partially or fully automate the most common, repeatable, and tedious data preparation and integration tasks, in order to minimize manual, error-prone processes and improve productivity (a minimal PySpark sketch of such a pipeline follows this list).

Assist with modernizing the data management infrastructure to drive automation in data integration and management.

Develop algorithms and predictive models to solve critical business problems.

Develop tools and libraries that help analytics team members interface more efficiently with large volumes of data.

Analyze large, noisy datasets and identify meaningful patterns that provide actionable results.

Develop and automate new enhanced imputation algorithms.

Work with data lakes and databases including any aspect of administration or support required to maintain them
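
As a rough illustration of the pipeline and automation responsibilities above, the sketch below uses PySpark (one of the languages named later in this posting) to read raw CSV files from a landing zone, de-duplicate and clean them, and write curated Parquet output. The paths and column names (order_id, order_ts, amount) are illustrative assumptions, not details taken from this role.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_pipeline").getOrCreate()

# Read raw CSV drops from an assumed landing-zone path.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/landing/orders/")
)

# Clean: de-duplicate on the business key, normalise types, derive a partition date,
# and drop rows that cannot be used downstream.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

# Write curated, partitioned Parquet for downstream consumption.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("/curated/orders/")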


TECHNICAL COMPETENCIES (Knowledge, Skills & Abilities)


Excellent at solving analytical problems using quantitative approaches.

Comfortable manipulating and analyzing complex, high-volume, high-dimensionality data from varied sources

Must have experience with at least two to three end-to-end implementations of the Snowflake cloud data warehouse

Strong programming background and strong coding skills in Python, PySpark, Databricks, and SQL

Strong knowledge of DevOps workflow, cloud-native platforms (containers, Kubernetes, serverless, etc.) and tools (such as Git).

Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and time travel, and an understanding of how to apply these features (an illustrative sketch follows this list)

Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns

Hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe, and with big data modeling techniques using Python

Experience in data migration from RDBMS to the Snowflake cloud data warehouse

Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling)

Experience with data security, data access controls, and their design

Strong ability to collaborate with a variety of roles and harmonize the relationship between business and IT.

Strong passion for empirical research and for answering hard questions with data.

Flexible analytic approach that allows for results at varying levels of precision.

Experience building data solutions in manufacturing, retail, and supply chain operations

Proficient in data integration techniques to combine data from various sources into a centralized location

Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems

Data science, analytics, and machine learning experience, particularly in implementing machine learning models, is an added advantage
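
The Snowflake features called out above can be exercised from Python through the Snowflake connector, as in the hedged sketch below. Account credentials and object names (ANALYTICS_WH, SALES_DB, ORDERS) are placeholders, and the resource-monitor quota is an arbitrary example value; this is a minimal illustration, not a reference implementation.

import snowflake.connector

# Resource monitors can only be created by an account administrator.
conn = snowflake.connector.connect(
    account="<account_identifier>",  # placeholder
    user="<user>",
    password="<password>",
    role="ACCOUNTADMIN",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Resource monitor: cap monthly credits and suspend the warehouse near the quota.
    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR analytics_monitor
          WITH CREDIT_QUOTA = 100
          TRIGGERS ON 90 PERCENT DO SUSPEND
                   ON 100 PERCENT DO SUSPEND_IMMEDIATE
    """)
    cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET RESOURCE_MONITOR = analytics_monitor")

    # Zero-copy clone: an instant, metadata-only copy of a table for dev or testing.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_DEV CLONE ORDERS")

    # Time travel: query the table as it existed one hour ago (offset in seconds).
    cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
    print(cur.fetchone())
finally:
    cur.close()
    conn.close()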


EDUCATION AND EXPERIENCE


Bachelor's degree in computer science, mathematics, or statistics

12+ years of data and analytics experience, with a minimum of 5 years in Snowflake cloud data warehousing.

10+ years of experience in IT with a strong focus on data engineering and data architecture

5+ years of relevant experience in successfully launching, planning, and executing Advanced Analytics projects.

5+ years of experience in managing or leading data engineering teams

5+ years of hands-on experience building cloud data-centric solutions, preferably on the Azure cloud platform or other hyperscalers.

Strong programming skills in SQL, Python, and Python-based data manipulation and visualization libraries.

Familiarity with big data frameworks, such as Spark, and ML libraries

Extensive experience in Snowflake: virtual warehouses (compute), data modeling and storage, data loading/unloading, data sharing, SnowSQL (CLI), Snowflake internals and integrations, Snowpipe implementation, and Snowflake security, including reader and consumer accounts.

Advanced SQL knowledge and hands-on experience writing complex queries with analytical functions; troubleshooting, problem solving, and performance tuning of SQL queries against the data warehouse; and strong knowledge of stored procedures

Design and build production data pipelines from ingestion to consumption with Python, PySpark, or Scala.

Good knowledge of orchestration/scheduler tools such as Airflow (a minimal DAG sketch follows this list)

Strong understanding of cloud architecture and environment

Familiar with common techniques and general ML programming to ensure quality of algorithms before pushing them to production

Experience with BI and Reporting Tools

Experience with Data Integration tools like Spark/Databricks

Experience in Linux/Unix shell scripting

Good experience in requirements analysis, solution architecture design, data modelling, ETL, data integration, and data migration design

Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies.

Experienced in internal as well as external stakeholder management

Understanding of cloud data migration processes and the ability to design alternatives for data ingestion from a variety of sources into various databases

Build robust and scalable data integration (ETL) pipelines

Build and deliver high quality datasets to support business analyst and customer reporting needs

Interface with business customers, gathering requirements and delivering end to end solutions

Problem-solving, communication, and collaboration skills.

Excellent data analytical and communication skills.

Ability to work in a fast-paced, high-pressure, agile environment.

Strong interpersonal, communication, and presentation skills.

Ability to learn and teach new languages and frameworks
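
To tie together the orchestration, Snowflake, and advanced-SQL items above, here is a minimal, hedged Airflow sketch: a daily DAG whose single task runs an analytical (window-function) query against Snowflake through the Python connector. The DAG id, table name (DAILY_SALES), and connection details are assumptions for illustration; a production setup would normally use an Airflow connection or the Snowflake provider package rather than inline credentials.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
import snowflake.connector

def rolling_revenue_report():
    # Placeholder credentials; store real ones in an Airflow connection or secrets backend.
    conn = snowflake.connector.connect(
        account="<account_identifier>", user="<user>", password="<password>",
        warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Window (analytical) function: 7-day rolling average of daily revenue.
        cur.execute("""
            SELECT order_date,
                   daily_revenue,
                   AVG(daily_revenue) OVER (
                       ORDER BY order_date
                       ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
                   ) AS revenue_7d_avg
            FROM DAILY_SALES
            ORDER BY order_date
        """)
        for row in cur.fetchall():
            print(row)
    finally:
        conn.close()

with DAG(
    dag_id="daily_revenue_rolling_average",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="rolling_revenue_report", python_callable=rolling_revenue_report)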

PHYSICAL REQUIREMENTS / WORK ENVIRONMENT (if applicable)

Work is performed in a designated professional office workstation and environment.

Extensive use of office equipment, including computer, calculator, copier, fax, and other business-related machines and software. The role is 80% hands-on development in Python, Snowflake, and solution design, and 20% mentoring the team.

Job Classification

Industry: Software Product
Functional Area / Department: Data Science & Analytics
Role Category: Business Intelligence & Analytics
Role: BI Architect
Employment Type: Full time

Contact Details:

Company: Naukri
Location(s): Hyderabad



Key Skills: Snowflake, Python, SQL, Designing, Data Architecture, Solutioning


Salary: Not Disclosed

Similar positions

Data Engineer - Business Intelligence

  • IBM
  • 4 - 6 years
  • Pune
  • 22 hours ago
₹ Not Disclosed

Data and Reporting Lead

  • Branch International
  • 5 - 10 years
  • Ajmer
  • 1 day ago
₹ Not Disclosed

Manager - Data Science (Gen AI)

  • Blend360 India
  • 8 - 11 years
  • Hyderabad
  • 3 days ago
₹ Not Disclosed

Data Engineer, Product Analytics

  • Meta
  • 2 - 7 years
  • Bengaluru
  • 4 days ago
₹ Not Disclosed

Naukri

Company Details: Naukripay group