Job Title: Snowflake Engineer
Shift Timing: 12 PM to 9 PM
Location: Gurgaon / Bangalore / Pune (Hybrid)

Job Summary
Snowflake data engineers/developers will be responsible for architecting and implementing very large-scale data intelligence solutions around the Snowflake Data Warehouse. Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.

Basic understanding of:
- Batch and real-time data processing
- Agile process (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA, or a similar tool
Should have:
- Designing and implementing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse
- Hands-on experience building productionized data ingestion and processing pipelines using Java (or Spark, Scala, Python, etc.)
- Expertise and excellent understanding of Snowflake internals and of integrating Snowflake with other data processing and reporting technologies
- Experience writing stored procedures/functions using Snowflake Scripting or JavaScript (see the Snowflake Scripting sketch after this list)
- Deploying Snowflake features such as data sharing, events, and lakehouse patterns
- Leveraging Snowflake utilities, SnowSQL, Snowpipe, and big-data modeling techniques using Python (see the stage/Snowpipe sketch after this list)
- Data migration from RDBMS to the Snowflake cloud data warehouse
- NoSQL data stores, and data modeling methods and approaches (star and snowflake schemas, dimensional modeling)
- Developing ETL workflows/pipelines into and out of the data warehouse using Python and/or Snowpipe
- Understanding data pipelines and modern ways of automating them using cloud-based implementations
- Basic understanding of Snowflake architecture (storage and virtual warehouses)
- Good understanding of RBAC (role-based access control) for user and access management (see the RBAC sketch after this list)
- Good understanding of structured and semi-structured data formats (XML, ORC, JSON, Parquet, Avro)
- Good experience creating file formats and external/internal stages to connect to cloud storage (AWS S3, Azure Blob, etc.), and using zero-copy clone
- Good understanding of the Time Travel and Fail-safe features in Snowflake (see the Time Travel/clone sketch after this list)
- Translating BI and reporting requirements into database design and reporting design
- Understanding data transformation and translation requirements and which tools to leverage to get the job done
Good to have:
- Basic knowledge of Snowflake billing and setting up resource monitors (see the resource monitor sketch after this list)
- Knowledge of other cloud data warehousing solutions such as AWS Redshift and Google BigQuery
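For illustration, a minimal resource monitor sketch; the monitor and warehouse names and the thresholds are hypothetical, and creating monitors typically requires the ACCOUNTADMIN role:

    CREATE RESOURCE MONITOR monthly_quota WITH
      CREDIT_QUOTA = 100
      FREQUENCY = MONTHLY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_quota;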
KEYWORDS: SNOWFLAKE, SNOWPIPE, ETL, DATA WAREHOUSE, TEST PLANNING, BUG REPORTING, SYSTEM TEST, TEST LEAD, TEST MANAGER
Keyskills: Data modeling, RDBMS, Database design, XML, JavaScript, Agile, JSON, Scrum, Data warehousing, Python