Job Description
Role: Data Architect - Databricks
Client: Trianz D&AI Practice
Employment Type: Employee
Work Location: India - Hyderabad or Bangalore
Work Mode: Trianz Office - Hybrid
Trianz Grade: 6 / 10-15 years / Solutions Architect

About Trianz
[ADD TRIANZ BLURB - please share and review with Ben K before publishing]

Job Summary
As a Data Architect, you are core to the D&AI (Data & AI) Practice's success. Data is foundational to everything we do, and you are accountable for defining and delivering best-in-class Databricks data management solutions across all major cloud platforms. This is a senior role with high visibility, reporting to D&AI Practice Leadership.

Job Responsibilities
- Architectural Design: Architect secure, scalable, highly performant data engineering and data management solutions, including data warehouses, data lakes, ELT/ETL, and real-time data engineering/pipeline solutions. Support the Principal Data Architect in defining and maintaining the Practice's reference data engineering and data management architectures.
- Databricks Implementation: Design and manage scalable end-to-end data solutions leveraging native Databricks capabilities, including:
  - Data Engineering: Delta Live Tables, Apache Spark, Structured Streaming, Notebooks, Git CI/CD, Databricks-supported file types, Libraries.
  - Data Warehousing: Databricks SQL, best-practice data modeling.
  - Delta Lake: Delta Lake and Delta Live Tables, incremental and streaming workloads, table history, file management, API utilization.
  - Unity Catalog: access control, lineage, Delta Sharing, identity management.
- Hyperscaler Design: Competently leverage data-related cloud platform capabilities (AWS, Azure, or GCP) to architect and develop end-to-end data engineering and data management solutions.
- Client Engagement: Collaborate and partner regularly with clients to understand their challenges and needs, then translate requirements into data solutions that drive customer value. Support proposal development.
- Data Modeling: Create and maintain conceptual, logical, and physical data models that support both transactional and analytical needs. Ensure data models are optimized for performance and scalability.
- Creativity: Be an out-of-the-box thinker, passionate about applying your skills to new and existing solutions alike while always demonstrating a customer-first mentality.

Mandatory Skills
- 12+ years of hands-on data solution architecture and implementation experience on modern cloud platforms (AWS preferred), including microservice and event-driven architectures.
- Databricks Professional Data Engineer certification.
- Databricks Platform Architect accreditation (AWS or Azure).
- Databricks Platform Administrator accreditation.
- Hands-on experience with Databricks Lakehouse implementations, Delta Live Tables, and building scalable data pipelines, including real-time pipelines (Kafka, Kinesis) in addition to Databricks Structured Streaming.
- Demonstrated, holistic experience and competence with Databricks concepts including Delta tables, Apache Spark, Structured Streaming, Notebooks, Git CI/CD, Libraries, and Databricks SQL.
- An architectural certification on AWS, Azure, or GCP.
- Experience with ML orchestration tools such as MLflow and Kubeflow.
- Containerization proficiency (Docker, Kubernetes) to deploy data pipelines and models.
- Practical experience with end-to-end data engineering and data management supporting functions, including data modeling (conceptual, logical & physical), BI & analytics, data governance, data quality, data security/privacy/compliance, IAM, and performance optimization.
- Advanced SQL and data profiling.
- Python or Scala.
- Strong communication skills, with the ability to convey technical concepts to non-technical users.
- Strong self-management skills, demonstrating the ability to multitask and self-manage goals and activities.
Additional / Nice-to-have Qualifications
- Databricks ML Engineer certification (Associate or Professional)
- Databricks Generative AI Engineer certification
- Databricks Apache Spark Developer certification
- Any additional Databricks accreditations or certifications

Required Education
Master's or Bachelor's degree (CS, IT, Applied Mathematics, or demonstrated experience)

Why Join Us
[INSERT TRIANZ BLURB - please share and review with Ben K before publishing]

Travel Requirement (%)
Periodic / infrequent travel as required for:
- In-person client meetings
- Company culture / focus events
- Training and/or alliance needs

Initial Screening Questions

Initial Rejection Screening Questions:
- Does the candidate have the Databricks Professional Data Engineer certification? If no, reject.
- Does the candidate have the Databricks Platform Architect accreditation (AWS or Azure)? If no, reject.
- Does the candidate have an architectural certification with AWS, Azure, or GCP? If no, reject.
- Has the candidate successfully designed and implemented modern cloud-based data engineering and data management solutions for more than 7 years? If no, reject.
- Does the candidate have 5 or more years of working directly with clients (internal or external) to collaboratively (via direct conversation) develop data engineering and data management solutions? If no, reject.
- Does the candidate communicate clearly in English? If no, reject.

Prioritization Questions (candidates with the highest points should receive the highest ranking in progressing):
- Does the candidate have >= 3 years of prior consulting experience? If yes, 1 point.
- Does the candidate have any of the nice-to-have qualifications? If yes, 1 point.
- Does the candidate have an architectural certification in AWS or Azure? If yes, 2 points.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Database Architect / Designer
Employment Type: Full time
Contact Details:
Company: Trianz
Location(s): Hyderabad
Keyskills:
Solution architecture
Publishing
Data management
Data modeling
Consulting
Data quality
Analytics
SQL
Python
Identity management