
Data Architect @ Accenture


 Data Architect

Job Description

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary:
As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various teams to ensure that the data architecture is robust, scalable, and efficient, while also addressing any challenges that arise during the development process. Your role will be pivotal in shaping the data landscape of the organization, enabling data-driven decision-making and fostering innovation through effective data management practices.

Responsibilities:
• Develop high-quality, scalable ETL/ELT pipelines using Databricks technologies including Delta Lake, Auto Loader, and Delta Live Tables (DLT).
• Demonstrate excellent programming and debugging skills in Python.
• Apply strong hands-on PySpark experience to build efficient data transformation and validation logic.
• Be proficient in at least one cloud platform: AWS, GCP, or Azure.
• Create modular DBX functions for transformation, PII masking, and validation logic, reusable across DLT and notebook pipelines (see the PII-masking sketch below).
• Implement ingestion patterns using Auto Loader with checkpointing and schema evolution for structured and semi-structured data (see the ingestion sketch below).
• Build secure and observable DLT pipelines with DLT Expectations, supporting Bronze/Silver/Gold medallion layering (see the DLT sketch below).
• Configure Unity Catalog: set up catalogs, schemas, and user/group access, enable audit logging, and define masking for PII fields (see the Unity Catalog sketch below).
• Enable secure data access across domains and workspaces via Unity Catalog External Locations, Volumes, and lineage tracking.
• Access and utilize data assets from the Databricks Marketplace to support enrichment, model training, or benchmarking.
• Collaborate with data-sharing stakeholders to implement Delta Sharing both internally and externally.
• Integrate Power BI, Tableau, or Looker with Databricks using optimized connectors (ODBC/JDBC) and Unity Catalog security controls.
• Build stakeholder-facing SQL dashboards within Databricks to monitor KPIs, data pipeline health, and operational SLAs.
• Prepare Gen AI-compatible datasets: manage vector embeddings, index with Databricks Vector Search, and use Feature Store with MLflow.
• Package and deploy pipelines using Databricks Asset Bundles through CI/CD pipelines in GitHub or GitLab.
• Troubleshoot, tune, and optimize jobs using the Photon engine and serverless compute, ensuring cost efficiency and SLA reliability.
• Bring experience with cloud-based services relevant to data engineering, data storage, data processing, data warehousing, real-time streaming, and serverless computing.
• Have hands-on experience applying performance optimization techniques; an understanding of data modeling and data warehousing principles is essential.

Nice to have:
1. Certifications: Databricks Certified Professional or similar certifications.
2. Machine learning: knowledge of machine learning concepts and experience with popular ML libraries.
3. Knowledge of big data processing (e.g., Spark, Hadoop, Hive, Kafka).
4. Data orchestration: Apache Airflow.
5. Knowledge of CI/CD pipelines and DevOps practices in a cloud environment.
6. Experience with ETL tools like Informatica, Talend, Matillion, or Fivetran.
7. Familiarity with dbt (Data Build Tool).

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A minimum of 15 years of full-time education is required.
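To illustrate the Auto Loader ingestion pattern called out in the responsibilities, here is a minimal PySpark sketch, assuming a hypothetical JSON landing path, schema location, checkpoint location, and target Bronze table; the cloudFiles and Structured Streaming option names are standard, everything else is a placeholder.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Incrementally pick up new JSON files with Auto Loader; the inferred schema is
# tracked and evolved at schemaLocation (paths below are hypothetical).
raw_stream = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/main/bronze/_schemas/orders")
    .option("cloudFiles.schemaEvolutionMode", "addNewColumns")
    .load("/Volumes/main/landing/orders")
)

# Write to a Bronze Delta table; the checkpoint gives exactly-once restart semantics,
# and mergeSchema lets newly added columns flow through.
(
    raw_stream.writeStream
    .option("checkpointLocation", "/Volumes/main/bronze/_checkpoints/orders")
    .option("mergeSchema", "true")
    .trigger(availableNow=True)
    .toTable("main.bronze.orders")
)
```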
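The DLT Expectations and Bronze/Silver/Gold layering responsibility could look roughly like the following Delta Live Tables sketch; the source path, table names, and column names (order_id, amount, order_ts) are illustrative placeholders, not a definitive pipeline.

```python
import dlt
from pyspark.sql import functions as F

# Bronze: raw files ingested as-is with Auto Loader (source path is hypothetical).
@dlt.table(comment="Raw orders landed from cloud storage")
def bronze_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/landing/orders")
    )

# Silver: validated records; rows failing expect_or_drop are removed, and all
# expectation results surface in the DLT event log for observability.
@dlt.table(comment="Validated orders")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
@dlt.expect("non_negative_amount", "amount >= 0")
def silver_orders():
    return dlt.read_stream("bronze_orders").withColumn("ingested_at", F.current_timestamp())

# Gold: business-level aggregate suitable for SQL dashboards.
@dlt.table(comment="Daily order totals")
def gold_daily_order_totals():
    return (
        dlt.read("silver_orders")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("total_amount"))
    )
```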
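For the modular, reusable PII-masking functions, a minimal sketch is a small PySpark helper shared by DLT and notebook pipelines; the function name and column list are hypothetical, and SHA-256 hashing stands in for whatever masking policy the project actually mandates.

```python
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def mask_pii(df: DataFrame, pii_columns: list) -> DataFrame:
    """Return a copy of df with each PII column replaced by its SHA-256 hash.

    Hashing keeps the columns joinable across tables while hiding raw values;
    swap in redaction or tokenization if governance policy requires it.
    """
    for column in pii_columns:
        df = df.withColumn(column, F.sha2(F.col(column).cast("string"), 256))
    return df


# Example usage inside a DLT or notebook pipeline (column names are hypothetical):
# silver_masked = mask_pii(silver_df, ["email", "phone_number", "national_id"])
```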
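The Unity Catalog configuration work (catalogs, schemas, group access, and PII column masks) is usually expressed as SQL; a hedged sketch follows, run through spark.sql in a Unity Catalog-enabled workspace, with the catalog, schema, group, table, and function names all being placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Governance objects: catalog, schema, and group-level grants
# (catalog, schema, and group names are hypothetical).
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.silver")
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data_engineers`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA analytics.silver TO `analysts`")

# Column mask: only members of the pii_admins group see the raw email value.
spark.sql("""
    CREATE OR REPLACE FUNCTION analytics.silver.mask_email(email STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('pii_admins') THEN email ELSE '***' END
""")
spark.sql("""
    ALTER TABLE analytics.silver.customers
    ALTER COLUMN email SET MASK analytics.silver.mask_email
""")
```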

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Data warehouse Architect / Consultant
Employment Type: Full time

Contact Details:

Company: Accenture
Location(s): Bengaluru



Keyskills: PySpark, data modeling, GCP, Python, Microsoft Azure, Hive, continuous integration, data warehousing, CI/CD, SQL, build automation, Spark, DevOps, debugging, Hadoop, GitHub, data analytics, Talend, Power BI, ELT, data warehouse, Databricks, Tableau, Kafka, GitLab, Informatica


Salary: ₹ Not Disclosed

Similar positions

Data Architect

  • Accenture
  • 15 - 20 years
  • Chennai
  • 3 days ago
₹ Not Disclosed

Data Architect

  • Accenture
  • 15 - 20 years
  • Bengaluru
  • 3 days ago
₹ Not Disclosed

Data Architect

  • Accenture
  • 15 - 20 years
  • Bengaluru
  • 3 days ago
₹ Not Disclosed

Data Architect

  • Accenture
  • 15 - 20 years
  • Bengaluru
  • 3 days ago
₹ Not Disclosed
