Data Architect

Chennai, Tamil Nadu, India

4 days ago

Applicants: 0

Salary Not Disclosed

3 weeks left to apply

Job Description

Role Overview

We are looking for a seasoned Data Architect with strong expertise in Databricks to lead the design, development, and optimization of scalable data platforms and analytics solutions. The role involves defining end-to-end data architecture, building cloud-native data pipelines, and enabling advanced analytics and AI workloads for enterprise environments.

Key Responsibilities

- Define enterprise data architecture, including data ingestion, transformation, storage, governance, and consumption layers.
- Lead the design and implementation of Databricks Lakehouse architecture, Delta Lake, Unity Catalog, and optimized ETL/ELT pipelines (see the illustrative sketch after this section).
- Develop scalable data models, metadata frameworks, and integration patterns across structured and unstructured datasets.
- Collaborate with data engineering, analytics, ML, and business teams to understand data needs and translate them into architectural solutions.
- Define best practices for data quality, lineage, cataloging, security, and lifecycle management.
- Drive cloud-based data modernization using Azure/AWS/GCP + Databricks.
- Establish data platform governance, including RBAC, data privacy, and compliance controls.
- Optimize data performance, storage costs, pipeline reliability, and cluster usage.
- Review and guide implementation of notebooks, workflows, Delta Live Tables, and ML/AI workloads.
- Create architecture artifacts, including HLDs, LLDs, technology standards, and integration patterns.
- Provide thought leadership on data strategy, migration paths, and adoption of Databricks features.

Required Skills & Experience

- 8-12+ years of experience in data engineering/architecture, with at least 3-5+ years of hands-on Databricks experience.
- Strong knowledge of Databricks Lakehouse, Delta Lake, Unity Catalog, Workflows, Model Serving, and cluster management.
- Expertise in Python, SQL, PySpark/Spark, and distributed data processing.
- Experience designing cloud-native data platforms on Azure/AWS/GCP.
- Strong understanding of ETL/ELT frameworks, streaming data (Kafka/Kinesis/Event Hubs), and data integration patterns.
- Proven experience driving enterprise data platform migrations or modernization programs.
- Solid understanding of data modeling (3NF, Star/Snowflake), data warehousing, and performance tuning.
- Knowledge of security frameworks, IAM, encryption, GDPR/PII handling, and data governance practices.
- Experience with CI/CD for data, Infrastructure as Code (Terraform/ARM/CloudFormation), and DevOps for data pipelines.
- Excellent communication and stakeholder-management skills.

Preferred Qualifications

- Databricks certifications (Data Engineer Professional, Lakehouse Architect).
- Experience with MLflow, feature stores, and model deployment in Databricks.
- Background in enterprise analytics, BI platforms, data mesh, or data product architecture.
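Illustrative example (not part of the original posting): the sketch below shows, in PySpark, a minimal batch ETL/ELT step of the kind the responsibilities above describe, landing raw data in a Delta table registered in Unity Catalog. It assumes a Databricks notebook or job where the spark session is already provided; the landing-zone path, table name, and order_id column are hypothetical placeholders.

# Minimal Databricks batch ETL sketch; all names are hypothetical placeholders.
from pyspark.sql import functions as F

raw_path = "/mnt/raw/orders/"          # hypothetical landing-zone path
target_table = "main.sales.orders"     # hypothetical Unity Catalog table (catalog.schema.table)

# Read raw JSON files, stamp an ingestion timestamp, and drop duplicate records.
df = (
    spark.read.format("json").load(raw_path)
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["order_id"])      # assumes the raw data carries an order_id column
)

# Append the cleaned batch to a managed Delta table.
df.write.format("delta").mode("append").saveAsTable(target_table)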

Additional Information

Company Name: Prodapt
Industry: N/A
Department: N/A
Role Category: DevOps Engineer
Job Role: Mid-Senior level
Education: No Restriction
Job Types: On-site
Gender: No Restriction
Notice Period: Less Than 30 Days
Year of Experience: 1 - Any Yrs
Job Posted On: 4 days ago
Application Ends: 3 weeks left to apply

Similar Jobs

Associate at PwC India (2 months ago). Skills: Data, Python, GCP +2
MS Engineer - Windows L1 at NTT DATA, Inc. (1 month ago)
Azure Senior Data Engineer - 7 years at ResourceTree Global Services Pvt Ltd (1 month ago). Skills: Data, ADF, DBT +2
DevOps Engineer at algoleap (4 days ago). Skills: EC2, S3, Linux +2
GEN AI at Virtusa (3 weeks ago)
IN_Senior Associate_Full Stack Developer_Data and Analytics_Advisory_Bangalore at PwC India (3 weeks ago)
Data Quality engineer at algoleap (2 months ago). Skills: Data, SQL, Jira +1
Senior Data Engineer at Uplers (3 weeks ago)
Cloud Infrastructure Engineer (Networking) at SITA (4 days ago)
iOS Developer - SwiftUI at Biztoso (2 months ago)