Lead Engineer (Databricks)
Hyderabad, Telangana, India
2 months ago
Applicants: 0
3 weeks left to apply
Job Description
- Experience needed: 6-9 years
- Type: Full-Time
- Mode: WFO (5 days in office; no hybrid/remote option)
- Shift: IST
- Location: Hyderabad / Pune / Coimbatore
- Notice Period: Immediate to 15 days, or currently serving notice with under 30 days remaining

About the Role

We are looking for a highly skilled Lead Databricks Engineer to lead the design, development, and optimization of modern data engineering and analytics solutions on the Databricks Lakehouse platform. The ideal candidate will have strong expertise in data architecture, ETL/ELT pipeline development, and performance optimization using Databricks (SQL, PySpark, Delta Lake, MLflow), along with leadership experience guiding teams through solution design, implementation, and delivery.

Key Responsibilities

1. Databricks Project Leadership
- Lead and manage end-to-end Databricks implementation projects, from requirement gathering, architecture design, and environment setup through deployment and operationalization.
- Serve as the primary technical point of contact for stakeholders, data engineering teams, and cloud platform teams.
- Ensure project alignment with organizational goals, data governance, and performance standards.

2. Architecture & Solution Design
- Design scalable, secure, and high-performing data pipelines leveraging Databricks, Delta Lake, and cloud-native services (Azure, AWS, or GCP).
- Define robust data ingestion, transformation, and consumption architectures integrating multiple data sources (databases, APIs, streams, files, etc.).
- Establish CI/CD pipelines for Databricks notebooks and jobs (using Git, Azure DevOps, or similar tools).

3. Technical Implementation & Optimization
- Develop and maintain ETL/ELT pipelines using PySpark, SQL, and Databricks workflows.
- Configure clusters, optimize compute utilization, and fine-tune query performance for efficiency and cost savings.
- Implement data quality frameworks, monitoring solutions, and automated error-handling logic.

4. Data Governance & Operations
- Set up role-based access control (RBAC), security policies, and data lineage using Unity Catalog.
- Collaborate with CloudOps and DevOps teams on environment setup, monitoring, and CI/CD integration.
- Drive documentation, technical training, and best practices across engineering teams.

Requirements

Required Skills & Experience
- 6-9 years of total experience in data engineering, data platform, or big data projects.
- Minimum 5 years of hands-on experience with Databricks (SQL, PySpark, Delta Lake, MLflow).
- Strong expertise in ETL/ELT pipeline design, data modeling, and data warehousing concepts.
- Experience with Azure Data Factory, AWS Glue, or similar orchestration tools.
- Proficiency in Python, SQL, and Spark optimization techniques.
- Experience implementing DevOps practices (CI/CD, Git, Databricks Repos).
- Working knowledge of data governance frameworks, Unity Catalog, and RBAC.
- Proven ability to lead technical teams and deliver complex projects successfully.

Preferred Qualifications
- Databricks Certified Data Engineer Professional or Solution Architect certification.
- Exposure to AI/ML workflows and model management using MLflow.
- Experience integrating with Power BI, Tableau, or Looker for data visualization.
- Strong communication and presentation skills, with experience in client-facing engagements.

Soft Skills
- Excellent problem-solving, analytical, and leadership abilities.
- Strong collaboration and mentoring mindset.
- Ability to thrive in fast-paced, cross-functional environments.
- Clear and structured documentation and reporting skills.
Additional Information
- Company Name
- Rubis Software Solutions Pvt Ltd
- Industry
- N/A
- Department
- N/A
- Role Category
- N/A
- Job Role
- Mid-Senior level
- Education
- No Restriction
- Job Types
- On-site
- Gender
- No Restriction
- Notice Period
- Less Than 30 Days
- Year of Experience
- 1 - Any Yrs