
Senior Data Engineer Blue Ash, OH


Jobs via Dice

Blue Ash, OH Contract (C2C) 4–8 years
Posted 3 days ago Apply by June 11, 2026

Job Description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, V-CENTRIX-US LLC, is seeking the following. Apply via Dice today!

We are seeking a Senior Data Engineer with deep expertise in Azure Databricks, Spark, Python, SQL, and distributed data pipeline optimization. This is a fully onsite C2C contract role based in Blue Ash, OH.

Role: Senior Data Engineer

Type: Contract (C2C)

Location & Onsite: Blue Ash, OH (5 days onsite)

Visa: Any visa acceptable

Interview Process: In-person onsite

Team Details: 10 team members; work independently with pair programming sessions throughout the day

Top Skills: Azure Databricks, Python, Spark

Soft Skills: Problem-solving, attention to detail, ability to work independently and collaboratively in an agile team

VERY IMPORTANT DETAILS:

  • Work location must be local
  • Interviews will be in person, onsite
  • Candidates must be willing to come onsite for their interview and work fully onsite with the team
  • Prescreening includes 3 video questions; candidates must answer using their own knowledge and experience, no AI-generated responses
  • Include a link to the candidate's LinkedIn profile with the submittal

Requirements:

  • Senior experience as a Data Engineer
  • Strong experience with Azure Databricks, Spark, Python
  • Strong SQL skills and database experience
  • Experience monitoring and optimizing Databricks clusters or workflows
  • Experience working with Azure data services and integrating them with Databricks and enterprise data platforms
  • Experience building and optimizing distributed data processing systems (partitions, joins, shuffles, cluster performance)
  • Experience with data pipeline development using tools such as Delta Live Tables (DLT) or Databricks SQL
  • Experience with orchestration, messaging services, or serverless components (e.g., Azure Functions)
  • Experience with version control and CI/CD tools such as GitHub and GitHub Actions
  • Experience using Terraform for cloud infrastructure provisioning
  • Familiarity with SDLC and modern data engineering best practices
  • Strong organizational skills with the ability to manage multiple priorities and work independently

Nice to Have:

  • Experience with data governance, lineage, or cataloging tools (Purview, Unity Catalog)

Responsibilities:

  • Analyze, design, and develop enterprise data solutions using Azure, Databricks, Spark, Python, SQL
  • Develop, optimize, and maintain Spark/PySpark data pipelines, addressing performance issues such as data skew, partitioning, caching, and shuffle optimization
  • Build and support Delta Lake tables and data models for analytical and operational use cases
  • Apply reusable design patterns, data standards, and architectural guidelines, including collaboration with 84.51° when needed
  • Use Terraform to provision and manage cloud and Databricks resources (Infrastructure as Code)
  • Implement and maintain CI/CD workflows using GitHub and GitHub Actions
  • Manage Git-based workflows for Databricks notebooks, jobs, and data engineering artifacts
  • Troubleshoot failures and improve reliability across Databricks jobs, clusters, and data pipelines
  • Apply cloud computing skills to deploy fixes, upgrades, and enhancements in Azure environments
  • Collaborate with engineering teams to enhance tools, systems, development processes, and data security
  • Participate in the development and communication of data strategy, standards, and roadmaps
  • Create architectural diagrams, interface specifications, and design documentation
  • Promote reuse of data assets and contribute to enterprise data catalog practices
  • Provide timely support and communication to stakeholders and end users
  • Mentor team members on data engineering best practices and emerging technologies
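To illustrate the kind of pipeline tuning the responsibilities describe (data skew, partitioning, shuffle optimization), here is a small hypothetical sketch of key salting, a common skew mitigation. It is plain Python rather than PySpark so it runs anywhere; the partitioning function stands in for a Spark shuffle, and all names (`partition_for`, `salt`, the key values) are illustrative, not from the posting.

```python
import random
import zlib
from collections import Counter

NUM_PARTITIONS = 8
SALT_BUCKETS = 8

def partition_for(key: str) -> int:
    """Deterministic hash partitioning, standing in for a Spark shuffle."""
    return zlib.crc32(key.encode()) % NUM_PARTITIONS

def salt(key: str, rng: random.Random) -> str:
    """Append a random bucket id so one hot key becomes several sub-keys."""
    return f"{key}#{rng.randrange(SALT_BUCKETS)}"

# A skewed dataset: one hot key plus a long tail.
rows = ["hot_key"] * 9_000 + [f"key_{i}" for i in range(1_000)]

rng = random.Random(42)

plain = Counter(partition_for(k) for k in rows)
salted = Counter(partition_for(salt(k, rng)) for k in rows)

# Without salting, all 9,000 hot rows land in one partition;
# with salting they spread across up to SALT_BUCKETS partitions.
print("max partition load, plain :", max(plain.values()))
print("max partition load, salted:", max(salted.values()))
```

In a real Spark job the salted keys are aggregated first, then the salt suffix is stripped and a second, much smaller aggregation combines the partial results.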

This is an excellent opportunity for a highly technical, senior candidate to join a data engineering team and work on cutting-edge Azure Databricks and Spark solutions.
