
Senior Foundry/PySpark Engineer

Actively reviewing applications

CloudHire

Hyderabad, Telangana, India | Full-Time | On-site

Posted 15 hours ago | Apply by June 10, 2026

Job Description

Position Overview

We are seeking a highly skilled Senior PySpark Engineer with strong experience in distributed data processing and Snowflake. The engineer will join the MediCom Support Unit for a major global pharmaceutical client and will primarily handle Tier 1 support, pipeline monitoring, troubleshooting, and minor development tasks. This role requires someone with strong analytical skills, good communication, and the ability to support a large-scale data engineering ecosystem. Experience in Palantir Foundry is a nice-to-have, but not mandatory.


About Indigrators

Indigrators is a trusted technology partner delivering end-to-end solutions across software product development, enterprise applications, and global capability centers. We specialize in building scalable, high-performance systems and providing consulting, implementation, and support for enterprise solutions. Our GCC-as-a-Service model enables organizations to establish and operate dedicated technology and shared services hubs in India with agility and cost efficiency. With a strong focus on innovation and operational excellence, Indigrators combines deep domain expertise with advanced technologies to help businesses accelerate digital transformation. Our culture emphasizes transparency, collaboration, and empowering teams to deliver impactful results for clients worldwide.


Required Skills

• 5+ years of hands-on experience with PySpark

• Strong knowledge of distributed data processing

• Experience with Snowflake (querying, data modeling basics, troubleshooting)

• Strong SQL skills

• Experience with data pipelines and ETL/ELT processes

• Familiarity with Git/GitHub or similar version control tools


Key Responsibilities

• Provide Tier 1 support for PySpark-based data pipelines

• Diagnose and resolve data pipeline failures and transformation issues

• Perform minor development tasks, enhancements, and bug fixes

• Execute SQL/Snowflake queries for validation, monitoring, and debugging

• Collaborate with global teams and participate in daily stand-ups

• Assist in onboarding and knowledge transfer sessions with the existing 15-member development team

• Maintain runbooks, documentation, and operational procedures

• Ensure SLAs are consistently met for incident and request handling 


Qualifications

Any graduate


Technical Requirements

PySpark, distributed data processing, Snowflake, SQL, data pipelines, ETL, ELT, Git, GitHub

