Lead Applications Developer – GCP, BigQuery, Pub/Sub, Kafka, GKE, Java/Python/C#
UPS · Actively Reviewing Applications
India, Tamil Nadu, Chennai
Full-Time
On-site
Posted 3 weeks ago • Apply by May 29, 2026
Job Description
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Summary:
We are seeking a GCP-focused Data Engineer to build scalable, high‑quality data pipelines supporting our Data Maturity initiative for Logistics/Parcel Services. The ideal candidate has strong experience in GCP data services, data modeling, data quality frameworks, and understands logistics domain data such as shipment tracking, routing, and warehouse operations.
Key Responsibilities
Core Engineering (All Levels)
- Pipeline Development: Design and develop scalable ETL/ELT pipelines using BigQuery, Pub/Sub, and Dataflow/Dataproc.
- Microservices: Build and deploy APIs using Python/Java/C# to integrate enterprise and external logistics systems.
- Orchestration: Orchestrate workloads via Composer (Airflow) or GKE using Docker and Kubernetes.
- Data Quality: Implement validation checks, lineage tracking, and monitoring for pipeline SLAs (freshness, latency).
- Modeling: Model logistics and supply chain data in BigQuery for analytics and operational insights.
- DataOps: Apply CI/CD, automated testing, and versioning best practices.
- System Design: Take ownership of end-to-end technical design for complex data modules.
- Mentorship: Actively mentor junior engineers and conduct rigorous code reviews to ensure high engineering standards.
- Best Practices: Establish and document DataOps standards and reusable patterns for the team.
- Pod Leadership: Act as the technical lead of the data pod, ensuring sprint goals are met and unblocking the team.
- Architecture: Define the high-level architecture and long-term technical roadmap for the logistics data platform.
- Stakeholder Management: Partner with business leaders to translate complex logistics requirements into technical specifications.
- Negotiation: Manage requirements scoping and prioritize backlogs by balancing technical debt with business value.
- Coaching: Drive the professional growth of the entire engineering team through structured coaching and performance feedback.
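To illustrate the data-quality responsibility above (monitoring pipeline SLAs for freshness and latency), here is a minimal Python sketch. The thresholds, function name, and `SlaReport` type are all hypothetical, not part of UPS's actual tooling; a real implementation would read thresholds from each pipeline's data contract and alert via monitoring infrastructure.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical SLA thresholds; real values would come from the pipeline's data contract.
FRESHNESS_SLA = timedelta(minutes=30)  # newest loaded record must be at most this old
LATENCY_SLA = timedelta(minutes=5)     # end-to-end event-processing delay budget


@dataclass
class SlaReport:
    fresh: bool            # did the last load land within the freshness window?
    within_latency: bool   # was the event processed within the latency budget?


def check_pipeline_sla(last_loaded_at: datetime,
                       event_time: datetime,
                       processed_time: datetime,
                       now: Optional[datetime] = None) -> SlaReport:
    """Evaluate freshness and latency SLAs for a single pipeline run."""
    now = now or datetime.now(timezone.utc)
    return SlaReport(
        fresh=(now - last_loaded_at) <= FRESHNESS_SLA,
        within_latency=(processed_time - event_time) <= LATENCY_SLA,
    )
```

A check like this would typically run as a scheduled task (e.g., a Composer/Airflow sensor) and page the on-call engineer when either flag is false.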
Relevant Experience:
- Strong hands‑on experience with GCP BigQuery, Pub/Sub, GCS, Dataflow/Dataproc.
- Proficiency in Python/Java/C#, RESTful APIs, and microservice development.
- Experience with Kafka for event-driven ingestion.
- Strong SQL and experience with data modeling.
- Expertise in Docker/Kubernetes (GKE) and CI/CD tools (Cloud Build, GitHub Actions, or ADO).
- Experience implementing Data Quality, Metadata management, and Data Governance frameworks.
- Experience with Terraform and Cloud Composer (Airflow).
- Experience with Azure Databricks, Delta Lake, ADLS, and Azure Data Factory.
- Experience in Knowledge Graph Engineering using Neo4j and/or Stardog.
- Familiarity with Data Governance or Cataloging tools (e.g., Informatica AXON).
- Logistics domain experience.
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
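Relating to the Kafka event-driven ingestion requirement above: at-least-once delivery means consumers must deduplicate replayed events. A hedged pure-Python sketch of that idea follows; the `event_id` field name and in-memory seen-set are illustrative only, and a production consumer would deduplicate against a durable store (e.g., a BigQuery MERGE on the event key).

```python
from typing import Dict, Iterable, List, Set


def dedupe_events(events: Iterable[Dict], key: str = "event_id") -> List[Dict]:
    """Keep only the first occurrence of each event id.

    Turns at-least-once delivery into effectively-once processing.
    Assumes every event dict carries a unique identifier under `key`.
    """
    seen: Set = set()
    out: List[Dict] = []
    for ev in events:
        eid = ev[key]
        if eid not in seen:       # drop redelivered duplicates
            seen.add(eid)
            out.append(ev)
    return out
```

The same first-write-wins policy is what an idempotent sink (keyed upsert) enforces when the consumer itself cannot hold state.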
Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.