Senior Data Engineer

Ranchi, India

2 weeks ago

Applicants: 0

Salary Not Disclosed

1 week left to apply

Job Description

Experience: 5+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time, Permanent Position
(Note: This is a requirement for one of Uplers' clients, 1digitalstack.ai.)

What do you need for this opportunity?

Must-have skills: Python, Java, Iceberg, Kafka, Apache Beam, Apache Flink, Apache Pulsar, Spark, Trino, OLAP, ClickHouse, StarRocks

1digitalstack.ai is looking for:

Role: Senior Data Engineer
Experience: 5-7 years
Location: Remote (India)

About 1DigitalStack.ai

1DigitalStack.ai combines AI and deep eCommerce data to help global brands grow faster on online marketplaces. Our platforms deliver advanced analytics, actionable intelligence, and media automation, enabling brands to optimize visibility, efficiency, and sales performance at scale. We partner with India's top consumer companies, including Unilever, Marico, Coca-Cola, Tata Consumer, Dabur, and Unicharm, across 125+ marketplaces globally. Backed by leading venture investors and powered by a 220+ member team, we're in our $5-10M growth journey, scaling rapidly across categories and geographies to redefine how brands win on digital shelves.

🔗 Check out more at www.1digitalstack.ai

About the Role

This is a high-impact, hands-on engineering role owning the core data systems that power our analytics, AI, and automation stack. You'll work closely with the CTO and Engineering Leads and independently manage large, high-throughput data pipelines that process millions of events.

Responsibilities:

- Build and maintain high-throughput, real-time data pipelines using Kafka/Pulsar with Spark, Flink, and distributed compute engines.
- Design fault-tolerant systems with zero-data-loss principles: checkpointing, replay logic, DLQs, deduplication, and back-pressure handling (see the sketch after this list).
- Implement data observability: quality checks, SLA alerts, anomaly detection, lineage, and metadata insights.
- Design and manage Iceberg-based lakehouse tables (Polaris/Gravitino catalogs, schema evolution, compaction).
- Build fast OLAP layers using ClickHouse / StarRocks.
- Model data across bronze → silver → gold layers for downstream teams.
- Migrate and modernize legacy pipelines into scalable, distributed workflows.
- Orchestrate ETL workloads using Airflow, dbt, Dagster, and SQLMesh.
- Optimize SQL transformations and distributed execution across Trino/Spark.
- Ensure strict security and governance across all data layers: access control, encryption, auditability.
- Collaborate with backend, analytics, and platform teams for seamless data delivery.
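The pipeline responsibilities above hinge on at-least-once delivery with an explicit dead-letter path. The sketch below is a minimal, hedged illustration of that pattern in Python using the kafka-python client; the broker address, topic names, and validate() check are assumptions for illustration only, not the client's actual stack or code.

# A minimal sketch of the dead-letter-queue / at-least-once pattern named above,
# using the kafka-python client. Broker address, topic names, and the validate()
# rule are illustrative assumptions, not the client's actual pipeline.
import json
from kafka import KafkaConsumer, KafkaProducer

BROKERS = "localhost:9092"       # assumed broker address
SOURCE_TOPIC = "raw-events"      # hypothetical source topic
DLQ_TOPIC = "raw-events-dlq"     # hypothetical dead-letter topic

consumer = KafkaConsumer(
    SOURCE_TOPIC,
    bootstrap_servers=BROKERS,
    group_id="pipeline-v1",
    enable_auto_commit=False,    # commit offsets only after each record is handled
)
producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

def validate(event: dict) -> bool:
    # Placeholder quality check; a real pipeline would enforce a schema contract.
    return "event_id" in event and "ts" in event

for record in consumer:
    try:
        event = json.loads(record.value)
        if not validate(event):
            raise ValueError("failed quality check")
        # ... transform and load into the lakehouse / OLAP layer here ...
    except Exception as exc:
        # Route bad records to the dead-letter topic instead of silently dropping them.
        producer.send(DLQ_TOPIC, {"error": str(exc),
                                  "payload": record.value.decode("utf-8", "replace")})
    finally:
        consumer.commit()        # explicit commit: nothing is acknowledged before it is handled

In production this same idea is usually expressed inside Spark or Flink jobs with checkpointing and replay rather than a single-threaded consumer loop; the sketch only shows the shape of the dead-letter and explicit-commit logic.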
Requirements

Core Technical Skills:

- Extremely strong SQL: window functions, query planning, optimization (see the short sketch at the end of this description).
- High comfort working with distributed and parallel workloads.
- Hands-on experience with several of these technologies: Apache Spark, Apache Flink, Trino, Apache Kafka, Apache Pulsar, Apache Beam.
- Advanced experience in Python (preferred) or Java (strong fundamentals).
- Strong understanding of Parquet, Apache Iceberg, and Iceberg REST catalogs (Polaris / Gravitino).
- Experience with OLAP databases: ClickHouse / StarRocks.
- Experience with semantic layers: Cube.js or similar.
- Strong experience building pipelines with Airflow, dbt, Dagster, SQLMesh.

Foundational Strengths:

- Solid understanding of data structures and algorithms: sorting, searching, memory models.
- Strong grasp of OLTP vs OLAP, indexing, query execution, and storage formats.
- Ability to debug distributed systems end-to-end (compute, storage, network, orchestration).
- Familiarity with cloud environments, containerization (Docker), and monitoring.
- Experience with large-scale data: high throughput, billions of rows, large parallel workloads.
- Awareness of cost optimization in compute and storage.

Good to Have:

- Experience with emerging stream processors: Dagster, RisingWave, Arroyo.
- Kubernetes, Terraform, or cloud-native big-data stacks.

Mindset:

- Strong ownership: takes systems from design → build → monitor.
- Self-driven, independent, and comfortable making technical decisions.
- High attention to reliability, data accuracy, and operational excellence.
- Naturally grows into broader technical responsibility as the platform scales.

Why 1DS is a great choice

- High-trust, no-politics culture: we value communication, ownership, and accountability.
- Collaborative, ego-free team: building together is in our DNA.
- Learning-first environment: mentorship, peer reviews, and exposure to real business impact.
- Modern stack + autonomy: your voice shapes how we build.
- VC-funded and scaling fast: 250+ strong, building from India for the world.

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers:

Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this one on the portal. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
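To ground the SQL expectations listed under Core Technical Skills above (window functions, deduplication, layered modelling), here is a small, self-contained sketch in Python using DuckDB; the orders table, its columns, and the transformations are illustrative assumptions, not the client's schema.

# A minimal illustration of the window-function style SQL the requirements describe.
# Runs locally with DuckDB; the table, columns, and logic are assumptions for illustration.
import duckdb

con = duckdb.connect()  # in-memory database

# Tiny stand-in for a bronze-layer table; real pipelines would read Parquet/Iceberg.
con.execute("""
    CREATE TABLE orders AS
    SELECT * FROM (VALUES
        ('A', 'o1', TIMESTAMP '2024-01-01 10:00:00', 120.0),
        ('A', 'o1', TIMESTAMP '2024-01-01 10:05:00', 120.0),  -- 'o1' repeated to simulate a duplicate event
        ('A', 'o2', TIMESTAMP '2024-01-02 09:00:00',  80.0),
        ('B', 'o3', TIMESTAMP '2024-01-01 12:00:00', 200.0)
    ) AS t(brand, order_id, event_ts, amount)
""")

# Deduplicate on order_id (keep the latest event) and compute a per-brand running total,
# the kind of silver-layer transformation a window function expresses cleanly.
rows = con.execute("""
    WITH deduped AS (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY event_ts DESC) AS rn
        FROM orders
    )
    SELECT brand,
           order_id,
           amount,
           SUM(amount) OVER (PARTITION BY brand ORDER BY event_ts) AS running_brand_total
    FROM deduped
    WHERE rn = 1
    ORDER BY brand, event_ts
""").fetchall()

for row in rows:
    print(row)

The same ROW_NUMBER and running-aggregate pattern carries over to Trino and Spark SQL, the execution engines the role lists.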

Additional Information

Company Name: Uplers
Industry: N/A
Department: N/A
Role Category: Data Engineer
Job Role: Mid-Senior level
Education: No Restriction
Job Types: Remote
Gender: No Restriction
Notice Period: Immediate Joiner
Years of Experience: 1 - Any Yrs
Job Posted On: 2 weeks ago
Application Ends: 1 week left to apply

Similar Jobs

i2V Systems - Computer Vision Developer - Image Processing/Machine Learning (2 weeks ago)

HSBC - Selenium Automation Testing/Consultant Specialist (2 weeks ago)

Turing - Remote Sr Software Developer - Python (2 weeks ago)

NCR Atleos - DevOps Engineer II (2 months ago)

EPAM Systems - Cloud AIOps Architect (2 months ago)

Sprinklr - Lead Software Engineer (2 weeks ago)

Zeloite - Software Development Engineer (SDE) Intern - Bangalore (On-site) (2 weeks ago)

Uplers - Python Backend Engineer (2 weeks ago)

Uplers - Python Backend Engineer (2 months ago)

Visteon Corporation - Generative AI Developer (2 weeks ago)