Senior Data Engineer
SquareShift
Actively Reviewing Applications
Chennai, Tamil Nadu, India
Full-Time
INR 10–30 LPA
Posted 17 hours ago · Apply by June 29, 2026
Job Description
About the Role
We are looking for a Senior Data Engineer to join our growing data team. In this role, you will be a key contributor to our data infrastructure — designing and building the systems that power analytics, reporting, and business intelligence capabilities. You'll work across modern cloud data platforms, architect robust pipelines, and partner closely with analytics and engineering teams to ensure data is clean, reliable, and ready for consumption.
What You'll Do
Design, build, and maintain scalable data pipelines that ingest, transform, and deliver data across our cloud data ecosystem
Architect and implement data warehouse schemas optimized for performance, scalability, and analytical consumption
Prepare and model data to support BI tools and downstream reporting needs, ensuring data is accurate, well-documented, and easily accessible
Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into robust engineering solutions
Work across multiple cloud data warehouse platforms as needed, applying best practices for each environment
Contribute to data governance practices including lineage, cataloging, and documentation
Identify and resolve data quality issues proactively
Mentor junior team members and contribute to engineering best practices
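As a flavor of the proactive data-quality work described above, here is a minimal, hypothetical batch-validation sketch in Python (the function, column names, and record layout are illustrative assumptions; in practice checks like these would typically run as dbt tests or warehouse-side SQL):

```python
# Hypothetical pipeline-side data-quality checks (illustrative only; in
# production these would usually run as dbt tests or warehouse queries).

def run_quality_checks(rows, required_cols, unique_key):
    """Return a list of human-readable issues found in a batch of records."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        # Null checks on required columns.
        for col in required_cols:
            if row.get(col) is None:
                issues.append(f"row {i}: null in required column '{col}'")
        # Uniqueness check on the business key.
        key = row.get(unique_key)
        if key in seen:
            issues.append(f"row {i}: duplicate {unique_key}={key!r}")
        seen.add(key)
    return issues

batch = [
    {"order_id": 100, "amount": 25.0},
    {"order_id": 101, "amount": None},   # null amount -> flagged
    {"order_id": 100, "amount": 12.5},   # duplicate key -> flagged
]
problems = run_quality_checks(batch,
                              required_cols=["order_id", "amount"],
                              unique_key="order_id")
```

Catching issues like these at ingestion time, before data reaches BI tools, is what "resolving data quality issues proactively" looks like day to day.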
Requirements
What We're Looking For
5+ years of experience in data engineering
Hands-on, production-level experience with Snowflake (mandatory) — including schema design, performance optimization, and administration
Strong experience with at least one additional data warehouse platform; Google BigQuery experience is a strong plus
Solid understanding of data warehouse design principles — star schema, dimensional modeling, slowly changing dimensions, and similar patterns
Strong SQL skills — you are comfortable writing complex queries, optimizing for performance, and debugging data issues
Experience building and maintaining data pipelines using modern orchestration and transformation tools (e.g., dbt, Apache Airflow, Spark, or similar)
Proficiency in at least one programming language commonly used in data engineering (Python strongly preferred)
Expert knowledge of one or more relational or analytical databases (e.g., PostgreSQL, Redshift, SQL Server, or similar)
Experience working in cloud-native environments (AWS, GCP, or Azure)
Strong communication skills — ability to clearly explain technical concepts to both technical and non-technical stakeholders, and to collaborate effectively across teams
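To make the slowly-changing-dimension requirement above concrete, here is a minimal Type 2 merge sketched in Python (the record layout and field names are hypothetical; a real implementation would run as a SQL MERGE or a dbt snapshot inside Snowflake):

```python
from datetime import date

# Hypothetical SCD Type 2 merge (illustrative only). Each dimension row
# carries valid_from / valid_to dates and an is_current flag.

def scd2_merge(dim_rows, incoming, today):
    """Expire changed rows and append new current versions (SCD Type 2)."""
    by_key = {r["customer_id"]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for rec in incoming:
        cur = by_key.get(rec["customer_id"])
        if cur is None:
            # Brand-new key: insert as the current version.
            out.append({**rec, "valid_from": today,
                        "valid_to": None, "is_current": True})
        elif cur["city"] != rec["city"]:
            # Tracked attribute changed: close the old row, add a new one.
            cur["valid_to"] = today
            cur["is_current"] = False
            out.append({**rec, "valid_from": today,
                        "valid_to": None, "is_current": True})
        # Unchanged rows are left as-is.
    return out

dim = [{"customer_id": 1, "city": "Chennai", "valid_from": date(2024, 1, 1),
        "valid_to": None, "is_current": True}]
updates = [{"customer_id": 1, "city": "Bengaluru"},
           {"customer_id": 2, "city": "Mumbai"}]
merged = scd2_merge(dim, updates, date(2025, 6, 1))
```

The history-preserving shape of the output (expired row plus new current row per changed key) is exactly the pattern the dimensional-modeling bullet above refers to.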
Nice to Have
Experience with Google BigQuery, including working with large-scale datasets and optimizing query costs
Experience designing semantic data models or working within a semantic layer (e.g., LookML, dbt metrics, AtScale)
Hands-on experience with a BI tool such as Looker, Tableau, Power BI, or similar — particularly in structuring data to serve those tools effectively
Familiarity with data lakehouse patterns or platforms (e.g., Delta Lake, Apache Iceberg)
Experience with CI/CD practices applied to data infrastructure
Required Skills
PostgreSQL
Python
SQL Server
SQL
Data Modeling
AWS
Snowflake
Microsoft Azure
Google Cloud Platform
Power BI
Tableau
Looker
Data Governance
Amazon Redshift
CI/CD
Business Intelligence
Debugging
Cloud native
Apache Airflow
dbt
Performance optimization
Apache Iceberg
Schema design
Data platforms
Star schema
Data warehouse
Data pipelines
Google BigQuery
Lakehouse