Senior Data Engineer
Egen
Actively Reviewing Applications
Hyderabad, Telangana, India
Full-Time
Remote
Posted 4 months ago • Apply by May 5, 2026
Job Description
Senior Data Engineer - GCP
Job Overview:
We are looking for a skilled and motivated Senior Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Experience Level:
4 to 8 years of relevant IT experience
Key Responsibilities:
Design, develop, test, and maintain scalable ETL data pipelines using Python.
Work extensively with Google Cloud Platform (GCP) services such as the following (a minimal pipeline sketch follows this list):
Dataflow for real-time and batch data processing
Cloud Functions for lightweight serverless compute
BigQuery for data warehousing and analytics
Cloud Composer for orchestration of data workflows (based on Apache Airflow)
Google Cloud Storage (GCS) for managing data at scale
IAM for access control and security
Cloud Run for containerized applications
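For illustration only, the sketch below shows a minimal batch pipeline of the kind this role describes: it reads newline-delimited JSON from a GCS bucket, applies a simple cleansing step, and writes to BigQuery via the Dataflow runner. The project, bucket, table, and schema names are placeholders and not part of any actual stack.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_and_clean(line):
        """Parse one JSON line and normalize the email field."""
        record = json.loads(line)
        record["email"] = record.get("email", "").strip().lower()
        return record


    options = PipelineOptions(
        runner="DataflowRunner",        # switch to "DirectRunner" to test locally
        project="example-project",      # placeholder project id
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/raw/events.json")
            | "ParseAndClean" >> beam.Map(parse_and_clean)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",   # placeholder dataset.table
                schema="event_id:STRING,email:STRING,created_at:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )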
Should have experience in the following areas (a brief FastAPI sketch follows this list):
API framework: Python FastAPI
Processing engine: Apache Spark
Messaging and streaming data processing: Kafka
Storage: MongoDB, Redis/Bigtable
Orchestration: Airflow
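As a hedged example of this stack, here is a minimal FastAPI endpoint backed by Redis; the route, key layout, and connection settings are assumptions made for the example only.

    import json

    import redis
    from fastapi import FastAPI, HTTPException

    app = FastAPI()
    cache = redis.Redis(host="localhost", port=6379, decode_responses=True)  # placeholder connection


    @app.get("/customers/{customer_id}")
    def get_customer(customer_id: str):
        """Serve a pre-computed customer record from Redis; return 404 if it is absent."""
        payload = cache.get(f"customer:{customer_id}")
        if payload is None:
            raise HTTPException(status_code=404, detail="customer not found")
        return json.loads(payload)

Assuming the file is saved as main.py, it can be served locally with: uvicorn main:app --reload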
Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
Implement and enforce data quality checks, validation rules, and monitoring (a sample check appears after this list).
Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
Document pipeline designs, data flow diagrams, and operational support procedures.
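As an illustration of the quality checks and validation queries mentioned above, the sketch below runs a simple null-rate check against a placeholder BigQuery table and fails the run when a threshold is exceeded; the table, column, and threshold are assumptions for the example.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project id

    # Flag the load if more than 1% of today's rows are missing an email.
    query = """
        SELECT COUNTIF(email IS NULL) / COUNT(*) AS null_rate
        FROM `example-project.analytics.events`
        WHERE DATE(created_at) = CURRENT_DATE()
    """
    null_rate = next(iter(client.query(query).result())).null_rate
    if null_rate > 0.01:
        raise ValueError(f"email null rate {null_rate:.2%} exceeds the 1% threshold")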
Required Skills:
4-8 years of hands-on experience in Python for backend or data engineering projects.
Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
Solid understanding of data pipeline architecture, data integration, and transformation techniques.
Experience working with version control systems such as Git/GitHub and knowledge of CI/CD practices.
Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and GCP Composer DAGs (a minimal DAG sketch follows this list).
Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Experience in data migrations from on-premises data sources to cloud platforms.
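Below is a minimal sketch of an Airflow DAG of the kind that would run on Cloud Composer, assuming Airflow 2.4 or later; the DAG id, schedule, and task callables are placeholders rather than a real workflow.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        print("pull raw files from the source system")  # placeholder extract step


    def load():
        print("load cleansed data into BigQuery")  # placeholder load step


    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",   # "schedule" requires Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task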
Good to Have (Optional Skills):
Experience working with Snowflake cloud data platform.
Experience with deployments on GKE and Cloud Run.
Hands-on knowledge of Databricks for big data processing and analytics.
Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
Additional Details:
Excellent problem-solving and analytical skills.
Strong communication skills and ability to collaborate in a team environment.