Data Engineer - Data Platforms - Google
IBM
Actively Reviewing Applications
Gurgaon, Haryana, India
Full-Time
On-site
Posted 12 hours ago • Apply by June 15, 2026
Job Description
Introduction
A career in IBM Consulting is built on long-term client relationships and close collaboration worldwide. You’ll work with leading companies across industries, helping them shape their hybrid cloud and AI journeys. With support from our strategic partners, robust IBM technology, and Red Hat, you’ll have the tools to drive meaningful change and accelerate client impact. At IBM Consulting, curiosity fuels success. You’ll be encouraged to challenge the norm, explore new ideas, and create innovative solutions that deliver real results. Our culture of growth and empathy focuses on your long-term career development while valuing your unique skills and experiences.
Your Role And Responsibilities
As a Data Engineer specializing in Google's data platforms, you will design, build, and maintain data engineering solutions on the Google Cloud ecosystem. This role requires expertise in using Google Cloud services for batch and real-time data pipelines, data migration, and data layer design. Your primary responsibilities will include:
- Design Data Pipelines: Design and develop batch and real-time data pipelines for data warehouses and data lakes using Google Cloud services such as Dataproc, Dataflow, Pub/Sub, BigQuery, and Bigtable.
- Develop Data Engineering Solutions: Use Google Cloud Storage, Bigtable, BigQuery, Dataproc with Spark and Hadoop, and Dataflow with Apache Beam or Python to build and maintain data engineering solutions.
- Manage Data Platforms: Schedule and manage the data platform using Cloud Scheduler and Cloud Composer (Airflow), ensuring efficient data pipeline operations.
- Implement Data Migration: Develop and implement data migration solutions using Google Cloud services, ensuring seamless data transfer between systems.
- Optimize the Data Layer: Design and optimize the data layer using services such as BigQuery, Bigtable, and Cloud Spanner, ensuring efficient data storage and retrieval.
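The batch-pipeline work described above generally follows a map / filter / group-and-aggregate shape. As an illustrative sketch only, here is that shape in pure Python; a production pipeline would express the same stages as Apache Beam transforms (`beam.Map`, `beam.Filter`, `beam.GroupByKey`) running on Dataflow, and the record fields below are hypothetical.

```python
from collections import defaultdict

def parse(line):
    """Map stage: raw CSV line -> (country, revenue) pair. Fields are illustrative."""
    country, revenue = line.split(",")
    return country, float(revenue)

def run_pipeline(lines):
    """Filter out non-positive revenue, then group and sum by key.

    Mirrors a Beam Filter followed by GroupByKey + CombinePerKey.
    """
    totals = defaultdict(float)
    for country, revenue in (parse(line) for line in lines):
        if revenue > 0:                 # Filter stage
            totals[country] += revenue  # Group-and-aggregate stage
    return dict(totals)

sample = ["IN,120.5", "US,99.0", "IN,-3.0", "US,1.0"]
print(run_pipeline(sample))  # {'IN': 120.5, 'US': 100.0}
```

The same three stages scale from this in-memory toy to a distributed Dataflow job because each stage is a pure, per-record (or per-key) function.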
Required Technical And Professional Expertise
- Google Cloud Ecosystem Expertise: Exposure to designing, building, and maintaining data engineering solutions on Google Cloud, including Dataproc, Dataflow, Pub/Sub, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, and AlloyDB.
- Data Pipeline Development: Experience developing and managing batch and real-time data pipelines for data warehouses and data lakes using Google Cloud services and open-source technologies such as Apache Airflow, dbt, Spark/Python, or Spark/Scala.
- Google Cloud Services Proficiency: Experience with Google Cloud Storage, Bigtable, BigQuery, Dataproc with Spark and Hadoop, and Dataflow with Apache Beam or Python.
- Data Platform Management: Exposure to scheduling and managing the data platform with Cloud Scheduler and Cloud Composer (Airflow) for efficient data pipeline operations.
- Data Layer Design: Experience designing the data layer with services such as BigQuery, Bigtable, and Cloud Spanner for efficient data storage and retrieval.
- Data Migration Solutions: Experience developing and implementing data migration solutions on Google Cloud, ensuring seamless data transfer between systems.
Required Skills
- Languages and frameworks: Python, Scala, Spark, Hadoop, Apache Beam, Apache Airflow, dbt
- Google Cloud services: BigQuery, Bigtable, Cloud Spanner, Dataproc, Dataflow, Google Cloud Storage, Cloud Composer, Cloud Scheduler
- Data engineering: batch and real-time data pipelines, data warehouse and data lake design, data migration, data platform management, data layer design, data storage and retrieval