AWS Data Engineer
Zorba AI
Actively reviewing applications
India, Karnataka, Bengaluru
Full-Time
On-site
INR 15–30 LPA
Posted 5 hours ago • Apply by June 8, 2026
Job Description
Job Summary
We are seeking an experienced AWS Data Engineer with 6+ years of experience in building scalable data pipelines, designing data architectures, and working with cloud-based data platforms. The ideal candidate will have strong expertise in AWS services, ETL development, data warehousing, and big data technologies to support advanced analytics and business intelligence initiatives.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using AWS cloud services.
- Build and optimize ETL/ELT processes to ingest data from multiple sources.
- Develop and manage data lakes and data warehouses on AWS.
- Work with large-scale datasets and ensure data quality, reliability, and performance.
- Implement data transformation and integration solutions using tools such as AWS Glue or Apache Spark.
- Collaborate with data scientists, analysts, and business teams to deliver high-quality datasets.
- Optimize queries and improve performance in Amazon Redshift, Athena, or other AWS data services.
- Implement data governance, security, and compliance standards.
- Monitor and troubleshoot data pipelines and workflows.
- Automate data processes using Python, SQL, and scripting tools.
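For a sense of the day-to-day work the responsibilities above describe, here is a minimal, hypothetical Python sketch of an extract-transform-load step with a basic data quality rule. All function names and the sample data are illustrative only, not part of any actual pipeline at this company:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop rows failing a simple quality check and normalize types."""
    cleaned = []
    for row in rows:
        # Data quality rule: skip rows with a missing or non-numeric amount.
        if not row.get("amount", "").strip().isdigit():
            continue
        cleaned.append({"id": row["id"], "amount": int(row["amount"])})
    return cleaned

def load(rows: list[dict]) -> dict[str, int]:
    """Aggregate into a warehouse-style shape: total amount per id."""
    totals: dict[str, int] = {}
    for row in rows:
        totals[row["id"]] = totals.get(row["id"], 0) + row["amount"]
    return totals

raw = "id,amount\na,10\nb,\na,5\nc,oops\n"
print(load(transform(extract(raw))))  # → {'a': 15}
```

In a production AWS setting, the extract step would typically read from Amazon S3, the transform would run in AWS Glue or Spark, and the load would target Amazon Redshift.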
Requirements
- 6+ years of experience in Data Engineering or Big Data development.
- Strong experience with AWS services, such as:
  - Amazon S3
  - AWS Glue
  - Amazon Redshift
  - AWS Lambda
  - Amazon EMR
  - Amazon Athena
  - AWS Step Functions
- Strong SQL and Python programming skills.
- Experience building ETL pipelines and data workflows.
- Knowledge of data modeling, data warehousing, and big data concepts.
- Experience with Apache Spark / PySpark.
- Familiarity with workflow orchestration tools like Apache Airflow.
- Understanding of CI/CD and DevOps practices.
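To illustrate what the workflow orchestration mentioned above involves, here is a hypothetical pure-Python sketch of running tasks in dependency order, mirroring how an Airflow DAG wires extract, transform, and load tasks. The task names and DAG are invented for illustration:

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract_orders": set(),
    "extract_users": set(),
    "transform": {"extract_orders", "extract_users"},
    "load_warehouse": {"transform"},
}

def run(dag: dict[str, set[str]]) -> list[str]:
    """Execute tasks in a valid topological order (a minimal scheduler)."""
    order = []
    for task in TopologicalSorter(dag).static_order():
        order.append(task)  # a real orchestrator would invoke the task here
    return order

print(run(dag))
```

Airflow adds scheduling, retries, and monitoring on top of this core idea, but the dependency graph is the same concept.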
Required Skills
Data Engineering
AWS (S3, Glue, Redshift, Lambda, EMR, Athena, Step Functions)
Python
SQL
Apache Spark / PySpark
ETL / ELT
Data Pipelines
Data Modeling
Data Warehousing
Data Transformation
Data Quality
Data Governance
Compliance
Apache Airflow / Workflow Orchestration
Big Data
CI/CD
DevOps
Scripting
Cloud Services