AWS, Python, PySpark, EMR, Apache Airflow - Data Platform Engineer - Remote
Pune, Maharashtra, India
1 month ago
Applicants: 0
Job Description
We're Hiring: Data Platform Engineer!

We are seeking an experienced Data Platform Engineer to design, build, and maintain scalable data infrastructure using AWS cloud services. The ideal candidate will have expertise in Python, PySpark, EMR, and Apache Airflow to develop robust data pipelines and analytics solutions that drive business insights.

Location: Pune, India
Work Mode: Work from anywhere
Role: AWS, Python, PySpark, EMR, Apache Airflow - Data Platform Engineer

What You'll Do
- Design and implement scalable data pipelines using Apache Airflow
- Build and optimize AWS EMR clusters for big data processing
- Develop data processing applications using Python and PySpark
- Create ETL workflows for data ingestion and transformation
- Monitor and troubleshoot data platform performance
- Collaborate with data scientists and analysts on data requirements

What We're Looking For
- 6+ years of experience in data engineering
- Strong expertise in AWS services (EMR, S3, Glue, Lambda)
- Proficiency in Python and PySpark for big data processing
- Hands-on experience with Apache Airflow for workflow orchestration
- Knowledge of data warehousing and ETL best practices
- Experience with SQL and NoSQL databases

Ready to make an impact? Apply now and let's grow together!
Additional Information
- Company Name
- RAPSYS TECHNOLOGIES PTE LTD
- Industry
- N/A
- Department
- N/A
- Role Category
- DevOps Engineer
- Job Role
- Mid-Senior level
- Education
- No Restriction
- Job Types
- Hybrid
- Gender
- No Restriction
- Notice Period
- Less Than 30 Days
- Year of Experience
- 1 - Any Yrs
- Job Posted On
- 1 month ago
- Application Ends
- N/A