Software Engineer II- Python, Databricks, AWS, Spark, IDMC
JPMorgan Chase
Noida
Full-Time
Posted 3 days ago • Apply by June 11, 2026
Job Description
We have an exciting opportunity for you to advance your data engineering career and make a meaningful impact by joining our innovative team.
Job Summary
As a Data Engineer II at JPMorgan Chase within the Corporate Data and Analytics Service team, you design and deliver trusted, scalable data solutions using modern technologies. You collaborate with us to drive critical technology initiatives that support business objectives and foster a culture of growth and inclusion.
Job Responsibilities
- Design, develop, and maintain scalable data pipelines using Python and Spark
- Build and optimize ETL workflows in Databricks, leveraging Delta Lake features
- Integrate and manage data across AWS services such as S3, Lambda, and EKS
- Collaborate with data analysts and business stakeholders to deliver solutions
- Ensure data quality, integrity, and security across engineering processes
- Monitor, troubleshoot, and optimize pipeline performance and resource usage
- Document data flows, architecture, and processes for internal knowledge sharing
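The data-quality responsibility above can be illustrated with a minimal sketch. All function, field, and record names here are hypothetical, not part of the posting; in practice, checks like this would typically run inside a Spark or Databricks job, but plain Python keeps the sketch self-contained:

```python
# Hypothetical data-quality gate for a batch of records.
# Records missing any required field are rejected with a reason,
# so downstream loads only receive complete rows.

def validate_records(records, required_fields):
    """Split records into valid and rejected lists based on required fields."""
    valid, rejected = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        if missing:
            rejected.append({"record": rec, "missing": missing})
        else:
            valid.append(rec)
    return valid, rejected


batch = [
    {"id": 1, "amount": 120.0, "currency": "USD"},
    {"id": 2, "amount": None, "currency": "USD"},  # fails the quality check
]
good, bad = validate_records(batch, ["id", "amount", "currency"])
```

The same split-and-quarantine pattern scales to distributed data: in PySpark it would be expressed as two filtered DataFrames, with the rejected rows written to a quarantine table for troubleshooting.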
Required Qualifications, Capabilities, and Skills
- Formal training or certification on software engineering concepts and 2+ years of applied experience
- Proficient in Python for data processing and automation
- Strong experience with Apache Spark (PySpark) for distributed data processing
- Hands-on experience with Databricks platform and Delta Lake
- Solid understanding of AWS cloud services, including S3, Lambda, EKS, and Aurora DB
- Experience with ETL design, data modeling, and data warehousing concepts
- Familiarity with CI/CD tools and practices for data engineering
- Familiarity with modern front-end technologies
- Exposure to cloud technologies
- Experience with orchestration tools such as Airflow
- Experience with REST APIs and data integration
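To illustrate the REST API and data integration skill listed above, here is a small, hypothetical sketch that flattens a nested JSON API response into rows ready for a warehouse load. The payload shape, field names, and function are assumptions for illustration only:

```python
import json

# Hypothetical REST API response: one page of results plus a pagination cursor.
payload = json.loads("""
{
  "next_cursor": "abc123",
  "results": [
    {"user": {"id": 7, "name": "Ada"}, "score": 0.91},
    {"user": {"id": 8, "name": "Lin"}, "score": 0.87}
  ]
}
""")

def flatten(page):
    """Turn nested API results into flat rows suitable for a tabular load."""
    return [
        {"user_id": r["user"]["id"],
         "user_name": r["user"]["name"],
         "score": r["score"]}
        for r in page["results"]
    ]

rows = flatten(payload)
```

In a real integration, the `next_cursor` value would drive repeated requests until the API reports no further pages, and the flattened rows would land in S3 or a Delta table.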