
Cloud Data Engineer

Princeton IT Services, Inc

Gudivada, Andhra Pradesh, India | Full-Time | On-site

Posted 2 days ago. Apply by May 20, 2026.

Job Description

Job Title: Cloud Data Engineer

Work Location: Gudivada, Andhra Pradesh

Working Hours: Rotational UK & US shifts (EST hours). Must be flexible to work night shifts based on project requirements.

Website: https://princetonits.com

About Us: We are a leading technology company committed to delivering innovative software solutions. We are seeking a highly skilled Cloud Data Engineer to join our team and contribute to the development of cutting-edge applications.


Role Overview

We are looking for a hands-on Data Engineer who can quickly understand and take ownership of our AWS + Databricks data platform. The role requires strong technical execution: building and managing data pipelines, collaborating with product and business teams, supporting platform operations, and delivering data products.

The ideal candidate is a fast learner, execution-focused, and comfortable working across technical and functional stakeholders.

Key Responsibilities

  • Own and manage the AWS-based data platform, including services such as S3, Athena, Glue, Redshift, EMR, and related components.
  • Design, develop, and maintain scalable data pipelines and workflows using Databricks.
  • Contribute to data platform architecture, including framework enhancements, upgrades, and performance improvements.
  • Design, implement, and manage Delta Lake architecture to support reliable, scalable, and high-performance data processing.
  • Monitor, troubleshoot, and optimize data pipelines, Spark jobs, and overall platform stability and performance.
  • Collaborate with Technical Product Owners, Data Product Owners, and Business stakeholders to translate functional requirements into robust, scalable technical solutions.
  • Lead and support data migration, transformation, and modernization initiatives across the platform.
  • Partner with Architecture and DevOps teams to implement solutions using CI/CD pipelines, infrastructure automation, and best deployment practices.
  • Ensure high standards of data quality, governance, reliability, and cost optimization across all delivered data products.

Required Skills & Experience

  • Strong hands-on experience with Databricks
  • Solid experience with AWS data services (S3, Athena, Glue, EMR, Redshift, RDS, etc.)
  • Experience building and managing Data Lakes and ETL pipelines
  • Proficiency in Python
  • Experience implementing Delta Lake
  • Ability to independently execute framework changes
  • Ability to work seamlessly with cross-functional teams to deliver data products
  • Strong problem-solving and communication skills

Good to Have

  • Knowledge of SAP, Oracle, or HANA
  • Experience working with enterprise/finance data platforms
  • Exposure to DevOps practices and Infrastructure as Code
  • AWS or Databricks certifications
