
Data Engineer - SnapLogic, Python, Airflow, Bitbucket

Actively reviewing applications

RAPSYS TECHNOLOGIES PTE LTD

Bengaluru · Contract · 4–8 years
Posted 5 days ago · Apply by June 11, 2026

Job Description

🌟 We're Hiring: Data Engineer! 🌟

We are seeking an experienced Data Engineer with expertise in Snaplogic, Python, Airflow, and Bitbucket to join our dynamic team. The ideal candidate will have a strong background in data integration and pipeline development, along with a passion for building efficient data solutions.

📍 Location: Chennai, India

⏰ Work Mode: Work from anywhere

💼 Role: Data Engineer - SnapLogic, Python, Airflow, Bitbucket

What You'll Do


Seeking a skilled Data Engineer with hands-on experience in SnapLogic, Python, and Apache Airflow to design, develop, and maintain robust data pipelines and integration workflows. The ideal candidate will work closely with data engineering, analytics, and platform teams to ensure seamless data movement and transformation across systems.

Key Responsibilities

  • Design and implement data integration workflows using SnapLogic for ETL/ELT processes.
  • Orchestrate and schedule workflows using Apache Airflow, ensuring reliability and scalability.
  • Develop and maintain Python scripts for data transformation, automation, and custom connectors.
  • Collaborate with business and technical teams to gather requirements and deliver optimized solutions.
  • Monitor, troubleshoot, and optimize data pipelines for performance and efficiency.
  • Ensure compliance with security, governance, and data quality standards.
  • Document processes, workflows, and best practices for ongoing support and knowledge sharing.
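The responsibilities above revolve around the extract-transform-load pattern. As a rough, stdlib-only sketch of that flow (all function names and sample data here are hypothetical; in practice the extraction and loading steps would be SnapLogic pipelines or Pandas code, not hand-rolled parsing):

```python
import csv
import io
import json

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize names, cast amounts to floats,
    and drop rows with missing amounts."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]
    ]

def load(rows):
    """Load: serialize to JSON; a real pipeline would write to a
    warehouse such as Snowflake instead."""
    return json.dumps(rows)

raw = "name,amount\n alice ,10.5\nBOB,3\ncarol,\n"
print(load(transform(extract(raw))))
```

The three-stage split keeps each step independently testable, which is the property that makes such pipelines easy to monitor and troubleshoot.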

Required Skills & Qualifications

  • SnapLogic: Strong experience in building pipelines, managing snaps, and integrating with cloud/on-prem systems.
  • Airflow: Expertise in DAG creation, scheduling, and managing dependencies.
  • Python: Proficiency in scripting, data manipulation (Pandas), and API integrations.
  • Good understanding of ETL concepts, data modeling, and integration patterns.
  • Good understanding of the Linux operating system.
  • Familiarity with cloud platforms such as AWS and related services (EC2, EMR, CloudWatch).
  • Knowledge of version control (Bitbucket) and CI/CD practices.
  • Knowledge of the Snowflake data platform.
  • Strong problem-solving skills and ability to work in a fast-paced environment.
  • Excellent communication and collaboration skills.
  • Ability to manage multiple priorities and deliver within deadlines.
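The Airflow skills listed above center on expressing a pipeline as a DAG with explicit task dependencies. A minimal stdlib-only sketch of that underlying idea (task names are made up; real orchestration would use Airflow's `DAG` and operator classes, which also handle scheduling and retries):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: "extract" must run before "transform" and
# "quality_check", and both of those must run before "load".
dag = {
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

def run_order(graph):
    """Return one valid execution order respecting dependencies --
    the same topological ordering an orchestrator computes."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(dag)
print(order)  # "extract" comes first, "load" comes last
```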

Ready to make an impact? 🚀 Apply now and let's grow together!
