
Data Ops Engineer

Bahwan CyberTek

Bengaluru · Full-Time · 4–8 years
Posted 3 days ago · Apply by June 11, 2026

Job Description


Role description

POSITION SUMMARY


We are seeking a highly skilled and innovative DataOps Engineer to join our pharmaceutical R&D team. The DataOps Engineer is the technical backbone of the DataOps support function, responsible for investigating and resolving complex data pipeline failures, performance bottlenecks, and data quality anomalies. This role blends data engineering, software development, and systems administration to ensure the reliability and efficiency of our data platforms. The DataOps Engineer works closely with support and development teams to automate operational tasks, optimize system health, and facilitate seamless transitions of new features into production.


 


POSITION RESPONSIBILITIES 




  • Incident Investigation & Resolution:


    Triage, investigate, and resolve escalated incidents related to data pipelines, platform performance, and data quality.




  • Root Cause Analysis & Permanent Fixes:


    Perform RCA for recurring issues and implement permanent solutions to prevent recurrence.


  • Automation & Operational Efficiency:


    Develop and maintain automation scripts to reduce manual operational tasks and improve reliability.


  • Monitoring & Alerting Optimization:


    Manage and optimize monitoring, logging, and alerting configurations for data platforms.


  • Minor Enhancements & Bug Fixes:


    Handle minor platform enhancements and bug fixes that do not require significant architectural changes.


  • Collaboration & Feature Handoff:


    Collaborate with development pods to ensure smooth handoff of new features into production.


 


 


EDUCATION AND EXPERIENCE



·       Bachelor’s degree in Information Technology, Computer Science, or related field.


·       3+ years of experience in data engineering, software development, or systems administration.


·       Hands-on experience with cloud data platforms (Azure Data Factory, Azure Synapse, Databricks).


·       Experience with data pipeline orchestration and scheduling tools.


·       Familiarity with CI/CD and DevOps principles.


 


TECHNICAL SKILLS REQUIREMENTS



·       Strong proficiency in SQL and a scripting language (Python preferred).


·       Experience with cloud data platforms (Azure, Databricks).


·       Knowledge of data pipeline orchestration and scheduling tools.


·       Familiarity with monitoring, logging, and alerting tools.


·       Understanding of CI/CD and DevOps principles.


·       Excellent troubleshooting and problem-solving skills.


 


 
