
Data Pipeline Engineer (Airflow)


Wissen Technology

Maharashtra, India | Full-Time | On-site

Posted 4 hours ago | Apply by June 10, 2026

Job Description

Wissen Technology is hiring for the role of Data Pipeline Engineer (Airflow).



About Wissen Technology:

At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset—ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.

Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don’t just meet expectations—we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives. We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more.

Wissen stands apart through its unique delivery models. Our outcome-based projects ensure predictable costs and timelines, while our agile pods provide clients with the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact—the first time, every time.


Job Summary: We are looking for a Data Pipeline Engineer to design, build, and operate scalable, reliable data pipelines for enterprise data platforms. This is a hands-on, individual contributor role requiring strong working knowledge of the tools listed below.


Experience: 5–10 Years

Location: Pune

Mode of Work: Full-time


Key Responsibilities:

  • Build and maintain data transformation pipelines using dbt/Spark
  • Develop and optimize large-scale, CPU-intensive data processing using Apache Spark/Dremio
  • Orchestrate workflows using Airflow and/or Dagster
  • Implement data quality checks, testing, and monitoring for pipelines; exposure to managing metadata, cataloging, and lineage is a plus
  • Support schema evolution, backfills, and incremental processing
  • Ensure pipelines meet SLAs for freshness, reliability, and performance
  • AND/OR (in the case of strong Dremio expertise, all of the above become good-to-have/optional)
  • Expertise/working knowledge of Dremio (semantic layer, virtual datasets, Reflections)



Requirements:

  • Strong hands-on experience with dbt and Apache Spark
  • Experience with Dremio/Trino or similar lakehouse query engines
  • Airflow and/or Dagster
  • Understanding of data catalogs and lineage (e.g., OpenLineage, DataHub, Apache Polaris)
  • Proficiency in Python
  • Experience with Git-based development and CI/CD


Good To Have Skills:

  • Open table formats (Apache Iceberg), Apache Arrow
  • CDC-based analytics pipelines
  • Cloud platforms (AWS)
  • Kubernetes-based data platforms



Wissen Sites:

Website: www.wissen.com

LinkedIn: https://www.linkedin.com/company/wissen-technology

Wissen Leadership: https://www.wissen.com/company/leadership-team/

Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All

Wissen Thought Leadership: https://www.wissen.com/articles/
