Data Engineer - Snowflake DBT
Hyderabad, Telangana, India
3 days ago
3 weeks left to apply
Job Description
We are looking for an experienced, results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate is proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt or Matillion, and can work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Experience: 3 to 5 years

Required Qualifications
- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt or Matillion (Matillion-DPC is highly preferred, not mandatory).
- Experience building and deploying dbt models in a production environment.
- Expert-level SQL and a strong understanding of ELT principles.
- Strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred).
- Familiarity with data quality and validation techniques such as dbt tests and dbt docs.
- Experience with Git, CI/CD, and deployment workflows in a team setting.
- Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.

Key Responsibilities
- Design and implement scalable ELT pipelines using dbt on Snowflake, following industry best practices.
- Build reliable ingestion pipelines from relational databases, APIs, cloud storage, and flat files into Snowflake.
- Develop data models and transformation logic to support layered architectures (staging, intermediate, marts) or medallion patterns.
- Use orchestration tools (Airflow, dbt Cloud, Azure Data Factory) to schedule, monitor, and recover workflows (a minimal orchestration sketch follows this section).
- Apply dbt best practices: modular SQL development, automated testing, documentation, and version control.
- Optimize performance in dbt and Snowflake through clustering, materializations, partitioning, and query tuning.
- Implement CI/CD and Git-based workflows to automate deployments and ensure traceability.
- Contribute to the internal knowledge base with dbt macros, conventions, and testing frameworks.
- Collaborate with data analysts, data scientists, and architects to translate requirements and deliver validated datasets.
- Write clean, maintainable, and well-documented code; ensure artifacts are client-ready.
- Participate in Agile ceremonies (sprint planning, stand-ups, retrospectives) to support delivery.
- Support consulting engagements with clear documentation, demos, and client-focused solutions.

Skills: dbt, Snowflake, data engineering, Airflow, SQL
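For context on the orchestration responsibility above, here is a minimal sketch of an Airflow DAG that runs dbt builds against Snowflake layer by layer. The project path /opt/dbt/analytics, the selector names staging and marts, and the daily schedule are illustrative assumptions, not details from this posting.

```python
# Hedged sketch: orchestrating dbt-on-Snowflake builds from Airflow.
# Paths, selectors, and schedule are assumptions for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ schedule argument
    catchup=False,
) as dag:
    # Build staging models first, then marts, so failures surface per layer.
    dbt_staging = BashOperator(
        task_id="dbt_build_staging",
        bash_command="dbt build --select staging --project-dir /opt/dbt/analytics",
    )
    dbt_marts = BashOperator(
        task_id="dbt_build_marts",
        bash_command="dbt build --select marts --project-dir /opt/dbt/analytics",
    )
    dbt_staging >> dbt_marts
```

In practice the same split (staging, then marts) mirrors the layered dbt architecture named in the responsibilities, and dbt's built-in tests run as part of each `dbt build` step.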
Additional Information
- Company Name: YO HR Consultancy
- Industry: N/A
- Department: N/A
- Role Category: Machine Learning Engineer
- Job Role: Mid-Senior level
- Education: No Restriction
- Job Types: On-site
- Gender: No Restriction
- Notice Period: Less Than 30 Days
- Year of Experience: 1 - Any Yrs
- Job Posted On: 3 days ago
- Application Ends: 3 weeks left to apply