AALUCKS Talent Pro

Data Engineer (Snowflake, DBT)

Hyderabad, Telangana, India

2 weeks ago

Applicants: 0

Salary Not Disclosed

1 week left to apply

Job Description

Position: Data Engineer (Snowflake, DBT), Hyderabad
Department: Information Technology | Role: Full-time | Experience: 3 to 5 Years | Number of Positions: 2 | Location: Hyderabad
Skillset: Snowflake, DBT/Matillion, Azure Cloud, Data Factory, Databricks, Data Warehousing, SQL, ETL/ELT, SSIS, Cloud Storage, excellent English communication skills

About Us

We provide companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep Big Four consulting experience in business transformation and efficient processes, we are a game-changer in any operations strategy.

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and DBT or Matillion, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities:

1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or medallion architecture) and enable reliable, reusable data assets.
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimization in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code, using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.

Required Qualifications:

- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and DBT or Matillion (Matillion-DPC is highly preferred, not mandatory).
- Experience building and deploying DBT models in a production environment.
- Expert-level SQL and a strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred).
- Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc.
- Experience with Git, CI/CD, and deployment workflows in a team setting.
- Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.

Core Competencies:

- Data Engineering and ELT Development:
  - Building robust, modular data pipelines using dbt.
  - Writing efficient SQL for data transformation and performance tuning in Snowflake.
  - Managing environments, sources, and deployment pipelines in dbt.
- Cloud Data Platform Expertise:
  - Strong proficiency with Snowflake: warehouse sizing, query profiling, data loading, and performance optimization.
  - Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.

Technical Toolset:

- Languages & Frameworks:
  - Python: for data transformation, notebook development, and automation.
  - SQL: strong grasp of SQL for querying and performance tuning.

Best Practices and Standards:

- Knowledge of modern data architecture concepts, including layered architecture (e.g., staging → intermediate → marts, or medallion architecture).
- Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).

Security & Governance:

- Access and Permissions:
  - Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling.
  - Familiarity with data privacy policies (GDPR basics) and encryption at rest and in transit.

Deployment & Monitoring:

- DevOps and Automation:
  - Version control using Git and experience with CI/CD practices in a data context.
  - Monitoring and logging of pipeline executions, with alerting on failures.

Soft Skills:

- Communication & Collaboration:
  - Ability to present solutions and handle client demos and discussions.
  - Work closely with onshore and offshore teams of analysts, data scientists, and architects.
  - Ability to document pipelines and transformations clearly.
  - Basic Agile/Scrum familiarity: working in sprints and logging tasks.
  - Comfort with ambiguity, competing priorities, and fast-changing client environments.

Education:

- Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.

Mandatory / most preferred skills for this role:

- Must have experience in Snowflake.
- Must have experience in DBT or Matillion (Matillion-DPC is highly preferred).
- Must have experience in SSIS.

Why Join Us?

- Opportunity to work on diverse and challenging projects in a consulting environment.
- Collaborative work culture that values innovation and curiosity.
- Access to cutting-edge technologies and a focus on professional development.
- Competitive compensation and benefits package.
- Be part of a dynamic team delivering impactful data solutions.
- This is a 5-days-a-week work-from-office role in Hyderabad.
- There are 2 rounds of interviews in the process.

Required Qualification: Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.) in IT/CS/E&CE, or MCA. This role is with a fast-growing analytics, business intelligence, IT products, and automation company.
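For candidates new to the layered dbt-on-Snowflake pattern this role centres on, here is a minimal sketch of a staging model feeding a mart model; the source, table, and column names are hypothetical and for illustration only:

-- models/staging/stg_orders.sql: staging layer; lightly cleans a raw source table
-- (hypothetical source 'raw.orders' and columns)
select
    order_id,
    customer_id,
    cast(order_ts as timestamp_ntz) as ordered_at,
    amount
from {{ source('raw', 'orders') }}

-- models/marts/fct_daily_revenue.sql: mart layer; aggregates the staging model
-- and is materialized as a table in Snowflake for downstream BI use
{{ config(materialized='table') }}
select
    date_trunc('day', ordered_at) as order_date,
    sum(amount) as total_revenue
from {{ ref('stg_orders') }}
group by 1

A schema.yml beside these models would typically declare dbt tests (e.g., unique and not_null on order_id) and column documentation, covering the testing and dbt docs expectations listed above.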

Additional Information

Company Name
AALUCKS Talent Pro
Industry
N/A
Department
N/A
Role Category
Machine Learning Engineer
Job Role
Mid-Senior level
Education
No Restriction
Job Types
On-site
Employment Types
Full-Time
Gender
No Restriction
Notice Period
Immediate Joiner
Year of Experience
1 - Any Yrs
Job Posted On
2 weeks ago
Application Ends
1 week left to apply
