Sr. Technical Consultant - Cloud - Snowflake
Blue Yonder
Mumbai
Full-Time
4–8 years
Posted 2 days ago • Apply by June 11, 2026
Job Description
If you want to know the heart of a company, take a look at their values. Ours unite us. They are what drive our success – and the success of our customers. Does your heart beat like ours? Find out here: Core Values
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
Scope
- The Data Platform team partners with business and engineering groups to deliver scalable Snowflake solutions.
- The L4 role focuses heavily on execution—translating architectural patterns into highly optimized, secure, and reliable data pipelines and models for specific business domains.
- The position involves automating ingestion, optimizing complex queries, and ensuring platform stability while guiding junior data engineers.
Key Technologies
- Snowflake Platform: Multi-Cluster Warehouses, Snowpipe, Tasks, Streams, Zero-Copy Cloning, Time Travel, Apache Iceberg tables.
- Data Engineering & Scripting: Advanced SQL, Python (Snowpark), Java/Scala (UDFs/UDTFs), dbt (Data Build Tool).
- Integrations & Orchestration: Apache Airflow, Fivetran, Kafka, Spark, Trino, external catalogs (AWS Glue, Polaris).
- Governance & Security: Hierarchical RBAC, Dynamic Data Masking, Row Access Policies, Object Tagging, Secure Data Sharing.
- Platform Enhancements: Snowpark Container Services, Snowflake Cortex (AI/ML), Search Optimization Service, Materialized Views.
- DataOps/Agile: CI/CD pipelines, Git, GitHub Actions/GitLab, Terraform (Infrastructure-as-Code), Agile delivery.
Responsibilities
- Design & Architect: Design robust dimensional data models and domain-specific data pipelines. Implement standard FinOps and security patterns defined by senior architects.
- Develop & Deliver: Configure continuous data ingestion (Snowpipe/Streams), write modular transformations using dbt and Python (Snowpark), and build task orchestrations.
- Guide & Govern: Review code and data models for team members, ensuring adherence to CI/CD standards and SQL best practices.
- Operate & Optimize: Identify and rewrite bottleneck queries, optimize micro-partition clustering, and handle L3 technical escalations for pipeline failures.
Qualifications
- Bachelor’s degree in Computer Science, Data Engineering, or a related technical field.
- 6–8 years of IT/Data experience, with 3–5 years specifically in deep Snowflake development and pipeline architecture.
- Strong expertise in Snowflake core architecture, caching layers, and warehouse sizing.
- Deep proficiency in Advanced SQL, Python, and data modeling (Dimensional/Kimball).
- Hands-on experience with dbt, Airflow, and CI/CD pipelines for database deployments.