Data Platform Engineer (Snowflake, Python, Pipelines)
Bangalore Urban, Karnataka, India
Job Description
Location: Any of the following - Bangalore, Mumbai, Navi Mumbai, Ahmedabad, Chennai, Coimbatore, Gurgaon, Hyderabad, Kochi, Kolkata, Noida, Pune, or Thiruvananthapuram
Employment Type: Full-time (Hybrid Model - 3 Days in Office)
Compensation Range: USD 76,000 - 112,000 per annum
Domain: Information Technology (IT)
Interview Process: 2 Technical Rounds + 1 Client Round

Role Overview
We are looking for a Data Warehouse Data Engineer to lead and support the migration of an enterprise data warehouse from SQL Server to Snowflake. The ideal candidate will design optimized dimensional models aligned with Power BI's Semantic Layer and develop scalable data marts to support evolving business needs.

Key Responsibilities
- Migrate data warehouse objects from SQL Server to Snowflake, ensuring performance, scalability, and cost-efficiency.
- Design and implement dimensional data models (star/snowflake schemas) optimized for BI layers (Power BI).
- Build and maintain ETL/ELT pipelines using Databricks, DBT, and Snowflake (a minimal illustrative sketch follows this description).
- Develop new data marts for additional domains such as Product Development and evolving datasets.
- Perform query tuning and warehouse optimization, and ensure alignment with semantic models.
- Implement data quality checks, lineage documentation, and process control measures.
- Collaborate closely with BI and analytics teams to deliver trusted, well-modeled datasets.

Required Skills & Qualifications
- Proven experience in SQL Server to Snowflake migration.
- Strong expertise with Snowflake, Databricks, DBT, SQL, and Python.
- Solid understanding of dimensional modeling and Power BI semantic layer optimization.
- Hands-on experience building data marts for new business domains.
- Deep background in ETL/ELT pipeline design, data transformation, and performance tuning.
- Familiarity with datasets from Salesforce, Dynamics 365, Coupa, Workday, and Concur.

Nice to Have
- Experience with AWS or Azure cloud ecosystems.
- Exposure to data governance, lineage tools, or metadata management.
- Understanding of data security and compliance frameworks.

Benefits Overview
- Paid Leave: Minimum 10 vacation days, 6 sick days, and 10 holidays annually.
- Insurance: Medical, dental, and vision coverage for employees and dependents.
- Retirement Plan: 401(k) with employer matching.
- Additional Benefits: Company-paid life, accidental death, and disability coverage; short/long-term disability options; Health Savings Account (HSA) and Flexible Spending Account (FSA) programs; paid bereavement and jury duty leave; compliance with all applicable U.S. state and local paid sick leave laws.
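As a purely illustrative sketch (not part of the formal requirements), the snippet below shows the kind of dbt model the pipeline responsibilities above refer to: a hypothetical dim_product dimension for a Product Development data mart, materialized as a table in Snowflake. The model and staging names (dim_product, stg_products) and all columns are assumptions made for this example only.

    -- Hypothetical dbt model, e.g. models/marts/dim_product.sql (names are illustrative).
    {{ config(materialized='table') }}

    select
        product_id            as product_key,    -- key exposed to the Power BI semantic layer
        product_name,
        product_category,
        launch_date,
        current_timestamp()   as dbt_loaded_at   -- audit column supporting data quality checks
    from {{ ref('stg_products') }}

In practice, a model like this would typically be paired with dbt generic tests such as unique and not_null on product_key to implement the data quality checks mentioned in the responsibilities.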
Additional Information
- Company Name: Mogi I/O : OTT/Podcast/Short Video Apps for you
- Industry: N/A
- Department: N/A
- Role Category: Ruby on Rails Developer
- Job Role: Mid-Senior level
- Education: No Restriction
- Job Types: Remote
- Gender: No Restriction
- Notice Period: Less Than 30 Days
- Years of Experience: 1 - Any Yrs
- Job Posted On: 5 days ago
- Application Ends: 2 months left to apply