
Data Engineer

Gurgaon, Haryana, India

2 months ago

Applicants: 0

Salary Not Disclosed

2 weeks left to apply

Job Description

About The Company

Independent for over 70 years, Milliman delivers market-leading services and solutions to clients worldwide. Today, we are helping companies take on some of the world's most critical and complex issues, including retirement funding and healthcare financing, risk management and regulatory compliance, and data analytics and business transformation. Through a team of professionals ranging from actuaries to clinicians and technology specialists to plan administrators, we offer unparalleled expertise in employee benefits, investment consulting, healthcare, life insurance and financial services, and property and casualty insurance.

The Department

The Life & Annuity Predictive Analytics (LAPA) business unit is a lean, agile, diverse, and geographically distributed data science startup within Milliman. Our team consists of professionals with varied backgrounds, including data scientists, data engineers, software engineers/developers, and actuarial domain experts. We help insurers and distributors of life and retirement products understand and use their own data, industry data, and customer data to advance their competitive position and improve financial outcomes. Through our powerful combination of subject-matter expertise, data management, and advanced analytics, we provide our clients with tools to analyze their business performance, manage risk, and generate new business leads to facilitate more profitable growth.

The Role

As a Data Engineer on the LAPA team, you will be responsible for designing and implementing data pipelines using industry-leading cloud applications such as Databricks and orchestration tools such as Azure Data Factory. You will use programming languages such as Python, R, or SQL to automate ETL, analytics, and data quality processes from the ground up. You will design and implement complex data models and metadata, build reports and dashboards, and own data presentation and dashboarding tools for the end users of our data products and systems. You will work with leading-edge technologies such as Databricks, Azure Data Lake, Azure Data Factory, Snowflake, and more, and write scalable, highly tuned SQL/PySpark code running over millions of rows of data. You will work closely with other data scientists, data engineers, software engineers/developers, and domain experts to continuously improve our data collection, data cleaning, data analysis, predictive modeling, data visualization, and application development. You will also investigate, evaluate, and present new technologies and processes for the team to use.

You Will

- Design, build, and manage reliable ETL pipelines using PySpark and Databricks for life and annuity data products
- Implement automated data quality checks to ensure accuracy, completeness, and consistency of data
- Deploy data pipelines to production and monitor them for performance, reliability, and data issues
- Collaborate with actuaries, analysts, and data scientists to deliver clean, usable, and secure data
- Support AI and machine learning teams by preparing model-ready datasets and contributing to data-driven use cases
- Follow engineering best practices such as code reviews, automation, and efforts to reduce technical debt
- Document data workflows, business logic, and best practices to support internal knowledge sharing

Job Knowledge, Experience, and Skills

Job Knowledge Required

- Bachelor's degree in Computer Science, Engineering, or any STEM-related field
- 3-5 years of hands-on experience in data engineering or data science roles
- Strong programming skills in Python and PySpark, and optionally R
- Proficiency in SQL, including data modeling, performance tuning, and query optimization
- Experience building ETL/ELT pipelines and implementing data quality checks
- Hands-on expertise with Apache Spark, Databricks, and cloud data tools (preferably Azure Data Factory, Data Lake, and Synapse)
- Familiarity with cloud data warehouses and large-scale data processing
- Understanding of DevOps practices and use of version control tools such as Git in data engineering workflows
- Knowledge of data governance, metadata management, and secure handling of PII data
- Basic understanding of AI/ML concepts and how data engineering supports AI-driven use cases

Experience and Soft Skills Required

- Passion for technology and growth; self-motivated, energetic, organized, driven, and results-oriented
- Ability to work in a highly collaborative, Agile environment with a strong desire to learn
- Ability to take ownership of a technical challenge and see it through to a successful conclusion
- Commitment to continuous education in order to lead continuous process improvement
- Excellent written and verbal communication skills
- Ability to manage competing priorities and deadlines

Additional Knowledge and Skills to Build

- Sharp critical-thinking skills, sound judgment and decision-making ability, and both the ability and willingness to clearly articulate your ideas
- Experience in ETL optimization, writing custom PySpark functions (UDFs), and tuning PySpark or Spark SQL code
- Experience with DAG orchestration using tools such as Data Factory, Airflow, dbt, Delta Lake, Kafka, and Prefect
- Experience in handling data: data lineage, data governance, ensuring data quality, and feature stores
- Knowledge of data engineering best practices and industry-standard methodologies
- Experience with CI/CD pipelines, Git, and DevOps practices
- Interest in building AI-driven solutions
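The "automated data quality checks" the role describes (accuracy, completeness, consistency) can be sketched in a few lines of plain Python. This is an illustrative example only, not Milliman's pipeline: the column names (`policy_id`, `issue_date`, `face_amount`) are hypothetical, and a production pipeline would run equivalent logic in PySpark on Databricks over millions of rows.

```python
# Hypothetical data quality checks over a batch of policy records.
# Field names are illustrative; real pipelines would express these
# rules in PySpark/Spark SQL rather than pure Python.

REQUIRED_FIELDS = ("policy_id", "issue_date", "face_amount")

def check_completeness(rows):
    """Return indices of rows with any missing required field."""
    return [i for i, row in enumerate(rows)
            if any(row.get(f) in (None, "") for f in REQUIRED_FIELDS)]

def check_consistency(rows):
    """Return indices of rows where face_amount is not a positive number."""
    bad = []
    for i, row in enumerate(rows):
        amount = row.get("face_amount")
        if not isinstance(amount, (int, float)) or amount <= 0:
            bad.append(i)
    return bad

rows = [
    {"policy_id": "P001", "issue_date": "2020-01-15", "face_amount": 250000},
    {"policy_id": "P002", "issue_date": "", "face_amount": 100000},
    {"policy_id": "P003", "issue_date": "2021-06-30", "face_amount": -5},
]

print(check_completeness(rows))  # rows failing the completeness rule
print(check_consistency(rows))   # rows failing the consistency rule
```

In practice, check results like these would be logged per pipeline run and used to gate deployment of a batch to downstream consumers, which is what "monitor them for performance, reliability, and data issues" implies.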

Additional Information

Company Name
Milliman
Industry
N/A
Department
N/A
Role Category
N/A
Job Role
Mid-Senior level
Education
No Restriction
Job Types
Hybrid
Employment Types
Full-Time
Gender
No Restriction
Notice Period
Less Than 30 Days
Year of Experience
1 - Any Yrs
Job Posted On
2 months ago
Application Ends
2 weeks left to apply
