
Pentaho, ETL Tool

Gurgaon, Haryana, India

1 month ago

Applicants: 0

Salary Not Disclosed

1 month left to apply

Job Description

We are seeking an experienced ETL Developer with strong Pentaho Data Integration (PDI) expertise to support new solution implementations and the modernization of existing data workflows. The role involves re-engineering current processes built with Shell scripts, Java, or other legacy automation tools into scalable Pentaho jobs. The candidate will also contribute to cloud migration efforts to Azure, ensuring enterprise-grade performance, security, and maintainability.

Key Responsibilities:
- Design, develop, and maintain ETL workflows in Pentaho (PDI) based on existing processes in Shell scripts, Java, or other automation tools.
- Implement new, efficient, and scalable data integration pipelines in Pentaho to meet evolving business requirements.
- Analyze and reverse-engineer current data workflows to build equivalent solutions in Pentaho.
- Support migration of existing on-prem or custom data solutions to Azure Cloud, integrating with services such as Azure Blob Storage, ADF, Azure SQL, and Key Vault.
- Work with various source and target systems such as Oracle, PostgreSQL, SQL Server, CSV, JSON, XML, and APIs.
- Develop parameterized, modular, and reusable Pentaho transformations and jobs (see the sketch after this description).
- Perform data validation, reconciliation, error handling, and logging within the ETL framework.
- Optimize Pentaho jobs for performance and monitor scheduled job execution.
- Ensure data quality and governance, aligning with enterprise and compliance standards (e.g., GDPR, HIPAA).
- Collaborate with business analysts, architects, and data engineers to deliver solutions aligned with functional needs.
- Document ETL design, data flows, and operations for ongoing support and enhancements.
- Participate in Agile ceremonies, provide estimates, and track tasks using tools like JIRA.

Technical Skills:
- Experience rewriting/refactoring legacy scripts into ETL jobs using visual tools like Pentaho.
- Strong background in data processing workflows implemented in Shell scripts, Java, or similar tools.
- Hands-on experience with Azure Cloud Services relevant to data migration:
  - Azure Data Factory
  - Azure Blob Storage
  - Azure SQL / Synapse
  - Azure Key Vault / Managed Identity
- Proficiency in SQL, stored procedures, and performance tuning.
- Experience with data validation, audit logging, and data quality frameworks.
- Knowledge of file-based, API-based, and database-based integration techniques.
- Version control using Git/GitLab and awareness of CI/CD practices for ETL deployments.
- Familiarity with Agile/Scrum/SAFe methodologies and with JIRA/Confluence.
- Familiarity with Apache HOP and Power BI is a plus.
- Experience with data archival, purging, and retention policy implementation.
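
As a minimal illustration of the "parameterized, modular, and reusable" point above, the sketch below shows how a legacy cron-plus-Shell step might be replaced by a single call to kitchen.sh, PDI's standard job runner. The job file name, parameter names, and paths are hypothetical, invented only to show the shape of the call; the actual design depends on the workflow being re-engineered.

  #!/bin/sh
  # Illustrative sketch only: daily_load.kjb and the parameter names are hypothetical.
  # The legacy Shell logic is moved into a Pentaho job; this wrapper merely launches it
  # with named parameters, so the same job can be reused across environments.
  /opt/pentaho/data-integration/kitchen.sh \
    -file=/opt/etl/jobs/daily_load.kjb \
    -param:SOURCE_ENV=prod \
    -param:LOAD_DATE="$(date +%F)" \
    -level=Basic

Because configuration travels through -param values rather than edits to the script itself, promoting the same job from development to production becomes a matter of changing parameters, which is the kind of reuse the responsibilities above call for.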

Additional Information

Company Name: PureSoftware Ltd
Industry: N/A
Department: N/A
Role Category: Cloud Engineer
Job Role: Mid-Senior level
Education: No Restriction
Job Types: Remote
Gender: No Restriction
Notice Period: Less Than 30 Days
Years of Experience: 1 - Any Yrs
Job Posted On: 1 month ago
Application Ends: 1 month left to apply

Similar Jobs

- Golang Developer - Software Engineer, La French Tech Taiwan (1 month ago)
- Data Engineer, LIXIL (1 month ago): Data, SQL, Python +2
- Java developer, Virtusa (3 weeks ago)
- Java Engineer - 20442, Turing (1 month ago)
- Java Software Engineer, Luxoft (3 weeks ago)
- Full stack Java lead, Check Point Software (3 weeks ago)
- EY-GDS Consulting-AI And DATA-Scala-Senior, EY (3 weeks ago)
- Software Development Engineer in Test (SDET), Lucidity (3 weeks ago)
- Senior Java Developer, EmbedTech Solutions (3 weeks ago)
- Learning Java Developer with Drools, Cognizant (3 weeks ago)