
Pentaho, ETL Tool

Gurgaon, Haryana, India

1 month ago

Applicants: 0

Salary Not Disclosed

1 month left to apply

Job Description

We are seeking an experienced ETL Developer with strong Pentaho Data Integration (PDI) expertise to support new solution implementations and the modernization of existing data workflows. The role involves re-engineering current processes built with Shell scripts, Java, or other legacy automation tools into scalable Pentaho jobs. The candidate will also contribute to cloud migration efforts to Azure, ensuring enterprise-grade performance, security, and maintainability.

Key Responsibilities:
- Design, develop, and maintain ETL workflows in Pentaho (PDI) based on existing processes in Shell scripts, Java, or other automation tools.
- Implement new, efficient, and scalable data integration pipelines in Pentaho to meet evolving business requirements.
- Analyze and reverse-engineer current data workflows to build equivalent solutions in Pentaho.
- Support migration of existing on-prem or custom data solutions to Azure Cloud, integrating with services such as Azure Blob Storage, ADF, Azure SQL, and Key Vault.
- Work with various source and target systems such as Oracle, PostgreSQL, SQL Server, CSV, JSON, XML, and APIs.
- Develop parameterized, modular, and reusable Pentaho transformations and jobs.
- Perform data validation, reconciliation, error handling, and logging within the ETL framework.
- Optimize Pentaho jobs for performance and monitor scheduled job execution.
- Ensure data quality and governance, aligning with enterprise and compliance standards (e.g., GDPR, HIPAA).
- Collaborate with business analysts, architects, and data engineers to deliver solutions aligned with functional needs.
- Document ETL design, data flow, and operations for ongoing support and enhancements.
- Participate in Agile ceremonies, provide estimates, and track tasks using tools like JIRA.

Technical Skills:
- Experience rewriting/refactoring legacy scripts into ETL jobs using visual tools like Pentaho.
- Strong background in data processing workflows implemented in Shell scripts, Java, or similar tools.
- Hands-on experience with Azure Cloud services relevant to data migration:
  - Azure Data Factory
  - Azure Blob Storage
  - Azure SQL / Synapse
  - Azure Key Vault / Managed Identity
- Proficiency in SQL, stored procedures, and performance tuning.
- Experience with data validation, audit logging, and data quality frameworks.
- Knowledge of file-based, API-based, and database-based integration techniques.
- Version control using Git/GitLab, and awareness of CI/CD practices for ETL deployments.
- Familiarity with Agile/Scrum/SAFe methodologies, and use of JIRA/Confluence.
- Familiarity with Apache Hop and Power BI is a plus.
- Experience in data archival, purging, and retention policy implementation.

Additional Information

Company Name
PureSoftware Ltd
Industry
N/A
Department
N/A
Role Category
Data Engineer
Job Role
Mid-Senior level
Education
No Restriction
Job Types
Remote
Gender
No Restriction
Notice Period
Less Than 30 Days
Year of Experience
1+ Years
Job Posted On
1 month ago
Application Ends
1 month left to apply

Similar Jobs

Quest Software

3 weeks ago

Software Dev Principal Engineer - Java & Cloud Engineering

myGwork - LGBTQ+ Business Community

1 month ago

Senior Software Engineer

myGwork - LGBTQ+ Business Community

3 weeks ago

Business Intelligence Engineer

Virtusa

1 month ago

Tech Lead-Java

Barclays

1 month ago

Software Engineer - Fullstack

Accenture services Pvt Ltd

1 month ago

Software Development Engineer

TIGI HR

1 month ago

Java Software Engineer

Java, OOP, Git +2
NextDimension AI

1 month ago

AI Software Engineer

American Express

3 weeks ago

Analyst-Control Management

Metropolis Technologies

1 month ago

Tableau Developer
