GCP Data Architect & Teradata Developer
India, Tamil Nadu, Chennai
1 week ago
2 weeks left to apply
Job Description
Company Description
Zilo AI is a prominent manpower service provider dedicated to connecting businesses with highly skilled and dependable professionals across various industries. With a deep understanding that talent drives success, Zilo AI is committed to supplying the right expertise to support business growth. Our focus on talent optimization ensures that clients receive tailored solutions to meet their business needs. Joining Zilo AI provides an opportunity to work within a forward-thinking organization that values innovation and excellence.

We Are Hiring: GCP Data Architect & Teradata Developer
Location: Chennai, Hyderabad & Kolkata (On-site)

We are expanding our data engineering and analytics capabilities and looking for highly skilled professionals to join our growing team. If you are passionate about building scalable, cloud-native, enterprise-grade data solutions, we would like to hear from you.

Role 1: GCP Data Architect
Experience: 10+ years

Key Responsibilities
- Architect, design, and implement end-to-end data analytics solutions on Google Cloud Platform (GCP).
- Develop scalable data ingestion frameworks using Cloud Functions, Dataflow, Pub/Sub, Airflow, and Data Fusion.
- Build and manage semantic data layers for analytics and business intelligence.
- Lead data architecture initiatives across transactional systems and cloud/on-premise data warehouses.
- Drive best-fit solutions for ETL, data modeling, and data storage aligned with business needs.
- Define and maintain data governance, data catalogue, business glossary, and lineage standards.
- Oversee the complete lifecycle of data, from operational needs to strategic/BI consumption.
- Deliver architectural guidance on enterprise data strategies, including columnar databases and big data platforms.

What We're Looking For
- Minimum 10 years of experience in data architecture, data modelling, and data governance.
- Strong expertise in GCP data services: BigQuery, Pub/Sub, Cloud Run, Cloud Functions, Airflow, Dataflow, and Data Fusion.
- Hands-on experience designing and delivering analytic workloads on cloud platforms.
- In-depth knowledge of data warehousing concepts, normalized vs. denormalized models, and centralized vs. federated architectures.
- Minimum 2 years' experience with Erwin or similar data modelling tools.
- Strong understanding of CMI industry data structures, preferably with exposure to SID domain business models.
- Ability to architect scalable, high-performance, cloud-native data ecosystems.

Role 2: Teradata Developer
Experience: 6-8 years

Key Responsibilities
- Develop, maintain, and optimize Teradata SQL queries, BTEQ scripts, and stored procedures.
- Perform performance tuning, data validation, troubleshooting, and query optimization.
- Work extensively with Teradata ETL utilities including FastLoad, MultiLoad, TPT, and FastExport.
- Analyze source-to-target mappings (STM) and convert business requirements into robust technical designs.
- Optimize long-running queries using indexing, statistics, explain plans, and partitioning techniques.

What We're Looking For
- 6-8 years of hands-on experience working with Teradata.
- Strong SQL expertise and deep knowledge of Teradata utilities and performance tuning.
- Ability to translate business requirements into efficient technical solutions.
- Experience working in complex data warehouse or enterprise data environments.

How to Apply
We'd love to review your application.
Please apply with your updated resume by filling out this short form (mandatory).
Form Link: https://forms.gle/rxLREHQEaigGzbT77
Additional Information
- Company Name: Zilo AI
- Industry: N/A
- Department: N/A
- Role Category: Data Analyst
- Job Role: Mid-Senior level
- Education: No Restriction
- Job Types: On-site
- Gender: No Restriction
- Notice Period: Immediate Joiner
- Year of Experience: 1 - Any Yrs
- Job Posted On: 1 week ago
- Application Ends: 2 weeks left to apply