
Mid-level Data Engineer with Python & Snowflake - Jefferies

Pune, Maharashtra, India

3 weeks ago

Applicants: 0

Salary Not Disclosed

3 days left to apply

Job Description

Job Title: Data Engineer with Python & Snowflake - Pune

About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Role Description
Key Skills: Data Engineering, Python, Snowflake, AWS, Git/Bitbucket
Experience: 9+ years
Location: Hinjewadi, Pune
Shift timings: 12:30 PM - 9:30 PM
Work from office: 3 days per week (Tuesday, Wednesday, Thursday)

Job Summary
Python & Snowflake Engineer with AI/Cortex development.

Technical Requirements
- 4+ years of experience delivering data engineering and data science projects on the Snowflake AI Data Cloud platform on AWS; Snowpark experience preferred.
- Experience with different data modeling techniques is required.
- 4+ years of Python development experience, using tools such as VS Code or Anaconda, version control with Git or Bitbucket, and Python unit testing frameworks.
- Experience building Snowflake applications on the Snowflake AI/Cortex platform (specifically Cortex Agents, Cortex Search and Cortex LLM functions), with an understanding of context enrichment using prompts or Retrieval-Augmented Generation (RAG) methods (see the sketch after this description).
- Deep understanding of object-oriented programming in Python and data structures such as Pandas DataFrames, and the ability to write clean, maintainable engineering code.
- Understanding of multi-threading and concurrency implementation in server-side Python custom modules.
- Experience implementing object-relational mapping in Python using frameworks such as SQLAlchemy or equivalent.
- Skilled at developing and deploying Python applications, such as Lambda functions, on the AWS Cloud platform.
- Skilled at deploying web applications on AWS using Docker containers or Kubernetes, with experience using CI/CD pipelines.
- Skilled at building applications with Snowpipe and Snowpark, moving data from cloud sources such as AWS S3, and handling unstructured data from data lakes.
- Strong knowledge of Snowflake account hierarchy models and account-role-permission strategy.
- Strong knowledge of data sharing, preferably using the internal Data Marketplace and Data Exchanges for various listings.
- Strong knowledge of data governance and security concepts within Snowflake, including row- and column-level dynamic data masking using Snowflake tags.
- Good understanding of input query enrichment using Snowflake YAMLs and integration with LLMs within Snowflake.
- Good understanding of relevance search and of building custom interaction applications with LLMs.

Nice to have
- Experience building Snowflake native applications using Streamlit and deploying them onto AWS Cloud instances (EC2 or Docker containers), continuously improving functionality through experimentation, performance tuning and customer feedback.
- Experience implementing application caching within Python web applications.
- Experience with DuckDB and Apache Arrow.
- Experience implementing CI/CD pipelines for Snowflake applications.

Additionally
- Strong analytical and problem-solving skills, with the ability to communicate technical concepts clearly.
- Experience using Agile and Scrum methodologies, preferably with JIRA.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
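To make the Snowpark and Cortex requirement above concrete, here is a minimal sketch of reading a table through a Snowpark session and enriching it with a Cortex LLM call. The connection values, the CUSTOMER_FEEDBACK table and the mistral-large model name are illustrative assumptions, not details from this role.

# Minimal sketch of the Snowpark + Cortex pattern described above.
# All connection values, the CUSTOMER_FEEDBACK table and the model name
# are illustrative placeholders, not details taken from this posting.
from snowflake.snowpark import Session

# Credentials would normally come from a secrets manager, not literals.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Read structured data through a Snowpark DataFrame.
feedback = session.table("CUSTOMER_FEEDBACK")
print(feedback.count())  # sanity-check the source table

# Enrich each row with a Cortex LLM call; SNOWFLAKE.CORTEX.COMPLETE is the
# standard Cortex SQL function for text completion.
summaries = session.sql("""
    SELECT ID,
           SNOWFLAKE.CORTEX.COMPLETE(
               'mistral-large',
               CONCAT('Summarise this feedback in one sentence: ', FEEDBACK_TEXT)
           ) AS SUMMARY
    FROM CUSTOMER_FEEDBACK
    LIMIT 10
""")
summaries.show()

The same pattern extends to the RAG-style context enrichment the posting mentions: retrieved passages (for example from Cortex Search) would be concatenated into the prompt before the COMPLETE call.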

Additional Information

Company Name
Capco
Industry
N/A
Department
N/A
Role Category
N/A
Job Role
Mid-Senior level
Education
No Restriction
Job Types
On-site
Employment Types
Full-Time
Gender
No Restriction
Notice Period
Less Than 30 Days
Year of Experience
1 - Any Yrs
Job Posted On
3 weeks ago
Application Ends
3 days left to apply

Similar Jobs

PepsiCo
4 weeks ago
Lead Engineer - IBM Planning Analytics (TM1)

BairesDev
2 months ago
Staff Python Engineer (Apache Ecosystem)

Standard Chartered India
3 weeks ago
DevOps

IBM
4 weeks ago
Application Developer-AWS Cloud FullStack

D. E. Shaw India Private Limited
2 months ago
Senior Data Engineer - Equities

Capgemini
3 weeks ago
Data Science Engineer

Bajaj Finserv
1 day ago
Assistant Manager - PL - Emerging

Samin TekMindz India Pvt. Ltd.
4 weeks ago
DevOps Architect

AAA Global
2 months ago
Senior Quantitative Researcher

EXL
2 months ago
4400359-Assistant Manager