GCP Data Engineer
India, Maharashtra, Pune
1 week ago
2 weeks left to apply
Job Description
About Client: Our client is a French multinational information technology (IT) services and consulting company headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organizations accelerate their transition to a digital and sustainable world. It provides a variety of services, including consulting, technology, professional, and outsourcing services.
Job Details
- Location: Pune
- Mode of Work: Hybrid
- Notice Period: Immediate Joiners
- Experience: 6-9 yrs
- Type of Hire: Contract to Hire
Primary Skills: GCP Data Engineer
Expectations
In this role, you will need:
- Bachelor's/Master's degree in Computer Science or a related field.
- Understanding of Big Data technologies and solutions (Spark, Hadoop) and of multiple scripting and programming languages (Java, Python).
- Understanding of Google Cloud Platform (GCP) technologies in the big data and data warehousing space (BigQuery, Cloud Data Fusion, Dataproc, Dataflow, Composer, DAGs, Airflow, complex SQL, stored procedures).
- A demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
- Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.
- A minimum of 5+ years of experience in Google Cloud data engineering.
- Ability to provide technical expertise in design and development using Google Cloud, develop and/or re-engineer highly complex application components, and integrate software packages, programs, and reusable objects residing on multiple platforms.
- Development experience on Google Cloud Platform implementing streaming analytics for stream and batch processing; GCP certification is preferable.
- Experience with an event-driven cloud platform for cloud services and apps, data integration for building and managing pipelines, data warehousing on serverless infrastructure, and workflow orchestration using Google Cloud data engineering components: Dataflow, Dataproc, Compute Engine, Cloud Composer, Cloud Run, BigQuery, etc.
- Clear, hands-on experience in Python with the Apache Beam framework to build Dataflow custom job templates, with in-depth knowledge of Dataflow job autoscaling and the various runtime parameters for Dataflow jobs.
Requirements
To be successful in this role, you should meet the following requirements:
- Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
- Work closely with analysts and business process owners to translate business requirements into technical solutions.
- Build internal solutions with custom front ends (web, mobile) and backend services that automate business processes.
- Maintain the highest levels of development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, and self-sustaining code with repeatable quality and predictability.
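To illustrate the Python/Apache Beam skill set described above, here is a minimal sketch of a parameterised Dataflow-style batch pipeline. It assumes the `apache-beam[gcp]` package is installed; the option names `--input` and `--min_len`, the bucket path, and the helper `to_record` are all invented for illustration, not part of this posting. For classic Dataflow templates, the parameters would instead be declared with `parser.add_value_provider_argument`.

```python
def to_record(line, min_len):
    """Pure transform: keep words at least min_len characters long,
    emitting (word, length) pairs. Kept free of Beam so it is unit-testable."""
    return [(w, len(w)) for w in line.split() if len(w) >= min_len]

try:
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    class JobOptions(PipelineOptions):
        """Custom runtime parameters for the job (illustrative names)."""

        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument("--input", default="gs://example-bucket/data/*.txt")
            parser.add_argument("--min_len", type=int, default=3)

    def run(argv=None):
        # Standard runner/autoscaling flags (--runner, --max_num_workers, ...)
        # are parsed by PipelineOptions alongside the custom ones above.
        opts = PipelineOptions(argv).view_as(JobOptions)
        with beam.Pipeline(options=opts) as p:
            (p
             | "Read" >> beam.io.ReadFromText(opts.input)
             | "ToRecords" >> beam.FlatMap(to_record, opts.min_len)
             | "Print" >> beam.Map(print))
except ImportError:
    beam = None  # apache-beam not installed; the pure transform above still works
```

Separating the pure transform from the pipeline wiring is a common Beam design choice: the business logic can be tested without a runner, while the same `run` entry point works locally (DirectRunner) or on Dataflow by swapping runtime parameters.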
Additional Information
- Company Name
- People Prime Worldwide
- Industry
- N/A
- Department
- N/A
- Role Category
- Data Engineer
- Job Role
- Mid-Senior level
- Education
- No Restriction
- Job Types
- Remote
- Gender
- No Restriction
- Notice Period
- Immediate Joiner
- Year of Experience
- 1 - Any Yrs
- Job Posted On
- 1 week ago
- Application Ends
- 2 weeks left to apply