AWS Data Engineer - PySpark with Databricks or Snowflake
ITC Infotech
Actively reviewing applications
Job Description

Data Engineer: Core Technical Skills

Snowflake & Spark
- Building and managing scalable data pipelines.
- Spark-based transformations and ETL workflows.
- Expertise in PySpark, including optimization techniques and cost management.
- Snowflake-specific capabilities:
  - Performance tuning and query optimization.
  - Partitioning and clustering strategies.
  - Cost control and resource management.
  - Advanced features such as Time Travel, Zero-Copy Cloning, and Streams & Tasks for data engineering workflows.
- Delta Lake concepts (ACID transactions, Z-Ordering, OPTIMIZE, VACUUM) for hybrid architectures.

SQL & Relational Databases
- Advanced SQL query writing.
- PostgreSQL expertise (window functions, CTEs, query plans, indexing strategy).

Streaming & Messaging
- Apache Kafka for real-time ingestion and topic management.
- Understanding of event-driven architecture.

AWS & Cloud Services
- Strong command of AWS Glue, Lambda, Step Functions, and data analytics services.
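The SQL requirements above call out window functions and CTEs specifically. As a minimal sketch of what those look like in practice, the example below combines a CTE with a `RANK()` window function; it uses Python's built-in sqlite3 module as a lightweight stand-in for PostgreSQL (the `sales` table and its data are hypothetical, invented for illustration):

```python
import sqlite3

# Hypothetical sales table standing in for a PostgreSQL dataset.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('north', '2024-01', 100),
        ('north', '2024-02', 150),
        ('south', '2024-01', 200),
        ('south', '2024-02', 120);
""")

# The CTE aggregates per-region totals; the window function then
# ranks regions by that total without a second GROUP BY pass.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total,
       RANK() OVER (ORDER BY total DESC) AS rnk
FROM region_totals
ORDER BY rnk;
"""
rows = conn.execute(query).fetchall()
for region, total, rnk in rows:
    print(region, total, rnk)
# -> south 320 1
# -> north 250 2
```

The same query text runs unchanged on PostgreSQL; in interviews for roles like this, candidates are often asked to express exactly this kind of aggregate-then-rank logic without nested subqueries.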