Solution Architect - Data Engineering
YO IT Consulting
Ahmedabad (Gujarat) and Chennai, India
Full-Time
Posted 3 days ago • Apply by June 20, 2026
Job Description
Title: Data Engineering Solutions Architect
Experience: 8-12 Years
Location: Ahmedabad
This is a 5-day work-from-office role.
Office Location: Ambawadi, Ahmedabad
We are looking for an experienced Data Engineering Solutions Architect to join our growing Data Practice. The ideal candidate will have 8-12 years of hands-on experience designing, architecting, and delivering large-scale data warehousing, data lake, ETL, and reporting solutions across modern and traditional data platforms. You will play a key role in defining scalable, secure, and cost-effective architectures that enable advanced analytics and AI-driven insights for our clients.
This role demands a balance of technical depth, solution leadership, and consulting mindset - helping customers solve complex data engineering challenges while also building internal capability and best practices within the organization.
Required Qualifications
- 8-12 years of experience in data engineering and architecture, including hands-on solution delivery.
- Deep expertise with Snowflake or Databricks, with strong working knowledge of tools like dbt, Matillion, SQL, and Python or PySpark.
- Experience designing and implementing data pipelines and orchestration using tools like Airflow, Control-M, or equivalent.
- Familiarity with cloud-native data engineering services such as AWS Glue, Redshift, Athena, GCP BigQuery, Dataflow, and Pub/Sub, or similar.
- Strong understanding of data modelling, ELT/ETL design, and modern architecture frameworks (medallion, layered, or modular architectures).
- Experience integrating and troubleshooting APIs and real-time data ingestion technologies (Kafka, Kinesis, Pub/Sub, REST APIs).
- Familiarity with traditional ETL and data integration tools (Informatica, SSIS, Oracle Data Integrator, etc.).
- Excellent understanding of data governance, performance tuning, and DevOps for data (CI/CD, version control, monitoring).
- Strong communication, problem-solving, and stakeholder management skills.
Preferred Qualifications
- Certifications such as Snowflake SnowPro, Databricks Certified Architect, AWS Data Analytics Specialty, or Google Professional Data Engineer.
- Prior consulting or client-facing experience.
- Exposure to AI/ML, data quality, or metadata management frameworks.
- Experience leading solution design across multi-cloud or hybrid environments.
Key Responsibilities
- Design and architect end-to-end data solutions using technologies like Snowflake, Databricks, dbt, Matillion, Python, Airflow, Control-M, and cloud-native services on AWS/Azure/GCP.
- Define and implement data ingestion, transformation, integration, and orchestration frameworks for structured and semi-structured data.
- Architect data lakes and data warehouses with an emphasis on scalability, cost optimization, performance, and governance.
- Support real-time and API-based data integration scenarios; design solutions for streaming, micro-batch, and event-driven ingestion.
- Lead design and delivery of data visualization and reporting solutions using tools such as Power BI, Tableau, and Streamlit.
- Collaborate with business and technical stakeholders to define requirements, design architecture blueprints, and ensure alignment with business objectives.
- Establish and enforce engineering standards, frameworks, and reusable assets to improve delivery efficiency and solution quality.
- Mentor data engineers and help build internal capability on emerging technologies.
- Provide thought leadership around modern data platforms, AI/ML integration, and data modernization strategies.
Required Skills
Machine Learning
Data Analysis
Python
SQL
Data Modeling
AWS
Data Integration
Oracle Database
Snowflake
BigQuery
Microsoft Azure
Google Cloud Platform
Power BI
Tableau
Data Warehousing
REST API
Data Visualization
Data Governance
Amazon Redshift
Apache Kafka
Databricks
Data Lake
ETL
Informatica
SSIS
DevOps
CI/CD
Cloud native
Apache Airflow
dbt
Performance optimization
Version control
Data ingestion
Event-driven architecture
Data platforms
Data pipelines
PySpark
Amazon Kinesis
Amazon Athena
AWS Glue
Metadata management
ELT
Streamlit