Data Messaging DevOps Engineer
Ford Motor Company
Chennai, Tamil Nadu, India
Full-Time
Remote
Posted 4 months ago • Apply by May 4, 2026
Job Description
Join our team focused on Google Cloud Data Messaging Services, leveraging technologies like Pub/Sub and Kafka to build scalable, decoupled, and resilient cloud-native applications. This position involves close collaboration with development teams, as well as product vendors, to implement and support the suite of Data Messaging Services offered within GCP and Confluent Kafka.
GCP Data Messaging Services provide powerful capabilities for handling streaming data and asynchronous communication. Key benefits include:
Enabling real-time data processing and event-driven architectures
Decoupling applications for improved resilience and scalability
Leveraging managed services like Cloud Pub/Sub and integrating with Kafka environments (Apache Kafka, Confluent Cloud)
Providing highly scalable and available infrastructure for data streams
Enhancing automation for messaging setup and management
Supporting Infrastructure as Code practices for messaging components
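The decoupling these services provide can be illustrated with a minimal in-memory sketch. This is plain Python showing the publish/subscribe pattern only, not the actual Cloud Pub/Sub or Kafka client APIs; topic and handler names are hypothetical:

```python
from collections import defaultdict
from typing import Callable


class Broker:
    """Minimal in-memory publish/subscribe broker (pattern sketch only).

    Real deployments would use the google-cloud-pubsub or Kafka client
    libraries; this exists to show why pub/sub decouples services.
    """

    def __init__(self) -> None:
        self._subscribers = defaultdict(list)  # topic name -> callbacks

    def subscribe(self, topic: str, callback: Callable[[str], None]) -> None:
        # Consumers register interest in a topic, not in a specific producer.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: str) -> None:
        # The producer knows only the topic name; subscribers are invisible
        # to it. That is the decoupling that improves resilience: either
        # side can be replaced or scaled without changing the other.
        for callback in self._subscribers[topic]:
            callback(message)


# Usage: two independent consumers receive the same event.
broker = Broker()
received = []
broker.subscribe("orders", lambda m: received.append(("billing", m)))
broker.subscribe("orders", lambda m: received.append(("shipping", m)))
broker.publish("orders", "order-42 created")
```

Because the producer never references the billing or shipping handlers directly, a new consumer can be added by subscribing to the topic, with no change to the publishing code.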
The Data Messaging Services Specialist plays a crucial role as the corporation migrates and onboards applications that rely on robust data streaming and asynchronous communication onto GCP Pub/Sub and Confluent Kafka.
This position requires staying abreast of the continual evolution of cloud data technologies and understanding how GCP messaging services like Pub/Sub, alongside Kafka, integrate with other native services like Cloud Run, Dataflow, etc., within the new Ford Standard app hosting environment to meet customer needs.
This is an exciting opportunity to work on highly visible data streaming technologies that are becoming industry standards for real-time data processing.
Responsibilities
Develop a solid understanding of Google Cloud Pub/Sub and Kafka (Apache Kafka and/or Confluent Cloud).
Gain experience using Git/GitHub and CI/CD pipelines to deploy messaging-related clusters and infrastructure.
Collaborate with Business IT and business owners to prioritize improvement efforts related to data messaging patterns and infrastructure.
Work with team members to establish best practices for designing, implementing, and operating scalable and reliable data messaging solutions.
Identify opportunities for adopting new data streaming technologies and patterns to solve existing needs and anticipate future challenges.
Create and maintain Terraform modules and documentation for provisioning and managing Pub/Sub topics/subscriptions, Kafka clusters, and related networking configurations, often working in pairs.
Develop automated processes to simplify the experience for application teams adopting Pub/Sub and Kafka client libraries and deployment patterns.
Improve continuous integration tooling by automating manual processes within the delivery pipeline for messaging applications and enhancing quality gates based on past learnings.
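For illustration, the Terraform provisioning work described above might look like the following fragment using the Google provider's `google_pubsub_topic` and `google_pubsub_subscription` resources. Resource names, the project variable, and the tuning values are hypothetical placeholders, not Ford's actual modules:

```hcl
# Hypothetical sketch: names and var.project_id are placeholders.
resource "google_pubsub_topic" "orders" {
  name    = "orders-events"
  project = var.project_id
}

resource "google_pubsub_subscription" "orders_worker" {
  name  = "orders-worker-sub"
  topic = google_pubsub_topic.orders.id

  # How long Pub/Sub waits for an ack before redelivering a message.
  ack_deadline_seconds = 20

  retry_policy {
    minimum_backoff = "10s"
  }
}
```

Keeping topics and subscriptions in modules like this lets application teams request messaging infrastructure through code review rather than manual console changes, which is the Infrastructure as Code practice the role supports.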
Qualifications
Highly motivated individual with strong technical skills and an understanding of emerging data streaming and infrastructure-automation technologies (including Google Pub/Sub, Kafka, Tekton, and Terraform).
Experience with Apache Kafka or Confluent Cloud Kafka, including concepts like brokers, topics, partitions, producers, consumers, and consumer groups.
Working experience with CI/CD pipelines, including building continuous integration and deployment pipelines using Tekton or similar technologies for applications interacting with Pub/Sub or Kafka.
Understanding of GitOps and other DevOps processes and principles as applied to managing messaging infrastructure and application deployments.
Understanding of Google Identity and Access Management (IAM) concepts and various authentication/authorization options for securing access to Pub/Sub and Kafka.
Knowledge of any programming language (e.g., Java, Python, Go) commonly used for developing messaging producers/consumers.
Experience with public cloud platforms (preferably GCP), with a focus on data messaging services.
Understanding of agile methodologies and concepts, or experience working in an agile environment.
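The Kafka concepts listed above (topics, partitions, keys, consumer groups) can be sketched without a running cluster. This plain-Python illustration mimics, but does not reproduce byte-for-byte, Kafka's key-hash partitioning (Kafka's default partitioner uses murmur2; CRC32 is used here only to keep the sketch deterministic and dependency-free) and a round-robin group assignment, one of several assignor strategies Kafka offers:

```python
import zlib

NUM_PARTITIONS = 6


def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Assign a record to a partition by hashing its key.

    The property that matters is the same as in Kafka: records sharing
    a key always land on the same partition, preserving per-key order.
    """
    return zlib.crc32(key.encode()) % num_partitions


def assign_partitions(partitions: list[int],
                      consumers: list[str]) -> dict[str, list[int]]:
    """Round-robin partition assignment across a consumer group.

    Each partition is owned by exactly one consumer in the group, so
    adding consumers (up to the partition count) scales consumption.
    """
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment


# Same key always maps to the same partition:
assert partition_for("order-42") == partition_for("order-42")

# Three consumers in one group split six partitions two apiece:
groups = assign_partitions(list(range(NUM_PARTITIONS)), ["c0", "c1", "c2"])
```

Understanding this key-to-partition relationship is what lets an engineer reason about ordering guarantees and consumer parallelism when sizing topics, which is the practical core of the Kafka qualification above.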