
Principal DevOps / Edge AI Engineer

Bengaluru

2 months ago

Applicants: 0

Salary Not Disclosed

5 days left to apply

Job Description

Principal DevOps / Edge AI Engineer - Bangalore

Founded in 2023 by industry veterans, headquartered in California, US. We are revolutionizing sustainable AI compute through intuitive software with composable silicon.

Overview:
You will be responsible for building, deploying, and maintaining the local infrastructure that powers high-performance multimodal AI models (text, image, audio, video) on a compact AI appliance. You'll bridge the gap between hardware, ML inference, and user-facing applications, ensuring reliability, scalability, and efficiency of on-device AI workloads.

Key Responsibilities:

System Deployment & Orchestration
- Containerize AI inference services and web applications using Docker or Podman.
- Design lightweight orchestration layers for local systems (Kubernetes, Nomad, or custom orchestration).
- Automate build, test, and deployment pipelines (CI/CD) for local-first AI workloads.

Performance Optimization & Resource Management
- Optimize compute utilization for concurrent multimodal workloads.
- Develop monitoring tools for system health, thermal management, and memory/bandwidth usage.
- Tune OS, drivers, and I/O subsystems for maximum throughput and low latency.

Edge Infrastructure & Networking
- Configure low-latency local networking for browser-based access to the AI appliance.
- Set up secure local APIs and data isolation layers, ensuring zero external data leakage.
- Integrate hardware accelerators and manage firmware updates across different SKUs.

Reliability, Testing, and Scaling
- Build test harnesses to validate multimodal model performance (e.g., LLM + diffusion + ASR pipelines).
- Implement over-the-air (OTA) update mechanisms for edge devices without exposing user data.
- Develop monitoring dashboards and alerting for real-time performance metrics.

Required Qualifications:
- Strong background in Linux systems engineering and containerization (Docker, Podman, LXC).
- Experience deploying AI inference services locally or at the edge (llama.cpp, ollama, vLLM, ONNX).
- Proficiency in CI/CD tools (GitHub Actions, Jenkins, ArgoCD) and infrastructure-as-code (Terraform, Ansible).
- Expertise in GPU/accelerator optimization, CUDA stack management, or similar.
- Solid understanding of networking, security, and firewall configurations for local appliances.
- Scripting and automation skills (Python, Bash, Go, or Rust).

Preferred Qualifications:
- Experience with embedded systems or edge AI devices (e.g., Jetson, Coral, FPGA-based accelerators).
- Minimum 10 years of experience.
- Familiarity with low-bit quantization, model partitioning, or distributed inference.
- Background in hardware/software co-design or systems integration.
- Knowledge of browser-based local apps (WebSocket, WebRTC, RESTful APIs) and AI service backends.
- Prior work in privacy-preserving AI systems or local-first architectures.

Contact:
Uday
Mulya Technologies
[email protected]
"Mining The Knowledge Community"
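As a rough illustration of the on-device health monitoring mentioned under Reliability, Testing, and Scaling, the sketch below probes a local inference endpoint and reports round-trip latency. The host, port, and /health route are assumptions for illustration (llama.cpp's server exposes a similar route), not details taken from this posting.

    import json
    import time
    import urllib.request

    # Hypothetical local endpoint; host, port, and route are assumptions,
    # not part of the role description.
    ENDPOINT = "http://127.0.0.1:8080/health"

    def probe(url: str, timeout: float = 2.0) -> dict:
        """Return reachability and round-trip latency for a local inference API."""
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                ok = resp.status == 200
        except OSError:
            # Connection refused, DNS failure, or timeout all count as unhealthy.
            ok = False
        latency_ms = (time.perf_counter() - start) * 1000
        return {"url": url, "ok": ok, "latency_ms": round(latency_ms, 1)}

    if __name__ == "__main__":
        print(json.dumps(probe(ENDPOINT)))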

Additional Information

Company Name: Mulya Technologies
Industry: N/A
Department: N/A
Role Category: N/A
Job Role: Mid-Senior level
Education: No Restriction
Job Types: Remote
Employment Types: Full-Time
Gender: No Restriction
Notice Period: Immediate Joiner
Year of Experience: 1 - Any Yrs
Job Posted On: 2 months ago
Application Ends: 5 days left to apply
