
Python Developer

Actively reviewing applications

Acuity Analytics

Chennai | Full-Time | 4–8 years
Posted 3 days ago | Apply by June 11, 2026

Job Description

We are hiring multiple Senior Python Developers for our Data and Technology Services team in Bengaluru, Gurgaon, and Pune (hybrid: 2–3 days WFO).


Immediate joiners are highly preferred (0–30 days).


What you’ll do:

1) Python engineering & reusable frameworks

  • Build modular Python packages (data processing, API clients, orchestration adapters), publishable to internal artifact repositories; enforce code quality, testing, and documentation standards.
  • Develop backend services/APIs (e.g., FastAPI/Flask) and CLI tools to support front‑end, middle‑tier, and cloud workflows; implement resilient error handling, observability hooks, and secure secrets usage.
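To give a flavor of the "resilient error handling" expected here, below is a minimal stdlib-only retry decorator of the kind such services and API clients often use; the names, defaults, and `flaky_fetch` example are illustrative, not part of this role's codebase:

```python
import functools
import logging
import time

log = logging.getLogger("client")

def retry(attempts=3, backoff=0.1, exceptions=(ConnectionError,)):
    """Retry a flaky call with exponential backoff; re-raise after the last attempt."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except exceptions as exc:
                    if attempt == attempts:
                        raise  # out of attempts: surface the original error
                    log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
                    time.sleep(backoff * 2 ** (attempt - 1))
        return wrapper
    return decorator

# Hypothetical client call that succeeds on the third attempt.
calls = {"n": 0}

@retry(attempts=3, backoff=0.01)
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return {"status": "ok"}

result = flaky_fetch()
```

In a real package this decorator would live in a shared module and be published to the internal artifact repository alongside its tests.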

2) Data modeling, SQL & stored procedures

  • Design relational schemas and write/optimize complex SQL (windowing, CTEs, partitioning); author and refactor stored procedures (SQL Server/Oracle/Postgres) with attention to edge cases and performance.
  • Build data‑validation utilities that compare large datasets across environments and produce diffs for regression packs.
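The data-validation bullet above can be sketched as a small diff utility that compares two row sets keyed by an identifier and reports added, removed, and changed rows for a regression pack; the `trade_id` key and sample rows are illustrative assumptions:

```python
def diff_datasets(baseline, candidate, key="trade_id"):
    """Compare two row sets keyed by `key`; return added, removed, and changed rows."""
    base = {row[key]: row for row in baseline}
    cand = {row[key]: row for row in candidate}
    added = sorted(cand.keys() - base.keys())
    removed = sorted(base.keys() - cand.keys())
    # For common keys, record only the columns whose values differ.
    changed = {
        k: {col: (base[k][col], cand[k][col])
            for col in base[k] if base[k][col] != cand[k][col]}
        for k in base.keys() & cand.keys()
        if base[k] != cand[k]
    }
    return {"added": added, "removed": removed, "changed": changed}

# Toy example: one row dropped, one added, one quantity amended.
baseline = [
    {"trade_id": 1, "qty": 100, "price": 9.5},
    {"trade_id": 2, "qty": 50, "price": 4.0},
]
candidate = [
    {"trade_id": 2, "qty": 55, "price": 4.0},
    {"trade_id": 3, "qty": 10, "price": 1.0},
]
diff = diff_datasets(baseline, candidate)
```

At scale the same shape of comparison would typically be pushed into SQL or pandas rather than plain dicts, but the diff contract stays the same.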

3) Post‑trade domain, test automation & regression

  • Map post‑trade flows (allocations, clearing/settlement, confirmations, reconciliations) into datasets, rules, and assertions for repeatable regression.
  • Read and translate stored‑proc logic into automated test scripts; build a central repository of reusable checks integrated into CI/CD.
  • Nice to have: familiarity with trade/blotter platforms (e.g., ION) or similar post‑trade systems.
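A reusable post-trade check from such a central repository might look like the sketch below: a reconciliation rule asserting that child allocations sum to the parent block quantity, returning a structured result CI can aggregate. The rule name, fields, and registry are illustrative assumptions:

```python
def check_allocation_sum(block_qty, allocations, tolerance=0):
    """Reconciliation rule: child allocations must sum to the parent block quantity."""
    total = sum(a["qty"] for a in allocations)
    passed = abs(total - block_qty) <= tolerance
    return {
        "rule": "alloc_sum_matches_block",
        "passed": passed,
        "expected": block_qty,
        "actual": total,
    }

# A central registry lets CI discover and run every check by name.
CHECKS = {"alloc_sum_matches_block": check_allocation_sum}

ok = check_allocation_sum(100, [{"qty": 60}, {"qty": 40}])
bad = check_allocation_sum(100, [{"qty": 60}, {"qty": 30}])
```

Each stored-procedure behavior translated into a check like this becomes a regression assertion that runs on every promotion.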

4) CI/CD, DevOps & environment promotion

  • Design and operate multi‑stage CI/CD pipelines (Azure DevOps/Jenkins/GitLab) for code, data artifacts, and SQL deployables; implement approvals, rollbacks, environment strategy, and quality gates (lint, SAST/DAST, tests).
  • Containerize services where appropriate; integrate with AKS/Kubernetes or serverless jobs, and wire up metrics/alerts for runtime health.
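The quality-gate idea above reduces to a simple contract, sketched here in Python with stand-in gates (real pipelines would shell out to lint, SAST/DAST, and test runners; the gate names are illustrative):

```python
def run_gates(gates):
    """Run named quality gates in order; stop at the first failure so a broken
    build can never promote past the gate that caught it."""
    results = []
    for name, check in gates:
        try:
            passed = bool(check())
        except Exception:
            passed = False  # a crashing gate counts as a failure
        results.append((name, passed))
        if not passed:
            break
    return results

# Toy gates standing in for real lint / test / approval steps.
gates = [
    ("lint", lambda: True),
    ("unit-tests", lambda: False),
    ("deploy-approval", lambda: True),  # never reached: tests failed
]
results = run_gates(gates)
```

Platforms like Azure DevOps, Jenkins, and GitLab express the same stop-on-failure semantics declaratively in their pipeline definitions.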

5) Scheduling, batch & operationalization

  • Script the execution of AutoSys/batch jobs and stored procedures across dev/UAT/prod; add run‑books, logging, and guardrails; enable reliable, auditable promotions through environments.
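As a sketch of the guardrails and auditability described above, here is a minimal job wrapper that refuses unknown environments and emits an audit record per run; `run_job`, the environment list, and the job names are hypothetical:

```python
import datetime
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("batch")

ALLOWED_ENVS = {"dev", "uat", "prod"}

def run_job(job_name, env, runner):
    """Execute one batch step with an environment guardrail and an audit record."""
    if env not in ALLOWED_ENVS:
        raise ValueError(f"refusing to run in unknown environment: {env!r}")
    started = datetime.datetime.now(datetime.timezone.utc).isoformat()
    try:
        runner()  # in practice: launch an AutoSys job or call a stored procedure
        status = "SUCCESS"
    except Exception as exc:
        log.error("%s failed in %s: %s", job_name, env, exc)
        status = "FAILURE"
    record = {"job": job_name, "env": env, "status": status, "started": started}
    log.info("audit: %s", record)
    return record

ok = run_job("eod_recon", "uat", lambda: None)
fail = run_job("eod_recon", "uat", lambda: 1 / 0)
```

The returned records can be appended to a run log, giving the reliable, auditable promotion trail the role calls for.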


What you’ll bring:

  • 4+ years of hands‑on software engineering with Python (incl. packaging, virtual envs, unit/integration testing); strong use of libraries such as pandas, SQLAlchemy/pyodbc, and asyncio/celery for pipelines and services.
  • Expert SQL skills and stored‑proc development (SQL Server/Oracle/Postgres), query tuning, and execution‑plan analysis for large datasets.
  • Proven experience designing CI/CD pipelines and automating promotion (code + data + DB objects) with Azure DevOps/Jenkins/GitLab; strong Git practices and code‑review hygiene.
  • Comfort with schedulers (AutoSys/Control‑M/Airflow) and shell/Python scripting for batch orchestration; familiarity with secrets management and environment configuration.
  • Domain understanding of post‑trade data flows and how to encode them into repeatable regression checks.


How you’ll work:

  • Collaborate with BAs, QEs, and traders/ops to clarify requirements and co‑design testable acceptance criteria; maintain living technical documentation and run‑books.
  • Champion reusability: templates, golden pipelines, sample datasets, and coding standards that scale across engagements.


Interested candidates, please share your updated CVs at [email protected]
