We are seeking a highly skilled Senior DevOps Engineer with strong expertise in Databricks platform operations, Infrastructure as Code, and DataOps automation to support enterprise-scale environments within highly regulated industries.
This role focuses on automation-first engineering, platform governance, and building secure, scalable data infrastructure.
Key Responsibilities
CI/CD & Automation
- Design and implement end-to-end CI/CD pipelines using GitHub Actions or Azure DevOps for automated, repeatable deployment of data assets.
- Automate Databricks workspace provisioning and environment setup across Dev, UAT, and Production environments.
- Automate Databricks workflows and job orchestration.
- Integrate automated testing (pytest, Spark testing frameworks) into deployment pipelines.
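As a flavor of the automated-testing responsibility above, a minimal pytest-style unit test that a pipeline stage might run before deployment could look like the sketch below. The `normalize_amounts` transformation and its field names are illustrative, not part of any real codebase.

```python
# Hypothetical data transformation under test, as it might be exercised
# in a CI/CD pipeline stage before deploying to Databricks.

def normalize_amounts(records):
    """Convert raw amount strings to floats and drop malformed rows."""
    cleaned = []
    for row in records:
        try:
            cleaned.append({**row, "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # malformed rows are excluded rather than failing the job
    return cleaned


def test_normalize_amounts_drops_bad_rows():
    raw = [{"amount": "10.5"}, {"amount": "oops"}, {}]
    assert normalize_amounts(raw) == [{"amount": 10.5}]
```

In a real pipeline, equivalent tests would typically run against Spark DataFrames using a Spark testing framework; the plain-Python version keeps the sketch self-contained.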
Infrastructure as Code (IaC)
- Develop and maintain Infrastructure as Code using Terraform for scalable and consistent deployments.
- Implement zero-touch environment provisioning and multi-workspace automation.
- Establish cluster policies and compute governance standards.
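To illustrate what cluster-policy governance enforces, here is a minimal sketch of validating a cluster spec against a Databricks-style policy definition using the `fixed`, `range`, and `allowlist` rule types. The specific policy values and attribute choices are assumptions for the example, not a recommended production policy.

```python
# Illustrative cluster policy: attribute paths map to rules, mirroring the
# shape of Databricks cluster policy definitions. Values are examples only.
POLICY = {
    "autotermination_minutes": {"type": "range", "minValue": 10, "maxValue": 60},
    "node_type_id": {"type": "allowlist", "values": ["Standard_DS3_v2", "Standard_DS4_v2"]},
    "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
}


def violations(cluster_spec, policy=POLICY):
    """Return a list of human-readable policy violations for a cluster spec."""
    problems = []
    for attr, rule in policy.items():
        value = cluster_spec.get(attr)
        if rule["type"] == "fixed" and value != rule["value"]:
            problems.append(f"{attr} must be {rule['value']!r}, got {value!r}")
        elif rule["type"] == "range" and not (rule["minValue"] <= value <= rule["maxValue"]):
            problems.append(f"{attr}={value} outside [{rule['minValue']}, {rule['maxValue']}]")
        elif rule["type"] == "allowlist" and value not in rule["values"]:
            problems.append(f"{attr}={value!r} not in allowlist")
    return problems
```

In practice the policy itself would be managed through Terraform (the Databricks provider) rather than evaluated by hand; the sketch only shows the kind of constraint a policy encodes.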
Governance, Security & Compliance
- Enforce platform security and governance, including Unity Catalog and access controls.
- Implement fine-grained access control and secure data governance practices.
- Collaborate with security teams to ensure compliance with enterprise and financial regulations.
- Configure and manage:
  - Private Link / VNet Injection
  - IP Access Lists
  - SCIM and Identity Provider (IdP) integrations
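For context on the IP access list responsibility, the underlying allow/block semantics can be sketched with the standard-library `ipaddress` module: a request is permitted only if it matches an allow rule and no block rule. The CIDR ranges below are illustrative placeholders.

```python
import ipaddress

# Example allow/block lists; these CIDRs are placeholders, not real ranges.
ALLOW = [ipaddress.ip_network(c) for c in ("10.0.0.0/8", "203.0.113.0/24")]
BLOCK = [ipaddress.ip_network(c) for c in ("10.99.0.0/16",)]


def is_permitted(ip: str) -> bool:
    """Allow an IP only if it matches an allow range and no block range."""
    addr = ipaddress.ip_address(ip)
    if any(addr in net for net in BLOCK):
        return False  # block rules take precedence
    return any(addr in net for net in ALLOW)
```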
Observability & Cost Optimization
- Implement observability frameworks for monitoring, alerting, and performance tracking.
- Monitor platform usage and optimize cloud costs and DBU consumption.
- Leverage Databricks system tables and audit logs.
- Build real-time monitoring dashboards and usage guardrails.
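As a sketch of the usage-guardrail idea above: aggregate DBU consumption per workspace from usage records (shaped loosely like rows from Databricks billing system tables) and flag any workspace over a daily budget. The field names and budget figure are assumptions for the example.

```python
from collections import defaultdict

# Assumed daily DBU budget per workspace; illustrative only.
DAILY_DBU_BUDGET = 500.0


def over_budget(usage_records, budget=DAILY_DBU_BUDGET):
    """Return {workspace_id: total_dbus} for workspaces above the budget.

    Each record is assumed to carry 'workspace_id' and 'usage_quantity'
    fields, loosely mirroring billing system-table rows.
    """
    totals = defaultdict(float)
    for rec in usage_records:
        totals[rec["workspace_id"]] += rec["usage_quantity"]
    return {ws: dbus for ws, dbus in totals.items() if dbus > budget}
```

A production guardrail would query the system tables directly (e.g. via scheduled SQL) and feed alerts or dashboards; the sketch shows only the aggregation logic.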
Experience Requirements
- 3–6 years (or more) of experience in DevOps or Platform Engineering.
- At least 1 year of hands-on Databricks platform experience in enterprise environments.
- Proven experience working in large-scale regulated industries (banking, financial services, etc.).
- Strong exposure to multi-workspace automation and platform standardization.
Required Certifications
- Databricks Certified Data Engineer Professional (Mandatory)
- HashiCorp Certified: Terraform Associate (Required)
- One of the following cloud certifications:
  - Microsoft Certified: DevOps Engineer Expert (Azure)
  - AWS Certified DevOps Engineer – Professional
Core Technical Skills
Infrastructure as Code & Automation
- Advanced expertise in Terraform, including Databricks provider
- Experience with cluster policies and compute governance
CI/CD & DataOps
- Hands-on experience with Databricks Asset Bundles (DABs)
- Experience integrating automated testing into pipelines
Governance & Security
- Experience implementing Unity Catalog and secure access models
- Strong knowledge of secure cloud networking and identity integrations
Observability & Cost Management
- Experience with monitoring frameworks, dashboards, and cost optimization strategies.
Preferred Qualifications
- Strong understanding of Apache Spark internals for performance troubleshooting.
- Advanced Python scripting skills for automation and tooling.
- Experience in financial services or regulated environments.
- Strong understanding of secure cloud architecture and networking.