Job Title: Senior DevOps Engineer – Data Platforms
Location: Bangalore (Whitefield)
Employment Type: Full-Time
Experience: 7+ years in Data Engineering (3+ years in large-scale streaming platforms)
Notice Period: Immediate to 30 days.
Role Overview
We are seeking a Senior DevOps Engineer to design, build, and maintain next-generation streaming data pipelines. This is a hands-on role requiring strong expertise in DevOps, CI/CD, observability, and the AWS cloud, with working knowledge of Big Data technologies such as Apache Spark and Scala.
Key Responsibilities
DevOps & Infrastructure
- Implement DevOps best practices for data pipelines.
- Design and maintain CI/CD pipelines (Jenkins, GitLab CI, AWS CodePipeline).
- Manage cloud resources using IaC tools (Terraform, AWS CloudFormation).
- Containerize applications using Docker and Kubernetes.
Observability & Operations
- Set up monitoring, alerting, and logging for pipelines and infrastructure.
- Implement observability tools (Prometheus, Datadog).
- Define SLIs/SLOs and ensure operational excellence.
- Troubleshoot and resolve production issues proactively.
Data Engineering (Good to Have)
- Build and optimize streaming pipelines (Kafka, Spark Streaming, Flink).
- Develop ETL/ELT jobs and ensure data quality.
- Work with AWS services (Kinesis/MSK, S3, Glue, EMR, Lambda, Redshift).
Must-Have Skills
- Strong experience in DevOps, CI/CD, Terraform, Docker, Kubernetes.
- Expertise in observability tools (Prometheus, Datadog).
- Hands-on experience with AWS cloud services.
- Proficiency in Python, Scala, or Java.
- Knowledge of SQL and relational/NoSQL databases.
Preferred Skills
- Experience with Apache Airflow, data governance, and security best practices.
- Familiarity with the media/streaming industry.
- Degree in Computer Science or related field.