Job Title: Senior AWS Data & DevOps Engineer
Location: Bangalore (Hybrid)
Experience: 5+ Years
The Role
We are looking for a hybrid Data & DevOps Engineer who bridges cloud infrastructure and data platforms. You will manage high-availability AWS environments while building scalable ETL pipelines with AWS Glue (PySpark).
Core Responsibilities
- Infrastructure: Design and manage secure, scalable AWS environments (VPC, EC2, S3, RDS, IAM).
- Data Engineering: Develop AWS Glue ETL jobs (PySpark), manage Data Catalogs, and optimize S3 Data Lakes.
- Automation: Build CI/CD pipelines and manage Infrastructure as Code (Terraform/CDK).
- Containers: Package and deploy workloads using Docker and EKS/ECS.
- Operations: Implement CloudWatch monitoring, troubleshoot Glue job failures, and optimize for cost/performance.
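To give candidates a flavor of the day-to-day work, the ETL responsibility above boils down to transformations like the following. This is a minimal, dependency-free Python sketch (field names such as `event_id` are illustrative, not from a real schema); an actual Glue job would express the same steps with awsglue DynamicFrame operations against the Data Catalog and S3.

```python
# Illustrative sketch of a cleaning step a Glue PySpark job might perform.
# Field names below are hypothetical; a real job would use awsglue
# DynamicFrames (Filter, SelectFields) reading from the Glue Data Catalog.

REQUIRED_FIELDS = ("event_id", "event_type", "occurred_at")

def clean_records(records):
    """Drop malformed rows and keep only the columns downstream jobs need."""
    cleaned = []
    for record in records:
        # Skip rows missing any required field (Glue analogue: Filter/DropNullFields).
        if any(record.get(field) is None for field in REQUIRED_FIELDS):
            continue
        # Project down to the required columns (Glue analogue: SelectFields).
        cleaned.append({field: record[field] for field in REQUIRED_FIELDS})
    return cleaned

raw = [
    {"event_id": 1, "event_type": "click", "occurred_at": "2024-01-01", "junk": "x"},
    {"event_id": 2, "event_type": None, "occurred_at": "2024-01-02"},
]
print(clean_records(raw))
```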
Required Qualifications
- 5+ years of experience in AWS DevOps or cloud engineering.
- Expert PySpark/Python skills for AWS Glue development.
- Strong IaC experience (Terraform, CloudFormation, or CDK).
- Deep AWS knowledge: S3, IAM, VPC, and CloudWatch.
- CI/CD Tools: GitHub Actions, Jenkins, or GitLab CI.