Job Title: DevOps Specialist
Location: Nagpur / India / Remote
Job Summary
We are seeking a skilled DevOps Specialist to join our team and drive the development, deployment, and optimization of cloud-native applications on AWS and Google Cloud.
Key Responsibilities
- Design and implement scalable, secure, and resilient cloud-native applications using Google Cloud/AWS.
- Design and manage Data Lake environments for large-scale data ingestion, processing, and analytics.
- Design and implement CI/CD pipelines using tools such as GitHub Actions and Jenkins.
- Develop and deploy cloud applications using AWS/Google Cloud services.
- Automate infrastructure provisioning with tools like Terraform, ARM templates, or Bicep.
- Monitor and optimize cloud environments using native monitoring, application-insight, and log-analytics services.
- Collaborate with development and operations teams to streamline release cycles and improve system reliability.
- Troubleshoot and resolve issues in cloud infrastructure and application deployments.
Required Skills & Qualifications
- Strong experience with Google Cloud/AWS services and cloud architecture.
- Proficiency in DevOps tools: Git, Docker, Kubernetes, Jenkins, GitHub Actions.
- Knowledge of infrastructure as code (IaC) and automation scripting (PowerShell, Bash, Python).
- Experience designing and maintaining robust ETL pipelines for ingesting and transforming large-scale, real-time datasets from APIs (e.g., Vortexa, Kpler, AIS/ship tracking, Market Pricing data).
- Familiarity with supporting data science workflows, including integration with Python-based analytics environments (e.g., Jupyter, Databricks).
- Experience working with time-series data, geospatial data, and event-driven architectures for real-time tracking and alerting.
- Familiarity with monitoring and logging tools in Google Cloud/AWS, plus exposure to tools like Prometheus, Grafana, or the ELK Stack.
- Experience with agile methodologies and collaborative development environments.
- Familiarity with GitOps, DevSecOps practices, Zero Trust Architecture, and policy-as-code.
- Knowledge of data lakehouse architecture, Delta Lake, or Apache Spark on AWS.
- Experience with hybrid cloud environments.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Ability to support PCI DSS compliance and certification efforts.