Office Location
Arekere, Bengaluru
Role & Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes to handle data from multiple sources.
- Implement and manage data infrastructure on AWS, ensuring high availability, scalability, and security.
- Develop and optimize basic SQL queries and scripts to ensure efficient data retrieval and transformation.
- Collaborate with stakeholders to understand data requirements and deliver solutions that meet business needs.
- Help automate data workflows and pipeline deployments using DevOps tools and best practices.
- Monitor and troubleshoot data pipelines, ensuring timely resolution of issues and continuous improvement.
- Maintain documentation of data workflows, processes, and infrastructure.
- Implement basic data security and compliance best practices.
- Work with MongoDB for specific data storage and retrieval tasks, as needed.
Ideal Candidate
- At least 1 year of experience as a DevOps Engineer.
- Basic proficiency in Python for data processing and automation.
- Exposure to AWS services such as S3, Redshift, Lambda, Glue, and others related to data engineering.
- Basic knowledge of MongoDB or other NoSQL/SQL databases is a plus.
- Strong problem-solving skills and attention to detail.
Good To Have
- Experience with containerization and orchestration tools such as Docker and Kubernetes.
- Familiarity with other cloud platforms (e.g., Google Cloud Platform).
- Knowledge of data visualization tools and techniques.
Skills: DevOps, AWS, AWS Lambda