About Birlasoft
Birlasoft is a powerhouse where domain expertise, enterprise solutions, and digital technologies converge to redefine business processes. We take pride in our consultative and design-thinking approach, driving societal progress by enabling our customers to run their businesses with unmatched efficiency and innovation. As part of the CK Birla Group, a multibillion-dollar enterprise, we have a team of 12,500+ professionals committed to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives and Corporate Social Responsibility (CSR) activities, demonstrating our dedication to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.
About The Job
We are seeking a highly skilled AWS DevOps Engineer to join our dynamic team. The ideal candidate will have extensive experience with AWS services, particularly Lambda, EventBridge, and SQS. This role involves designing, developing, and maintaining scalable and efficient cloud-based applications.
Title: AWS DevOps Engineer - Apache Flink
Location: Pune/Bangalore
Educational Background: Master's/Professional Degree
Key Responsibilities:
- Spread the DevOps culture across business units by implementing on-commit deployment and automated testing solutions.
- Develop systems using the latest technologies to streamline the release-management process into AWS.
- Build an understanding of our product offerings and help improve the customer experience.
- Ensure application monitoring and metrics are captured for all deployed assets.
- Enforce quality and security requirements in release pipelines.
- Identify areas of improvement in the environment and recommend enhancements.
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS services such as Glue, Lambda, and Redshift.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
- Implement data integration and transformation processes to ensure data quality and consistency.
- Optimize and tune data pipelines for performance and cost-efficiency.
- Monitor and troubleshoot data pipeline issues to ensure data availability and reliability.
- Develop and maintain documentation for data pipelines, processes, and infrastructure.
- Stay up-to-date with the latest AWS services and best practices in data engineering.
- Leverage Apache Flink for real-time data processing and analytics, ensuring low-latency data handling.
- Employ Apache Kafka for stream processing and integrating data from various sources into the data pipelines.
Preferred Qualifications
- AWS Certified Developer – Associate or similar certification.
- Experience with microservices architecture.
- Knowledge of security best practices in cloud environments.