Company Description
👋🏼We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people are everywhere in the world (17,500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That's where you come in!
Job Description
Requirement:
- 7.5+ years of total experience in data operations, ETL development, and technical solutioning.
- Proven experience in leading large-scale data initiatives and RFP responses.
- Strong hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS, DataStage) and data pipeline orchestration tools (e.g., Apache Airflow, Azure Data Factory).
- Exposure to multiple technology stacks including cloud platforms (AWS, Azure, GCP), databases (SQL, NoSQL), and big data ecosystems (Hadoop, Spark).
- Experience with monitoring tools (e.g., Splunk, Dynatrace, Prometheus, Grafana) and ITSM platforms (e.g., ServiceNow, JIRA).
- Experience with CI/CD, DevOps practices, and monitoring tools for data environments. Proven track record of process automation and performance tuning in complex data landscapes.
- Excellent communication and stakeholder management skills.
Responsibility:
- Lead the technical and business solutioning of RFPs and client proposals related to data engineering, data operations, and platform modernization.
- Collaborate with architecture, governance, and business teams to align on technical strategy and data management standards.
- Continuously evaluate emerging technologies and frameworks to modernize legacy systems and improve efficiency.
- Be ready to lead delivery whenever required.
- Mentor and guide a team of data engineers and operations analysts, fostering a culture of technical excellence and continuous improvement.
- Act as a primary liaison with business, support, and governance teams for operational matters.
- Contribute to maintaining compliance with security, audit, and regulatory standards in operations.
- Drive automation and self-healing mechanisms to reduce manual interventions and improve system resilience.
- Drive process automation across data operations using scripts, tools, and workflow orchestration.
- Implement and enforce best practices including metadata management, lineage tracking, data quality monitoring, and master data management.
Qualifications
Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.