Position Overview
Birlasoft is seeking a visionary Senior/Lead Cloud and Data Architect with proven leadership capabilities to drive enterprise-level data solutions. The ideal candidate will have expertise in cloud platforms (AWS, Azure, or GCP), mandatory experience in Databricks or Snowflake, and a strong foundation in data warehousing, data modeling, DevOps, and DataOps. Additionally, the role demands a strategic leader with an understanding of machine learning (ML) concepts and the ability to function as an Enterprise Architect, providing guidance and alignment across diverse teams and initiatives.
Key Responsibilities
Leadership and Collaboration:
- Lead and inspire cross-functional teams, including data engineers, data scientists, ML engineers, and BI analysts, fostering a culture of collaboration, innovation, and continuous improvement.
- Act as the central technical point of contact for enterprise-wide data and analytics initiatives, ensuring alignment with business objectives.
- Collaborate with senior leadership and business stakeholders to define and deliver on data strategy, ensuring scalability and alignment with enterprise goals.
- Mentor and upskill team members, providing guidance on best practices, emerging technologies, and professional development.
Enterprise Architecture:
- Develop and maintain enterprise-level architectural blueprints for cloud and data platforms, ensuring interoperability and scalability.
- Create a cohesive architecture that integrates data platforms, ML systems, and business intelligence tools while adhering to governance and compliance requirements.
- Provide strategic input on technology roadmaps, ensuring alignment with organizational vision and future-proofing investments.
- Evaluate and recommend emerging technologies and frameworks to enhance enterprise data capabilities.
Cloud and Data Platform Engineering:
- Design and implement cutting-edge cloud architectures using AWS, Azure, or GCP, focusing on scalability, security, and cost optimization.
- Build and optimize modern data platforms using Databricks or Snowflake for advanced analytics and real-time data processing.
- Lead the development of end-to-end data pipelines, ensuring robust integration between data sources, platforms, and analytics tools.
Data and ML Integration:
- Collaborate with ML teams to deploy machine learning pipelines and operationalize AI models within enterprise systems.
- Provide architectural support for ML initiatives, including data preparation, feature engineering, and model lifecycle management.
- Enable seamless integration of ML and analytics into business workflows, enhancing decision-making and operational efficiency.
Data Warehousing and Modeling:
- Architect enterprise data warehouses with robust data models, ensuring high performance and reliability for analytics workloads.
- Lead the development of advanced data models that cater to both analytical and operational requirements, emphasizing scalability and data quality.
DevOps and DataOps Practices:
- Establish and enforce DevOps and DataOps pipelines to automate deployments, enhance agility, and ensure operational excellence.
- Drive initiatives for continuous improvement in data delivery, ensuring high availability, data quality, and scalability.
Governance, Security, and Compliance:
- Define and enforce data governance policies, including data security, lineage, and compliance with industry regulations.
- Implement robust security measures to protect sensitive data across platforms, adhering to privacy standards such as GDPR and CCPA.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 10+ years of experience in data architecture, cloud platforms, and advanced analytics.
- Hands-on experience with at least two of the major cloud platforms (AWS, Azure, GCP) is mandatory.
- Expertise in Databricks or Snowflake is required.
- Comprehensive understanding of data warehousing, data modeling, and data integration.
- Experience implementing ML pipelines or integrating ML solutions with enterprise systems.
- Proven track record in enterprise architecture, aligning technology solutions with business goals.
Technical Skills:
- Proficiency in programming languages such as Python, SQL, or Scala.
- Familiarity with ML frameworks like TensorFlow, PyTorch, or Scikit-learn is a plus.
- Deep understanding of big data frameworks, distributed systems, and real-time processing.
- Experience with DevOps tools like Terraform, Kubernetes, or CI/CD platforms.
Leadership Skills:
- Strong strategic thinking with the ability to define and execute large-scale data and cloud strategies.
- Exceptional communication skills to engage with both technical and non-technical stakeholders.
- Proven ability to lead diverse teams across geographies and time zones, ensuring high performance and cohesion.
- Capable of making data-driven decisions while managing competing priorities and deadlines.
Preferred Qualifications:
- Certifications in AWS, Azure, GCP, Databricks, or Snowflake.
- Experience working in enterprise-scale environments with complex data landscapes.
- Exposure to industry-specific data challenges in domains like finance, insurance, or healthcare.
This role offers the opportunity to lead transformative data initiatives at an enterprise level, combining cutting-edge cloud and data technologies with leadership and innovation. If you have a passion for building scalable data solutions and leading diverse teams to success, we encourage you to apply.