Introduction
The future is being built today, and Johnson Controls is making that future more productive, more secure, and more sustainable. We are harnessing the power of cloud, AI/ML and data analytics, the Internet of Things (IoT), and user design thinking to deliver on the promise of intelligent buildings and smart cities that connect communities in ways that make people’s lives and the world better. As the AI landscape expands to GenAI, we must continuously evolve and pivot to capitalize on these advancements and bring them through the maturity cycle to benefit our teams and our customers.
What You Will Do
- The OpenBlue Engineering, Data and Analytics team’s prime mission for Intelligence is to leverage data and advancing technologies to transform our products and our business. To this end, given the variety of data sources and platforms where data resides, we need to design and build automated pipelines and make the data available to analysts and applications. This role resides in a dedicated AI and Intelligence team whose charter includes building and deploying intelligence capabilities. The Principal Architect bridges the scientists and the enabling engineers to bring the full solution together in a scalable, repeatable way. The lifecycle for streamlining operations spans data supply strategy, data discovery, data modeling, AI model training and development, and deployment of intelligence in the cloud as well as at the edge.
- Taking solutions to production requires a combination of cloud engineering, data engineering, and software engineering. This entails building a unified, cohesive infrastructure that is generic enough to work for all use cases and modular enough to support tailored workflows for special cases. And this is where you come in. We are looking for a talented Principal Architect with industry experience to contribute to foundational platform and data engineering with repeatability in mind. The Principal Architect will be a key member of our Data/Analytics/AI team, working with data engineers, data scientists, AI scientists, product managers, platform engineers, and domain experts.
- Solutions Architecture: This role will translate business requirements into reference architectures that help productionize analytics solutions for Smart Buildings, keeping repeatability and scale in mind.
- Innovation and Proof of Concept: Be the first to experiment with new technologies, determine their primary utility and how they integrate with existing and new tools, and propose new or revised solutions as part of our portfolio. Work with Architecture and other Enterprise teams that are researching and experimenting with similar tools.
- Technical Writing: This role will translate the data requirements of applications into a high-quality, well-thought-out, cohesive implementation plan. Coordinate with Product and Engineering to understand timelines and help with build-vs-buy decisions.
How You Will Do It
- Own the OpenBlue AI Engineering solutions architecture roadmap and execution strategy, aligning with partner product groups within JCI to ensure timely integration and delivery.
- Work with product and engineering teams to architect and help productize end-to-end analytics solutions, accounting for variability in data sources and collection policies, modeling frameworks for cloud and edge, and serving infrastructure.
- Work with data scientists, DevOps engineers, and domain data engineers/SMEs to understand how data availability and quality affect model performance.
- Create and streamline workflows that allow fast experimentation with analytics solutions that must operate at large scale. This includes sound source control and branching methodologies, CI/CD, automated testing, and model monitoring and updates.
- Lead by example and mentor team members by demonstrating how to write well-designed, modular code that follows best software engineering principles and uses widely adopted frameworks and libraries.
- Evaluate partner, open-source, and proprietary technologies and present recommendations for onboarding potential vendors, automating data pipelines and workflows, data modeling for versioned experimentation, and digital feedback and monitoring.
- Architect and develop reference architectures for operationalizing machine learning, LLM, and conversational AI solutions tailored to unique and challenging use cases.
What we look for
Required
- BS in Computer Science, Electrical Engineering, or Computer Engineering, or a degree with demonstrated technical abilities in similar areas
- 7+ years of experience with technical solution architecture in one or more industry verticals, with a particular focus on data engineering and analytics
- 3+ years of experience with Cloud and IoT Solutions development
- 5+ years of experience with Python, Node.js, and SQL, developing microservices using standard frameworks
- Experience providing technical leadership and architectural improvements to the overall AI/ML, Edge ML, and LLM model development and operationalization process, suggesting improvements in tools, processes, and practices while working closely with data scientists and MLOps engineers at every stage of model development and deployment (data pre-processing, feature engineering, model problem formulation, deployment, and observability)
- Experience architecting MLOps strategies, best practices, and standards to enhance AI/ML model deployment and monitoring efficiency.
- Experience with the design, deployment, and management of scalable and reliable infrastructure for model training and deployment, using tools such as Azure Databricks, Azure cloud-native services, and MLflow.
- Experience developing conversational AI agents using OpenAI, Assistant AI, Semantic Kernel, Azure Speech Service, Azure Translator, Azure Bot Framework, Prompt flow, and Direct Line Speech
- Experience architecting event-driven architectures to support a microservices platform using containerized deployments in a cloud environment, with message brokers, caches, queues, and pub/sub concepts (Redpanda or similar tools) and container technologies such as Kubernetes and Docker; must have experience with API-first design that considers security, authentication, and authorization.
- Experience mentoring technical AI/ML engineering talent on the team, providing guidance and mentorship to junior MLOps engineers and fostering their professional growth and development.
- Experience troubleshooting production incidents and mentoring others through them, providing technical input and solutions to resolve high-priority/high-severity incidents.
- You will be expected to devise and implement AI strategies tailored to specific client needs, develop prototypes and proofs of concept (PoCs) to simplify complex integration points, and ensure the smooth integration of AI and data platforms into our clients’ technology ecosystem.
- Fluency in network concepts and implementation, firewall/routing, security, and identity/AAD (e.g., app registrations)
- Strong hands-on Infrastructure as Code skills with complex Terraform-based frameworks, covering the full stack from networking to database and data lake configuration, including central monitoring, in a data platform context
- Strong DevOps skills: Git and complex deployment pipelines, at both the IaC level and the application/data pipeline/database deployment level
- Experience working with message brokers, caches, queues, and pub/sub concepts; experience with Kafka and ksqlDB is a plus
- Understanding of AI/ML technologies; some experience using PyTorch, TensorFlow, or other frameworks is a plus
- Container experience using technologies such as Kubernetes, Docker, and AKS
- Knowledgeable in the Scrum/Agile development methodology
- Strong spoken and written communication skills.
Preferred Qualifications
- 10+ years of total experience in technical solution consulting and sales in one or more industry verticals, with a particular focus on data engineering and analytics
- 5+ years of experience with Cloud and IoT Solutions development
- Experience with Azure Data Factory pipeline concepts and implementation (including monitoring and metadata-driven pipelines)
- Knowledge of Power BI and DAX queries
- Fluency in SQL, RDBMS, data modeling, and (modern) data warehousing methodologies
- Experience developing LLMOps solutions for generative AI, working closely with engineers to develop a reference architecture and build on that design to support generative AI requirements
- Experience implementing and operating Snowflake-centric solutions, with an understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools, and proficiency in implementing data security measures, access controls, and design specifically within the Snowflake platform
- Subject matter expertise in technical solution engineering for at least one industry vertical, such as Smart Buildings, Energy, Finance, Healthcare, Security/Surveillance, Retail, or Smart Cities
- Cloud Certified: Azure Solutions Architect Expert, Azure AI Engineer Associate, or similar certifications.
Awards And Recognitions
- Great Place to Work-Certified, Oct 2023 - Oct 2024
- Honored with a NASSCOM Enterprise Cloud Adoption Award for our commitment to leveraging cloud technology for sustainability.