Ingram Micro touches 80% of the technology you use every day with our focus on Technology Solutions, Cloud, and Commerce and Lifecycle Solutions. With $50 billion in revenue, we have become the world’s largest technology distributor with operations in 64 countries and more than 35,000 associates.
Experience:
- Total IT Experience: 8 to 15 years
- AI Experience: 3 to 5 years
- Gen AI Experience: 1 to 2 years
Key Responsibilities:
- Design, Architect, and Develop AI Solutions: Design, architect, and develop scalable and reliable AI solutions on AWS or any other cloud platform.
- AI Services Expertise: Proficient in cloud and AI services, including EC2, Auto Scaling, S3, RDS, DynamoDB, SageMaker, ML Studio, AI Studio, Vertex AI, and IBM Watson.
- AI Architecture Management: Implement and manage AI architecture, tools, techniques, and frameworks.
- Data Handling and Analysis: Expertise in RAG, knowledge/vector databases, embeddings, indexing, knowledge graphs, cosine similarity, and search (see the retrieval sketch after this list).
- Gen AI Solutions Development: Design, develop, train/tune/transfer learning, validate, deploy, manage/monitor, and optimize Gen AI solutions.
- Transformer and Generative Architectures: Understanding of the Transformer architecture, as well as generative models such as GANs and VAEs.
- LLM Fine-Tuning and Prompt Engineering: Expertise in fine-tuning large language models (LLMs) for specific tasks, applying prompt engineering techniques such as few-shot learning and prompt chaining, and optimizing prompts (see the prompting sketch after this list).
- Multimodal AI Techniques: Knowledge of multimodal AI techniques, including vision-language models (e.g., CLIP, DALL-E), speech-language models, and multimodal fusion techniques.
- Responsible AI Practices: Understanding of responsible AI practices, techniques for detecting and mitigating bias in LLMs, and strategies for promoting fairness and transparency.
- LLM Evaluation Metrics: Familiarity with evaluation metrics specific to LLMs, such as perplexity, BLEU score, and metrics for coherence, consistency, and factuality (see the metrics sketch after this list).
- Production Deployment of LLMs: Experience deploying LLMs in production environments, including containerization, scaling, and monitoring techniques for LLM models and pipelines; understanding of NVIDIA NIM, OpenAI, Hugging Face, etc.
- Security and Privacy: Knowledge of security and privacy considerations when working with LLMs, such as data privacy, model extraction attacks, and techniques for secure model deployment.
- Continual Learning for LLMs: Understanding of continual learning techniques for LLMs, such as rehearsal, replay, and parameter isolation, to enable efficient adaptation to new data and tasks.
- Model Creation and Evaluation: Creating models from training and test datasets, evaluating them against appropriate metrics, and performing hyperparameter tuning (see the scikit-learn sketch after this list).
- Machine Learning: Proficient in scikit-learn for supervised and unsupervised ML, including NLP, recommender systems, anomaly detection, and time series.
- Custom AI Solutions: Experience with custom object detection, speech recognition, and image classification/recognition.
- Deep Learning: Implementing deep learning workloads on NVIDIA or other GPU-based hardware; building, training, and testing SLMs, LLMs, and other models.
- LLM/Model Selection and Evaluation:
- Select and evaluate LLMs based on use case requirements, cost, performance, and efficiency.
- Build and apply evaluation and accuracy metrics to ensure optimal model performance.
- Data Engineering and Management:
- Design and manage data pipelines for AI applications using AWS data services such as S3, Redshift, and Kinesis.
- Experience with Python and SQL, and working knowledge of Java.
- Clean, transform, and prepare data for model training and inference (see the data preparation sketch after this list).
- DevOps/LLMOps Skills:
- Automate and streamline deployment pipelines for AI applications using CodePipeline, CodeBuild, and CodeDeploy.
- Implement configuration management and infrastructure as code (IaC) using CloudFormation and Elastic Beanstalk.
- Security and Compliance:
- Ensure security and compliance of AI solutions on AWS using IAM, KMS, WAF, and other AWS security services.
- Implement VPC security best practices and maintain security compliance and governance on AWS.
- Development and Deployment:
- Develop, train, and deploy machine learning models on AWS.
- Implement AI cognitive services, embedding/vector DBs, and Gen AI enterprise integration (e.g., OpenAI, Hugging Face).
- AI and Gen AI:
- Expertise in machine learning and deep learning models, including Transformers and GPT.
- Experience with AI cognitive services and embedding/vector DB.
- Gen AI Enterprise Integration (OpenAI, Hugging Face, SAP, DFSC, etc.)
- Gen AI Use Cases: Conversational AI, summarization, extraction, document processing, task automation, enterprise automation, RPA, etc., across domains such as Sales, Services, Customer Operations, Manufacturing, and Logistics.
- Organizational Change Management (OCM)
- AI CoE and AI Factory Setup: processes, templates, formats, and org structure
- LLMOps/DevOps:
- CI/CD automation, configuration management, and cloud solutions deployment.
- Experience with AWS Lambda, API Gateway, ECS, and monitoring tools like CloudWatch.
- Data Engineering:
- Design data pipelines, storage solutions, and database management.
- Utilize analytics tools like Glue, Athena, and QuickSight.
- Data management using LLMs.
- Security and Compliance:
- Proficiency in IAM, KMS, and implementing security compliance.
- Project Management and Documentation:
- Lead AI projects using Agile methodology and Gen AI program management.
- Prepare comprehensive documentation, including blueprints, explainable AI, and metrics.
- AI Strategy, Responsible AI, Risk Management Framework
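
As a minimal illustration of the retrieval skills listed above (embeddings, vector search, cosine similarity), the Python sketch below ranks documents against a query by cosine similarity. It is only a sketch: the `embed` function here is a hypothetical placeholder that hashes text into a deterministic vector, standing in for whatever embedding model or vector DB an actual solution would use.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding function -- in practice this would call an
    embedding model or service rather than hashing the text into a
    deterministic pseudo-random vector."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=384)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: dot product of the vectors divided by their norms."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query: str, documents: list[str], top_k: int = 3) -> list[tuple[str, float]]:
    """Score every document against the query and return the top_k matches."""
    q_vec = embed(query)
    scored = [(doc, cosine_similarity(q_vec, embed(doc))) for doc in documents]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

if __name__ == "__main__":
    docs = ["Order status lookup", "Warranty claim process", "Invoice reconciliation"]
    for doc, score in retrieve("Where is my order?", docs, top_k=2):
        print(f"{score:+.3f}  {doc}")
```

In a production RAG pipeline the per-query loop would be replaced by an indexed vector database lookup; the scoring idea is the same.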
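
The prompting sketch below shows the two techniques named in the prompt engineering item: a few-shot prompt (labelled examples steer the output format) and prompt chaining (one call's output feeds the next prompt). The `llm` callable and the example prompts are illustrative assumptions, not tied to any particular provider.

```python
from typing import Callable

# Few-shot prompt: a handful of labelled examples demonstrate the task and the
# expected output format before the model sees the real input.
FEW_SHOT_CLASSIFY = """Classify the support ticket as Hardware, Software, or Billing.

Ticket: "My laptop will not power on."    -> Hardware
Ticket: "I was charged twice this month." -> Billing
Ticket: "The app crashes when I export."  -> Software
Ticket: "{ticket}" ->"""

def summarize_then_extract(document: str, llm: Callable[[str], str]) -> str:
    """Prompt chaining: the output of the first call becomes part of the
    second prompt. `llm` is any callable that maps a prompt to model text."""
    summary = llm(f"Summarize the following document in three sentences:\n{document}")
    return llm(f"From this summary, list the action items as bullet points:\n{summary}")

if __name__ == "__main__":
    # Stand-in "model" so the sketch runs without any external service.
    echo_llm = lambda prompt: f"[model output for: {prompt[:40]}...]"
    print(FEW_SHOT_CLASSIFY.format(ticket="My invoice total looks wrong."))
    print(summarize_then_extract("Quarterly logistics review notes...", echo_llm))
```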
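
The metrics sketch below illustrates two of the LLM evaluation metrics mentioned above: perplexity computed from per-token log-probabilities, and sentence-level BLEU via NLTK. It assumes the optional nltk package is available and that token log-probabilities are obtained from the model being evaluated.

```python
import math
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def perplexity(token_log_probs: list[float]) -> float:
    """Perplexity = exp of the average negative log-likelihood per token."""
    return math.exp(-sum(token_log_probs) / len(token_log_probs))

def bleu(reference: str, candidate: str) -> float:
    """Sentence-level BLEU with smoothing; whitespace tokenization for brevity."""
    return sentence_bleu(
        [reference.split()],
        candidate.split(),
        smoothing_function=SmoothingFunction().method1,
    )

if __name__ == "__main__":
    # Log-probabilities would come from the model's output for each generated token.
    print(perplexity([-0.2, -1.1, -0.4, -0.9]))
    print(bleu("the order shipped on monday", "the order was shipped monday"))
```

Coherence, consistency, and factuality are typically judged with task-specific rubrics or model-based evaluators rather than a single closed-form metric.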
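
The scikit-learn sketch below covers the model creation, evaluation, and hyperparameter tuning workflow described above: a train/test split, cross-validated grid search, and evaluation on held-out data. The dataset and estimator are illustrative only.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Split the data into training and test sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Hyperparameter tuning via cross-validated grid search on the training set.
param_grid = {"n_estimators": [50, 100], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=5)
search.fit(X_train, y_train)

# Evaluate the best model on the held-out test set.
print("Best params:", search.best_params_)
print("Test accuracy:", search.best_estimator_.score(X_test, y_test))
```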
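
The data preparation sketch below shows the cleaning, transformation, and preparation steps listed under data engineering, using pandas. The columns and values are made up for illustration; in practice the data would arrive from sources such as S3, Redshift, or Kinesis.

```python
import pandas as pd

# Illustrative raw extract (hypothetical columns and values).
raw = pd.DataFrame({
    "order_id": [101, 102, 102, 103],
    "amount": ["250.00", "99.5", "99.5", None],
    "country": ["US", "us ", "us ", "DE"],
    "order_date": ["2024-01-05", "2024-01-06", "2024-01-06", "2024-01-07"],
})

# Cleaning: drop duplicate rows and rows missing a required field.
clean = raw.drop_duplicates().dropna(subset=["amount"])

# Transformation: normalize types and categorical values.
clean = clean.assign(
    amount=pd.to_numeric(clean["amount"]),
    country=clean["country"].str.strip().str.upper(),
    order_date=pd.to_datetime(clean["order_date"], errors="coerce"),  # bad dates become NaT
)

# Preparation: derive a simple model-ready feature.
features = clean.assign(order_month=clean["order_date"].dt.month)
print(features)
```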
Languages:
- Proficiency in Python, PyTorch, TensorFlow, LangChain (agents, tools, chains, etc.), Streamlit/Chainlit, and SQL, with working knowledge of Java.
Certifications (any of the following, or any other AI certification):
- AWS Machine Learning Specialty
- AWS Data Engineer Associate
- AWS Certified AI Practitioner
Additional Responsibilities:
- AI CoE and AI Factory Setup: Establish processes, templates, and organizational structures.
- Organizational Change Management (OCM): Manage change processes to facilitate AI adoption.
- Training and Skill Development: Conduct training sessions and keep up-to-date with the latest Gen AI developments.
- Customer and Vendor Management: Ensure vendors adhere to Ingram's Responsible AI Framework and provide support for AI implementations.
Desired Skills:
- Strong design, coding/development, documentation, communication, and training skills.
- Up-to-date knowledge of the latest developments in the Gen AI world.
This role is crucial for driving Ingram Micro's AI and Gen AI initiatives, ensuring high-quality, scalable solutions that meet both internal and external requirements.
Ingram Micro is committed to creating a diverse environment and is proud to be an equal opportunity employer. We are dedicated to fostering an inclusive and accessible environment where all associates are valued, respected, and supported. We are highly driven by our tenets of success: Results, Integrity, Imagination, Responsibility, Courage, and Talent.
- This is not a complete listing of the job duties. It’s a representation of the things you will be doing, and you may not perform all of these duties.