Course Overview
In this course, you will learn about the challenges that arise when productionizing generative AI-powered applications compared with traditional ML. You will learn how to manage experimentation and tuning of your LLMs, and how to deploy, test, and maintain your LLM-powered applications. Finally, you will explore best practices for logging and monitoring LLM-powered applications in production.
Who should attend
Developers and machine learning engineers who wish to operationalize generative AI-based applications.
Prerequisites
Completion of Introduction to Developer Efficiency with Gemini on Google Cloud (IDEGC) or equivalent knowledge.
Course Objectives
- Describe the challenges in productionizing applications using generative AI.
- Manage experimentation and evaluation for LLM-powered applications.
- Productionize LLM-powered applications.
- Implement logging and monitoring for LLM-powered applications.