ksm26/Serverless-LLM-apps-with-Amazon-Bedrock

💻 Welcome to the "Serverless LLM apps with Amazon Bedrock" course! Instructed by Mike Chambers, Developer Advocate for Generative AI at AWS, this course will teach you how to deploy Large Language Model (LLM)-based applications into production using serverless technology with Amazon Bedrock.

Course Website: 📚 deeplearning.ai

Course Summary

In this course, you'll learn the ins and outs of deploying LLM-based applications using serverless technology. Here's what you can expect to learn and experience:

  1. 🛠 Prompting and Customizing LLM Responses: Learn how to prompt and customize your LLM responses using Amazon Bedrock (a minimal sketch follows this list).
  2. 🔊 Summarizing Audio Conversations: Summarize audio conversations by transcribing audio files and passing the transcription to an LLM.
  3. ⚙️ Deploying Event-driven Audio Summarizer: Deploy an event-driven audio summarizer that runs as new audio files are uploaded using a serverless architecture.
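
For a concrete sense of what the prompting and summarization lessons involve, here is a minimal Python sketch using boto3's bedrock-runtime client. It is not the course's own code: the AWS region, the Titan model ID, the prompt wording, and the generation parameters are all assumptions you would adapt to your own account and model access.

```python
import json

import boto3

# Bedrock runtime client (the region is an assumption; use one where your
# account has Bedrock model access enabled).
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-west-2")


def summarize(transcript: str) -> str:
    """Ask a Bedrock text model to summarize a conversation transcript."""
    prompt = (
        "Summarize the following customer support conversation, "
        "including overall sentiment and any follow-up actions:\n\n"
        f"{transcript}"
    )

    # Request body follows the Amazon Titan Text schema; other Bedrock
    # model families expect different body formats.
    body = json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": 512, "temperature": 0.0},
    })

    response = bedrock_runtime.invoke_model(
        modelId="amazon.titan-text-express-v1",  # assumed model choice
        body=body,
        contentType="application/json",
        accept="application/json",
    )
    result = json.loads(response["body"].read())
    return result["results"][0]["outputText"]
```

Calling summarize() on a raw transcript string returns the model's summary text; note that the request/response schema shown here is specific to Titan Text models.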

Key Points

  • 🧠 Learn how to prompt and customize your LLM responses using Amazon Bedrock.
  • 🎙 Summarize audio conversations by transcribing audio files and passing the transcription to an LLM.
  • ⚡ Deploy an event-driven audio summarizer using a serverless architecture (see the sketch below).
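
As a rough sketch of that event-driven flow (again, not the course's exact implementation), the Lambda handler below could be wired to an S3 ObjectCreated trigger so that each new audio upload starts an Amazon Transcribe job; a downstream step would then pass the finished transcript to the Bedrock summarization call shown earlier. The MediaFormat value, the TRANSCRIPT_BUCKET environment variable, and the single-record assumption are illustrative choices, not requirements.

```python
import os
import uuid

import boto3

transcribe = boto3.client("transcribe")


def lambda_handler(event, context):
    """Triggered by an S3 ObjectCreated event for each new audio upload."""
    record = event["Records"][0]  # assumes one upload per invocation
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Kick off an asynchronous transcription job. A later step (for
    # example, another Lambda watching the output bucket) would fetch the
    # transcript JSON and feed it to the Bedrock summarization call.
    transcribe.start_transcription_job(
        TranscriptionJobName=f"summarizer-{uuid.uuid4()}",
        Media={"MediaFileUri": f"s3://{bucket}/{key}"},
        MediaFormat="mp3",  # assumption: uploads are .mp3 files
        LanguageCode="en-US",
        OutputBucketName=os.environ["TRANSCRIPT_BUCKET"],  # hypothetical env var
    )
    return {"statusCode": 200, "body": f"Started transcription for {key}"}
```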

About the Instructor

🌟 Mike Chambers is a Developer Advocate for Generative AI at AWS and co-instructor of Generative AI with Large Language Models. With extensive experience, Mike will guide you through deploying serverless LLM applications with Amazon Bedrock.

🔗 To enroll in the course or for further information, visit deeplearning.ai.