LLM Building Blocks for Python Course
Dive into LLM Building Blocks for Python, a concise 1.2-hour video course that equips you with everything you need to integrate large language models into your Python applications. You’ll learn to move beyond “text in → text out” by turning your prompts into structured data, orchestrating chat-style workflows, and building interactive prototypes. From rapid-fire notebook experiments to production-ready async pipelines and caching, this course gives you practical, code-first techniques for real-world LLM development.

What students are saying

I would encourage all my Python friends to buy this course. Michael Kennedy did an amazing job and it’s money well spent. I'm loving it! I'm totally unblocked [...] now! Onward!
-- Marty

Source code and course GitHub repository

github.com/talkpython/llm-building-blocks-for-python-course

What’s this course about and how is it different?

Most LLM tutorials focus on theory or standalone demos. This course shows you how to treat LLMs as reusable Python building blocks:

  • Vendor-Agnostic: Use Simon Willison’s llm library to swap between OpenAI, Anthropic, local models via Ollama, and more, without rewriting your code.
  • Structured Outputs: Learn schema-driven prompts with Pydantic to reliably parse JSON from the LLM.
  • Reactive Prototyping: Master Marimo’s dependency-graph notebook for rapid UI-driven exploration.
  • Production-Ready Workflows: Cover async requests, disk-based caching, performance measurement, and A/B testing so you can scale your LLM tasks.
  • Tool Spotlight: Compare smartfunc, Mirascope, and Instructor to find the right abstraction layer for your projects.

This isn’t just another “prompt engineering” overview. It’s a hands-on guide to embedding LLM capabilities directly into Python code, end to end.
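The structured-output idea above can be sketched in a few lines. The schema below (a support-ticket classifier with `category` and `urgency` fields) is illustrative, not from the course, and the model's reply is stubbed so the sketch runs offline; in practice the JSON text would come back from a prompt sent through the llm library.

```python
import json
from pydantic import BaseModel

# Hypothetical schema for a support-ticket classifier; the field names
# are assumptions for this sketch, not taken from the course material.
class Ticket(BaseModel):
    category: str
    urgency: int  # 1 (low) through 5 (high)

# Stand-in for the text an LLM would return when prompted with this schema.
raw_reply = '{"category": "billing", "urgency": 4}'

# Pydantic validates the JSON and gives you a typed Python object.
ticket = Ticket.model_validate_json(raw_reply)
print(ticket.category, ticket.urgency)  # billing 4
```

The payoff is that downstream code works with `ticket.urgency` as a checked `int`, not with raw strings.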

What topics are covered

By the end of this course, you’ll be able to:

  • Set up and use Marimo for live, reactive notebook experiments
  • Install and configure the llm library and its plugins for multiple vendors
  • Craft prompts with Pydantic schemas to enforce structured JSON outputs
  • Build and manage chat conversations programmatically in Python
  • Orchestrate async LLM calls and understand concurrency limits
  • Implement disk caching to save tokens, speed up development, and cut costs
  • Measure classification accuracy and benchmark LLM vs. scikit-learn pipelines
  • Conduct A/B tests on prompts to iteratively refine model outputs
  • Explore higher-level tools like smartfunc, Mirascope, Ollama, and Instructor
  • Design small “apps” inside Marimo to automate tasks such as YouTube transcript summarization
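The async-with-concurrency-limits skill in the list above follows a common pattern: fan out requests with `asyncio.gather` while a semaphore caps how many are in flight at once. This is a minimal sketch with a stubbed coroutine standing in for a real async LLM request; the function and prompt names are assumptions.

```python
import asyncio

async def fake_llm_call(prompt: str) -> str:
    # Stand-in for a real async LLM request; sleeps to simulate latency.
    await asyncio.sleep(0.01)
    return f"summary of: {prompt}"

async def run_all(prompts, max_concurrency=5):
    sem = asyncio.Semaphore(max_concurrency)

    async def bounded(p):
        async with sem:  # at most max_concurrency calls in flight
            return await fake_llm_call(p)

    # gather preserves input order regardless of completion order.
    return await asyncio.gather(*(bounded(p) for p in prompts))

results = asyncio.run(run_all([f"doc {i}" for i in range(10)]))
print(len(results))  # 10
```

The same semaphore pattern applies whether the inner call uses an async HTTP client or a vendor SDK; only `fake_llm_call` changes.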

Who Should Take This Course?

  • Python developers curious about adding LLM features to scripts, tools, or web apps
  • Data scientists wanting to prototype NLP workflows without deep ML expertise
  • DevOps/automation engineers looking to integrate AI-driven tasks into pipelines
  • Tech leads and architects evaluating LLM toolchains for production use
  • Educators and researchers who need structured LLM interactions in their code

You should already be comfortable writing Python scripts, using pip, and handling basic virtual environments.

Key Chapter Highlights

  • Course Introduction: Meet your instructor, understand the course goals, and get your local environment (Marimo, uv, Git) ready.
  • Building Blocks: Learn the foundational tools (Marimo reactive notebooks, the llm library, Pydantic schemas) for robust prompt-to-data pipelines.
  • Performance and Measurements: Discover async requests, disk caching, and accuracy benchmarking to scale LLM tasks cost-effectively.
  • Tools to Consider: Compare higher-level abstractions (smartfunc, Mirascope, Ollama, Instructor) to streamline your workflow.
  • Conclusion: Review methodology, production pitfalls, and fallback strategies for hosted vs. open-source models.

Follow along with subtitles and transcripts

Each course comes with subtitles and full transcripts. The transcripts are available as a separate searchable page for each lecture. They are also included in course-wide search results to help you find just the right lecture.

Each course has subtitles available in the video player.

Who am I? Why should you take my course?

Who is Vincent D. Warmerdam?

My name is Vincent, nice to meet you. ;) I'm a senior data professional who has worked as an engineer, researcher, team lead, and educator. I'm well known for my PyData talks as well as many side projects for machine learning practitioners. In particular, I maintain calmcode.io, where people can learn how to code … calmly.

This course is delivered in very high resolution

Example of 1440p high res video

This course is delivered in 1440p (4x the pixels of 720p). When you're watching the videos for this course, it will feel like you're sitting next to the instructor looking at their screen.

Every little detail, menu item, and icon is clear and crisp. Watch the introductory video at the top of this page to see an example.

Money-Back Guarantee Details

We want every student to feel 100% confident enrolling. If "LLM Building Blocks for Python" doesn’t meet your expectations, let us know within 15 days of purchase and we’ll refund you in full: no forms, no hassle, no hard feelings.

Simply email our support team (contact@talkpython.fm) with your registered email address, and your refund will be processed within several business days. We’re certain the speed gains and clean-code patterns you’ll learn are worth far more than the tuition, but you get to decide.

Corporate / Team Licenses

Empower your whole engineering team to build LLM-based apps in one cohesive curriculum.

  • Volume discounts start at 5 seats and scale up for larger groups.
  • Centralized billing and a single invoice for easy reimbursement.
  • Private progress dashboard so team leads can track completion.

Interested? Email sales@talkpython.fm with the number of learners and your preferred billing currency, and we’ll craft a package that fits your organization.

Tech Requirements & Setup

  • Python 3.9+ installed on macOS, Linux, or Windows
  • A terminal with Git, pip, and a virtual-env tool (venv, uv, etc.)
  • Marimo notebook (pip install marimo) for reactive prototyping
  • llm library (pip install llm python-dotenv) plus any vendor plugin (e.g. pip install llm-anthropic)
  • Valid LLM API keys (OpenAI, Anthropic, Ollama, or local models) stored in a .env file
  • Disk-cache backend (pip install diskcache) for offline caching
  • Optional: Ollama CLI for local LLM hosting; Mirascope or Instructor for higher-level abstractions
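Putting the list above together, a setup sequence on macOS or Linux might look like the following. This is one possible arrangement, not an official script; the `llm keys set` command is part of the llm CLI, and the plugin line should match your vendor.

```shell
# Create and activate an isolated environment
python -m venv .venv && source .venv/bin/activate

# Core packages from the requirements list
pip install marimo llm python-dotenv diskcache

# Optional vendor plugin (swap for your provider)
pip install llm-anthropic

# Store an API key for the llm CLI, or keep keys in a .env file
llm keys set openai
echo "OPENAI_API_KEY=sk-..." >> .env
```

On Windows, replace the `source` line with `.venv\Scripts\activate`.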

Ready to level-up your Python projects with LLM power?

Enroll now for instant access to all video lessons, code samples, and notebooks. No fluff, just building blocks.

Course Outline: Chapters and Lectures

Buy for $19 + tax, or bundle and save 79%.

Questions? Send us an email: contact@talkpython.fm
