Function calling (also known as tool use) is one of the most powerful features of modern LLMs. It lets the model decide when and how to invoke external functions — enabling your AI application to interact with APIs, databases, and real-world systems.
Instead of generating a text response, the model can output a structured request to call a function you have defined. Your application then executes the function and returns the result to the model.
┌──────────┐    "What's the weather     ┌──────────┐
│          │        in London?"         │          │
│   User   │───────────────────────────▶│   LLM    │
│          │                            │          │
└──────────┘                            └────┬─────┘
                                             │
                     Function call:          │
                     get_weather(            │
                       city="London"         │
                     )                       ▼
                                        ┌──────────┐
                                        │   Your   │
                     Execute            │   Code   │
                     function           │          │
                                        └────┬─────┘
                                             │
                     Result:                 │
                     {"temp": 15,            │
                      "condition":           │
                      "cloudy"}              ▼
                                        ┌──────────┐
                                        │   LLM    │
                     Generate           │  (with   │
                     response           │  result) │
                                        └────┬─────┘
                                             │
                     "It's 15°C and          │
                      cloudy in              │
                      London."               ▼
                                        ┌──────────┐
                                        │   User   │
                                        └──────────┘
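The round trip above can be sketched as a simple loop: call the model, execute any function it requests, feed the result back, and stop once the model returns plain text. The sketch below uses a stand-in `fake_llm` function (hypothetical, hard-coded for this one exchange) in place of a real model API, so only the loop structure is the point here.

```python
import json

def get_weather(city):
    # In a real application this would call a weather API.
    return {"temp": 15, "condition": "cloudy"}

# Registry mapping tool names to the functions your code will execute.
TOOLS = {"get_weather": get_weather}

def fake_llm(messages):
    # Stand-in for a real model API (hypothetical): first it requests a
    # function call; once a tool result is in the conversation, it answers.
    if any(m["role"] == "tool" for m in messages):
        result = json.loads(messages[-1]["content"])
        return {"role": "assistant",
                "content": f"It's {result['temp']}°C and {result['condition']} in London."}
    return {"role": "assistant", "content": None,
            "tool_call": {"name": "get_weather", "arguments": {"city": "London"}}}

def run(user_message):
    messages = [{"role": "user", "content": user_message}]
    while True:
        reply = fake_llm(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]  # final text answer for the user
        # Execute the requested function and return its result to the model.
        result = TOOLS[call["name"]](**call["arguments"])
        messages.append(reply)
        messages.append({"role": "tool", "content": json.dumps(result)})

print(run("What's the weather in London?"))
# → It's 15°C and cloudy in London.
```

With a real provider SDK, `fake_llm` would be replaced by an API call and the loop would parse the provider's tool-call format, but the control flow stays the same.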
Tools are defined using JSON Schema: each tool has a name, a description, and a schema describing its parameters.
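A definition for the `get_weather` tool from the example might look like this. The exact envelope around the schema varies by provider; the `type`/`function` wrapper below follows the common OpenAI-style convention and is an assumption, while the `parameters` object is plain JSON Schema.

```python
# Hedged sketch of a tool definition. The outer "type"/"function" wrapper
# is the OpenAI-style convention (an assumption); other providers wrap the
# same JSON Schema core in slightly different envelopes.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {  # standard JSON Schema
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name, e.g. 'London'",
                },
            },
            "required": ["city"],
        },
    },
}
```

The `description` fields matter: the model relies on them to decide when the tool applies and how to fill in its arguments.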