Function calling (also known as tool use) is one of the most powerful features of modern LLMs. It lets the model decide when and how to invoke external functions — enabling your AI application to interact with APIs, databases, and real-world systems.
Instead of generating a text response, the model can output a structured request to call a function you have defined. Your application then executes the function and returns the result to the model, which uses it to produce its final answer.
┌──────────┐   "What's the weather        ┌──────────┐
│          │      in London?"             │          │
│   User   │─────────────────────────────▶│   LLM    │
│          │                              │          │
└──────────┘                              └────┬─────┘
                                               │
                      Function call:           │
                      get_weather(             │
                        city="London"          │
                      )                        ▼
                                          ┌──────────┐
                                          │   Your   │
                      Execute             │   Code   │
                      function            │          │
                                          └────┬─────┘
                                               │
                      Result:                  │
                      {"temp": 15,             │
                       "condition":            │
                       "cloudy"}               ▼
                                          ┌──────────┐
                                          │   LLM    │
                      Generate            │  (with   │
                      response            │  result) │
                                          └────┬─────┘
                                               │
                      "It's 15°C and           │
                       cloudy in               │
                       London."                ▼
                                          ┌──────────┐
                                          │   User   │
                                          └──────────┘
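Concretely, the "function call" step arrives as an assistant message carrying a tool_calls list instead of text. A minimal sketch of what one such entry contains and how to parse it (field names follow the OpenAI Chat Completions response format; the id and values here are invented):

```python
import json

# Hand-written example of one entry in message.tool_calls after the
# model decides to call get_weather (id and values are made up).
tool_call = {
    "id": "call_abc123",  # hypothetical id
    "type": "function",
    "function": {
        "name": "get_weather",
        # Note: arguments is a JSON-encoded *string*, not a dict.
        "arguments": '{"city": "London", "units": "celsius"}',
    },
}

# Decode the argument string before dispatching to your own code.
args = json.loads(tool_call["function"]["arguments"])
print(tool_call["function"]["name"], args)  # get_weather {'city': 'London', 'units': 'celsius'}
```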
Tools are defined using JSON Schema:
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "The city name, e.g. 'London'",
                    },
                    "units": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "Temperature unit",
                    },
                },
                "required": ["city"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_time",
            "description": "Get the current time in a timezone",
            "parameters": {
                "type": "object",
                "properties": {
                    "timezone": {
                        "type": "string",
                        "description": "An IANA timezone, e.g. 'Europe/London'",
                    },
                },
                "required": ["timezone"],
            },
        },
    },
]
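The schema is advisory: the model usually conforms to it, but your code should still check incoming arguments before executing anything. Below is a hand-rolled check of just the required and enum constraints, for illustration only; a real app might use a full JSON Schema validator such as the jsonschema package instead:

```python
def check_args(schema: dict, args: dict) -> list[str]:
    """Return a list of problems with `args` against a parameters schema.

    Only checks `required` keys and `enum` values -- a deliberately
    minimal sketch, not a complete JSON Schema validator.
    """
    problems = []
    for key in schema.get("required", []):
        if key not in args:
            problems.append(f"missing required argument: {key}")
    for key, spec in schema.get("properties", {}).items():
        if key in args and "enum" in spec and args[key] not in spec["enum"]:
            problems.append(f"invalid value for {key}: {args[key]!r}")
    return problems


weather_params = {
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "units": {"type": "string", "enum": ["celsius", "fahrenheit"]},
    },
    "required": ["city"],
}

print(check_args(weather_params, {"units": "kelvin"}))
# ["missing required argument: city", "invalid value for units: 'kelvin'"]
print(check_args(weather_params, {"city": "London"}))
# []
```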
import json
from openai import OpenAI

client = OpenAI()

# Your actual function implementations
def get_weather(city: str, units: str = "celsius") -> dict:
    # In a real app, call a weather API
    return {"city": city, "temp": 15, "condition": "cloudy", "units": units}

def get_time(timezone: str) -> dict:
    return {"timezone": timezone, "time": "14:30"}

# Map function names to implementations
available_functions = {
    "get_weather": get_weather,
    "get_time": get_time,
}
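One practical detail about this mapping: if the model hallucinates a function name that isn't in it, a bare dictionary lookup raises KeyError. A small guard (illustrative; the get_weather stub repeats the implementation above so the snippet runs on its own):

```python
import json

def get_weather(city: str, units: str = "celsius") -> dict:
    # Stub repeated from above so this snippet is self-contained.
    return {"city": city, "temp": 15, "condition": "cloudy", "units": units}

available_functions = {"get_weather": get_weather}

def call_tool(fn_name: str, arguments_json: str) -> dict:
    fn = available_functions.get(fn_name)
    if fn is None:
        # Returning an error payload (instead of raising) lets the model
        # see the failure and recover in its next turn.
        return {"error": f"unknown function: {fn_name}"}
    return fn(**json.loads(arguments_json))

print(call_tool("get_weather", '{"city": "London"}'))
print(call_tool("get_stock_price", "{}"))
# {'error': 'unknown function: get_stock_price'}
```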
def run_with_tools(user_message: str) -> str:
    messages = [{"role": "user", "content": user_message}]

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        tools=tools,
    )
    message = response.choices[0].message

    # If the model wants to call a function
    while message.tool_calls:
        messages.append(message)

        for tool_call in message.tool_calls:
            fn_name = tool_call.function.name
            fn_args = json.loads(tool_call.function.arguments)

            # Execute the function
            result = available_functions[fn_name](**fn_args)

            # Add the result to the conversation
            messages.append({
                "role": "tool",
                "tool_call_id": tool_call.id,
                "content": json.dumps(result),
            })

        # Get the next response
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=messages,
            tools=tools,
        )
        message = response.choices[0].message

    return message.content
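To see the loop's mechanics without an API key, you can fake one round trip: a stand-in for response.choices[0].message whose tool_calls attribute holds one call. SimpleNamespace mimics the SDK's attribute access here; the id and argument values are invented:

```python
import json
from types import SimpleNamespace

def get_weather(city: str, units: str = "celsius") -> dict:
    # Stub repeated from above so this snippet is self-contained.
    return {"city": city, "temp": 15, "condition": "cloudy", "units": units}

available_functions = {"get_weather": get_weather}

# Fake message: what response.choices[0].message might look like
# when the model requests a call (id and values are made up).
message = SimpleNamespace(
    tool_calls=[
        SimpleNamespace(
            id="call_1",
            function=SimpleNamespace(
                name="get_weather",
                arguments='{"city": "London"}',
            ),
        )
    ]
)

# Same body as the for-loop inside run_with_tools:
messages = []
for tool_call in message.tool_calls:
    fn_args = json.loads(tool_call.function.arguments)
    result = available_functions[tool_call.function.name](**fn_args)
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps(result),
    })

print(messages[0]["role"], messages[0]["content"])
```

In the real loop this "tool" message is sent back to the API, and the model turns the JSON result into the final natural-language reply.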