What is Function Calling (Tool Use)?
The ability of an LLM to call external functions or APIs as part of generating a response, enabling it to interact with the real world.
Definition
Function calling (also called tool use) is a capability that allows LLMs to invoke predefined functions or APIs as part of their response generation. Instead of only producing text, the model can decide to call a function — search the web, query a database, execute code, send an email — and incorporate the result into its response. Function calling is what turns an LLM into an agent that can take real-world actions.
Why it matters
Function calling is what makes AI applications useful beyond chat. Without it, an AI can only generate text about a task. With it, the AI can actually do the task: look up a price, update a record, send a message, schedule a meeting. Every production AI application that takes real action relies on function calling at its core.
How it works
You define a set of available functions with names, descriptions, and parameter schemas (typically expressed as JSON Schema). You pass these to the LLM along with the user's message. The model decides whether to respond in text or to call a function. If it calls a function, you execute it in your application code, pass the result back to the model, and the model incorporates the result into its final response.
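The loop above can be sketched in Python. This is an illustrative shape only: the schema layout mirrors what major LLM providers accept, but exact field names vary by API, and `fake_model`, `get_weather`, and the message format are stand-ins invented for this example, not a real SDK.

```python
import json

# Tool definitions: name, description, and a JSON-Schema-style parameter spec.
# The exact envelope differs between providers; this is an assumed shape.
TOOLS = [
    {
        "name": "get_weather",
        "description": "Get the forecast for a city on a given date.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "date": {"type": "string", "format": "date"},
            },
            "required": ["city", "date"],
        },
    }
]

# Application-side implementation, keyed by tool name.
def get_weather(city: str, date: str) -> dict:
    # Stubbed result; a real app would call a weather service here.
    return {"city": city, "date": date, "forecast": "light rain"}

HANDLERS = {"get_weather": get_weather}

def fake_model(messages, tools):
    """Stand-in for the LLM. On the first turn it emits a tool call;
    once a tool result is present, it produces a final text answer."""
    last = messages[-1]
    if last["role"] == "user":
        return {"tool_call": {"name": "get_weather",
                              "arguments": json.dumps(
                                  {"city": "London", "date": "2025-06-02"})}}
    result = json.loads(last["content"])
    return {"text": f"Expect {result['forecast']} in {result['city']}."}

def run(user_message: str) -> str:
    messages = [{"role": "user", "content": user_message}]
    while True:
        reply = fake_model(messages, TOOLS)
        if "tool_call" not in reply:
            return reply["text"]              # model chose to answer in text
        call = reply["tool_call"]
        args = json.loads(call["arguments"])  # model emits args as JSON
        result = HANDLERS[call["name"]](**args)  # execute in app code
        # Feed the tool result back so the model can finish its answer.
        messages.append({"role": "tool", "content": json.dumps(result)})
```

The key design point is the loop: the model may call tools any number of times, and control only returns to the user once the model responds with plain text.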
Examples in practice
Weather assistant
The user asks "Will it rain in London tomorrow?" The LLM calls a get_weather(city, date) function, receives the forecast data, and responds with a natural language answer based on real data.
CRM update agent
A sales AI receives a meeting notes transcript. It calls update_contact(), create_deal(), and schedule_followup() functions to update the CRM automatically from the conversation.
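A single model turn can request several tool calls at once, as in the CRM example. A minimal sketch of the application-side dispatch, assuming the model returns a list of (name, JSON-encoded arguments) pairs; the CRM handlers here are hypothetical stubs, not a real CRM API:

```python
import json

# Hypothetical CRM handlers matching the example above (stubbed).
def update_contact(name, email):
    return {"updated": name}

def create_deal(contact, amount):
    return {"deal_id": 1, "contact": contact, "amount": amount}

def schedule_followup(contact, date):
    return {"scheduled": date}

HANDLERS = {"update_contact": update_contact,
            "create_deal": create_deal,
            "schedule_followup": schedule_followup}

# Assumed shape of a model turn that requests three tool calls.
tool_calls = [
    {"name": "update_contact",
     "arguments": json.dumps({"name": "Dana Lee",
                              "email": "dana@example.com"})},
    {"name": "create_deal",
     "arguments": json.dumps({"contact": "Dana Lee", "amount": 5000})},
    {"name": "schedule_followup",
     "arguments": json.dumps({"contact": "Dana Lee", "date": "2025-06-10"})},
]

# Execute each call and collect the results; each result would then be
# passed back to the model so it can confirm the updates in its reply.
results = [HANDLERS[c["name"]](**json.loads(c["arguments"]))
           for c in tool_calls]
```

Dispatching by name through a handler table keeps the model strictly limited to the functions you have registered, which is important since the model's requested tool names and arguments are untrusted input.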
