Function Tools with AIFunctionFactory in Microsoft Agent Framework
Tool calling is what separates a chatbot from an agent. A chatbot responds with knowledge it was trained on. An agent uses tools to retrieve live data, execute actions, call APIs, and interact with the world. The Microsoft Agent Framework's AIFunctionFactory makes tool registration as simple as passing a delegate -- no attributes, no base classes, no plugin scaffolding required. In C#, AIFunctionFactory.Create() is the entry point for turning any method into a callable tool. This article covers how it works, how to register tools with your agent, how the invocation flow operates, sync vs. async tools, parameter descriptions for LLM guidance, error handling in tools, and a brief comparison to Semantic Kernel's approach.
The Microsoft Agent Framework (MAF) is in public preview at version 1.0.0-rc1. The tool calling API described here reflects the current shape and may evolve before GA.
What Is AIFunctionFactory?
AIFunctionFactory is the utility class in Microsoft.Extensions.AI that converts C# methods into AIFunction objects -- the type the Microsoft Agent Framework uses to represent callable tools. An AIFunction wraps a delegate and carries the metadata the LLM needs to understand it: the function's name, description, and a JSON schema for its parameters.
The key method is AIFunctionFactory.Create(). It accepts any Delegate (a lambda, static method, instance method, or local function) and returns an AIFunction ready to be registered with an agent.
AIFunctionFactory.Create() performs automatic schema generation. It introspects the delegate's parameter types and names, infers JSON Schema types from them, and optionally incorporates description metadata from [Description] attributes or overrides you provide. The LLM uses this schema to understand when and how to invoke the tool.
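As a rough illustration of the output -- the exact schema shape varies by version, so treat this as a sketch -- a method like `static string GetWeather([Description("City name")] string city)` with a method-level `[Description("Gets the current weather for a city.")]` would produce a function definition along these lines:

```json
{
  "name": "GetWeather",
  "description": "Gets the current weather for a city.",
  "parameters": {
    "type": "object",
    "properties": {
      "city": { "type": "string", "description": "City name" }
    },
    "required": ["city"]
  }
}
```

This JSON is what the model actually sees when deciding whether and how to call your tool.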
Critically -- you do not need any MAF-specific attributes in your code. No [KernelFunction], no [SKFunction], no inheritance from a plugin base class. Your tool is just a method.
Creating Your First Tool with AIFunctionFactory.Create()
The simplest case -- and the most common one you'll encounter -- is a static method with a clear name:
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
// A simple tool: a static method
static string GetCurrentWeather(string city) => $"Sunny and 72°F in {city}";
// Create an AIFunction from it
AIFunction weatherTool = AIFunctionFactory.Create(GetCurrentWeather);
// Register it with an agent
IAIAgent agent = chatClient.AsAIAgent(
instructions: "You are a weather assistant. Use the GetCurrentWeather tool to answer weather questions.",
tools: new[] { weatherTool });
// The agent now has the tool. Ask it something that requires the tool.
AgentResponse response = await agent.RunAsync("What's the weather like in Seattle?");
Console.WriteLine(response.Text);
// Example output: "The weather in Seattle is sunny and 72°F."
When the LLM decides it needs weather data, it returns a tool call request. ChatClientAgent detects the request, calls GetCurrentWeather("Seattle"), appends the result, and re-invokes the model. The model then composes the final user-facing response. You write none of that loop.
Tool Parameter Descriptions for Better LLM Guidance
The LLM decides when to call a tool based on the tool's name, description, and parameter schema. Well-described tools get invoked correctly more often. Poorly named tools with no descriptions get ignored or called with wrong arguments.
Use System.ComponentModel.Description attributes on parameters and methods to provide the LLM with context:
using System.ComponentModel;
using Microsoft.Extensions.AI;
[Description("Gets the current weather conditions for a given city.")]
static string GetCurrentWeather(
[Description("The city name to get weather for, e.g. 'Seattle, WA'")] string city,
[Description("Unit system: 'celsius' or 'fahrenheit'")] string unit = "fahrenheit")
{
return unit == "celsius"
? $"22°C and partly cloudy in {city}"
: $"72°F and partly cloudy in {city}";
}
AIFunction weatherTool = AIFunctionFactory.Create(GetCurrentWeather);
The [Description] on the method becomes the tool's description in the schema. The [Description] on each parameter becomes the property description in the parameter schema. Both are passed to the LLM as part of the function definition.
Without descriptions:
- The LLM sees GetCurrentWeather(city: string, unit: string) -- functional but ambiguous
- It may pass full country names when abbreviations are expected, or omit the unit parameter
With descriptions:
- The LLM understands exactly what city expects and that unit has specific valid values
- Call accuracy improves significantly
Invest time in writing clear descriptions for your tools, especially for parameters that have non-obvious formats or enumerated valid values. This is analogous to writing good API contracts -- the description is the contract between your tool and the model.
Sync vs. Async Tools
AIFunctionFactory.Create() handles both synchronous and asynchronous methods. Prefer async tools for any I/O-bound work (API calls, database queries, file reads) to avoid blocking threads during tool execution.
Synchronous Tool
using System.ComponentModel;
using Microsoft.Extensions.AI;
[Description("Converts a temperature from Celsius to Fahrenheit.")]
static double CelsiusToFahrenheit(
[Description("Temperature in Celsius")] double celsius)
=> (celsius * 9.0 / 5.0) + 32.0;
AIFunction tempTool = AIFunctionFactory.Create(CelsiusToFahrenheit);
Sync tools are appropriate for pure computation with no I/O. The ChatClientAgent tool loop handles both sync and async tools identically from the registration perspective.
Asynchronous Tool
using System.ComponentModel;
using Microsoft.Extensions.AI;
using System.Net.Http.Json;
[Description("Fetches the current price of a cryptocurrency by symbol.")]
static async Task<string> GetCryptoPrice(
[Description("Cryptocurrency symbol, e.g. 'BTC', 'ETH'")] string symbol)
{
// Note: reuse a shared HttpClient in production. Creating it inside the tool
// keeps the LLM-facing schema limited to 'symbol'; a raw HttpClient parameter
// would otherwise leak into the generated parameter schema.
using var httpClient = new HttpClient();
try
{
// Example: replace with a real provider
var url = $"https://api.example.com/prices/{symbol.ToUpperInvariant()}";
var result = await httpClient.GetFromJsonAsync<PriceResult>(url);
return result is not null
? $"{symbol.ToUpperInvariant()}: ${result.Price:F2} USD"
: $"Price for {symbol} not available.";
}
catch (Exception ex)
{
return $"Failed to retrieve price for {symbol}: {ex.Message}";
}
}
record PriceResult(decimal Price);
Async tools are the right choice for any tool that does network I/O, database access, or file system operations. The tool invocation loop in ChatClientAgent awaits async tools correctly -- no extra configuration needed.
Registering Multiple Tools
You'll often register several tools with a single agent. Pass an array (or any IEnumerable<AIFunction>) to AsAIAgent():
using System.ComponentModel;
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
[Description("Returns the current UTC date and time as a string.")]
static string GetCurrentDateTime() => DateTime.UtcNow.ToString("O");
[Description("Calculates the number of days between two ISO 8601 date strings.")]
static int DaysBetween(
[Description("Start date in ISO 8601 format, e.g. '2024-01-01'")] string startDate,
[Description("End date in ISO 8601 format, e.g. '2024-12-31'")] string endDate)
{
var start = DateTimeOffset.Parse(startDate);
var end = DateTimeOffset.Parse(endDate);
return (int)Math.Abs((end - start).TotalDays);
}
[Description("Looks up the definition of a programming term.")]
static string DefineTerm(
[Description("The programming term to define, e.g. 'closure', 'monad', 'idempotent'")] string term)
{
return term.ToLowerInvariant() switch
{
"closure" => "A closure is a function that captures variables from its enclosing scope.",
"idempotent" => "An operation is idempotent if applying it multiple times produces the same result as once.",
_ => $"No definition found for '{term}'."
};
}
var tools = new[]
{
AIFunctionFactory.Create(GetCurrentDateTime),
AIFunctionFactory.Create(DaysBetween),
AIFunctionFactory.Create(DefineTerm),
};
IAIAgent agent = chatClient.AsAIAgent(
instructions: """
You are a developer productivity assistant.
Use tools to answer questions about dates and programming terminology.
""",
tools: tools);
The LLM selects which tool to invoke based on the user's prompt and the tool descriptions. In a single turn, the model may invoke multiple tools in sequence if the response requires data from several sources.
The Tool Invocation Flow
Understanding the tool invocation flow helps you reason about latency, costs, and potential failure modes.
1. User sends a prompt -- RunAsync("What's the date today and how many days until New Year's Eve?", session)
2. Agent sends to LLM -- the full message history plus the current prompt plus all tool schemas are sent to the model
3. LLM returns tool call requests -- the model responds with a message requesting GetCurrentDateTime() and DaysBetween(today, "2024-12-31")
4. ChatClientAgent executes tools -- both tools are executed (possibly in parallel, depending on the model's response format)
5. Results appended to history -- the tool results are added as tool messages in the conversation
6. LLM re-invoked -- the full history including tool results is sent back to the model
7. LLM returns final response -- the model composes a natural language response using the tool data
8. AgentResponse returned -- the final text response is returned to your caller
This cycle can repeat multiple times if the model needs to chain tool calls. For example, a model might call GetCurrentDateTime() first, use that result to compute a date range, then call DaysBetween() with the computed values.
Each iteration costs tokens and latency. Design your tools to be efficient and focused. A tool that does too much in one call is harder for the model to use correctly than two focused tools that each do one thing.
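The flow above can be sketched as a simplified loop. This is illustrative pseudocode, not the actual ChatClientAgent source; the types (FunctionCallContent, FunctionResultContent, AIFunctionArguments) follow Microsoft.Extensions.AI, but the surrounding plumbing (BuildHistory, optionsWithTools) is invented for the sketch:

```csharp
// Simplified sketch of the agent's tool loop -- not the real implementation.
List<ChatMessage> messages = BuildHistory(session, userPrompt);
while (true)
{
    ChatResponse response = await chatClient.GetResponseAsync(messages, optionsWithTools);
    messages.AddMessages(response);

    // Collect any tool call requests the model emitted.
    var calls = response.Messages
        .SelectMany(m => m.Contents)
        .OfType<FunctionCallContent>()
        .ToList();

    if (calls.Count == 0)
        return response; // no tool requests -- this is the final answer

    foreach (var call in calls)
    {
        // Look up the AIFunction by name and invoke it with model-supplied arguments.
        AIFunction tool = tools.First(t => t.Name == call.Name);
        object? result = await tool.InvokeAsync(new AIFunctionArguments(call.Arguments));
        messages.Add(new ChatMessage(ChatRole.Tool,
            [new FunctionResultContent(call.CallId, result)]));
    }
    // Loop: re-invoke the model with the tool results appended.
}
```

The loop terminates when the model stops requesting tools, which is why each extra tool round adds both latency and token cost.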
Error Handling in Tools
Tools can fail. A network call might time out, a database query might throw, or input validation might reject a malformed argument. How you handle errors in tools affects how gracefully the agent recovers.
The recommended pattern is to catch exceptions inside the tool and return an error string rather than letting the exception propagate:
using System.ComponentModel;
using Microsoft.Extensions.AI;
[Description("Fetches weather data for a city from an external API.")]
static async Task<string> GetWeatherData(
[Description("City name")] string city)
{
// Note: reuse a shared HttpClient in production; created here so the
// LLM-facing schema contains only the 'city' parameter.
using var httpClient = new HttpClient();
if (string.IsNullOrWhiteSpace(city))
return "Error: city name cannot be empty.";
try
{
var response = await httpClient.GetAsync($"https://api.weather.example.com/{Uri.EscapeDataString(city)}");
if (!response.IsSuccessStatusCode)
return $"Error: weather service returned {(int)response.StatusCode} for '{city}'.";
var content = await response.Content.ReadAsStringAsync();
return content;
}
catch (HttpRequestException ex)
{
return $"Error: could not reach the weather service -- {ex.Message}";
}
catch (TaskCanceledException)
{
return "Error: weather service request timed out.";
}
}
When a tool returns an error string, the LLM receives the error as the tool result. A well-instructed agent will relay the error to the user in a helpful way -- "I'm unable to retrieve weather data for that city at this time" -- rather than returning a confusing raw exception message.
If an unhandled exception propagates from a tool, ChatClientAgent will surface it as an exception from RunAsync. Your caller should handle it with a try/catch. The decorator pattern for agents is useful here -- a retry or fallback decorator wrapping IAIAgent can catch tool execution failures and return a graceful error response without spreading error-handling logic throughout the application.
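That decorator idea can be sketched as follows. The IAIAgent/AgentResponse names follow this article's preview API, reduced to the single RunAsync overload used in the examples; RetryAgent, AgentSession, and maxAttempts are illustrative names, not framework types:

```csharp
// Illustrative retry decorator: wraps any IAIAgent and retries failed runs.
public sealed class RetryAgent(IAIAgent inner, int maxAttempts = 3) : IAIAgent
{
    public async Task<AgentResponse> RunAsync(string prompt, AgentSession? session = null)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return await inner.RunAsync(prompt, session);
            }
            catch (Exception) when (attempt < maxAttempts)
            {
                // Simple linear backoff before retrying the whole run.
                await Task.Delay(TimeSpan.FromSeconds(attempt));
            }
        }
    }
}
```

Callers keep using the IAIAgent abstraction unchanged; only composition at startup decides whether retries apply.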
Providing Tool Names Explicitly
By default, AIFunctionFactory.Create() uses the method name as the tool name. For lambdas or anonymous functions, the name may be unhelpful. Use the overload that accepts explicit name and description:
using Microsoft.Extensions.AI;
// Lambda with explicit name and description
var searchTool = AIFunctionFactory.Create(
(string query) => $"Search results for: {query}",
name: "SearchKnowledgeBase",
description: "Searches the internal knowledge base for articles matching the given query.");
// Reuse a helper from another namespace under a cleaner tool name
var formatTool = AIFunctionFactory.Create(
MyHelpers.FormatAsCurrency,
name: "FormatCurrency",
description: "Formats a decimal number as a currency string in USD.");
Explicit names and descriptions are especially important when:
- You're using lambdas (which don't have meaningful method names)
- You want to present a cleaner API surface to the model
- Multiple tools have similar method names in different classes
Comparing AIFunctionFactory to Semantic Kernel's [KernelFunction]
MAF's AIFunctionFactory approach is intentionally simpler than Semantic Kernel's. If you've used Semantic Kernel, the function-calling pattern there looks different:
// Semantic Kernel approach
public class WeatherPlugin
{
[KernelFunction]
[Description("Gets the current weather for a city.")]
public string GetWeather(string city) => $"Sunny and 72°F in {city}";
}
// Register with kernel
kernel.Plugins.AddFromType<WeatherPlugin>();
SK requires:
- A plugin class
- The [KernelFunction] attribute on each tool method
- Registration through the kernel's plugin system
MAF's AIFunctionFactory approach requires:
- Just a method (static, instance, lambda, or local function)
- AIFunctionFactory.Create() to wrap it
- Registration via the tools parameter of AsAIAgent()
MAF is simpler for the common case. There are no attributes to remember, no plugin class scaffolding, and no dependency on the SK plugin infrastructure. If you have existing helper methods you want to expose as tools, you wrap them directly -- no modification to the original code needed.
SK's plugin system provides additional capabilities that MAF doesn't: plugin import from OpenAPI specs, semantic function templates, and integration with the broader SK planner. If you need those features, SK is the right choice. For straightforward tool calling, MAF's AIFunctionFactory is cleaner and requires less boilerplate.
Dependency injection works with both approaches -- your tools can receive dependencies via constructor injection if you use instance methods on a class that is itself injected, or via method parameters if AIFunctionFactory.Create() supports DI parameter injection in your version.
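The constructor-injection variant can be sketched like this. PriceService, its endpoint, and the surrounding service registration are hypothetical; the key point is that wrapping a bound instance method keeps the dependency out of the LLM-facing schema:

```csharp
// Hypothetical service whose dependency arrives via constructor injection.
public sealed class PriceService(HttpClient http)
{
    [Description("Gets the latest price for a product SKU.")]
    public Task<string> GetPrice(
        [Description("Product SKU, e.g. 'ABC-123'")] string sku)
        => http.GetStringAsync($"https://api.example.com/prices/{sku}");
}

// Resolve the instance from DI (Microsoft.Extensions.DependencyInjection),
// then wrap the *bound* instance method. The captured HttpClient never
// appears in the tool schema -- only 'sku' does.
var priceService = serviceProvider.GetRequiredService<PriceService>();
AIFunction priceTool = AIFunctionFactory.Create(priceService.GetPrice);
```

Because the delegate is bound to a resolved instance, no per-call service lookup is needed inside the tool.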
A Complete Tool-Calling Example
Here's a full example combining multiple tools, error handling, and a multi-turn session:
using System.ComponentModel;
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using System.ClientModel;
using OpenAI;
var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")!;
IChatClient chatClient = new OpenAIClient(new ApiKeyCredential(apiKey))
.GetChatClient("gpt-4o-mini")
.AsIChatClient();
// Define tools
[Description("Gets the current UTC date and time.")]
static string GetUtcNow() => DateTime.UtcNow.ToString("f");
[Description("Calculates the square root of a non-negative number.")]
static string Sqrt(
[Description("A non-negative number")] double n)
{
if (n < 0) return "Error: cannot compute square root of a negative number.";
return $"√{n} = {Math.Sqrt(n):F4}";
}
[Description("Looks up a C# keyword and returns a brief explanation.")]
static string ExplainCSharpKeyword(
[Description("A C# keyword such as 'async', 'yield', 'ref', 'record', 'with'")] string keyword)
{
return keyword.ToLowerInvariant() switch
{
"async" => "'async' marks a method as asynchronous, enabling the use of 'await' inside it.",
"yield" => "'yield return' lazily produces values in an iterator method.",
"ref" => "'ref' passes a variable by reference, allowing the callee to modify the caller's variable.",
"record" => "'record' defines a reference type with value-based equality; positional records are immutable by default.",
"with" => "'with' creates a copy of a record with specified properties modified.",
_ => $"No explanation available for '{keyword}'."
};
}
var tools = new[]
{
AIFunctionFactory.Create(GetUtcNow),
AIFunctionFactory.Create(Sqrt),
AIFunctionFactory.Create(ExplainCSharpKeyword),
};
var agent = chatClient.AsAIAgent(
instructions: """
You are a developer assistant with access to utility tools.
Use tools when they can help answer the user's question accurately.
""",
tools: tools);
var session = await agent.CreateSessionAsync();
var questions = new[]
{
"What is the current time?",
"What's the square root of 144?",
"Can you explain the 'record' keyword in C#?",
"And what about 'yield'?"
};
foreach (var question in questions)
{
Console.WriteLine($"User: {question}");
AgentResponse response = await agent.RunAsync(question, session);
Console.WriteLine($"Agent: {response.Text}");
Console.WriteLine();
}
Run this and you'll see the agent use tools for the first three questions (time, square root, keyword explanation) and use session memory for the fourth ("And what about 'yield'?" -- the agent knows "what about" refers to C# keyword explanation from context).
FAQ
What is AIFunctionFactory in Microsoft Agent Framework?
AIFunctionFactory is a utility class in Microsoft.Extensions.AI that converts C# methods into AIFunction objects. An AIFunction carries the method delegate, its name, description, and a JSON Schema for its parameters. MAF agents use AIFunction instances as tools that the LLM can invoke during response generation.
Do I need any special attributes to use AIFunctionFactory?
No. AIFunctionFactory.Create() accepts any delegate -- a static method, instance method, local function, or lambda -- without requiring any special attributes. The [Description] attribute from System.ComponentModel is optional but strongly recommended for providing the LLM with meaningful context about what each tool does and what its parameters expect.
How does the agent decide when to call a tool?
The LLM decides based on the user's prompt, the tool descriptions, and the agent's instructions. When the model determines that calling a tool would help answer the prompt, it returns a tool call request in its response. ChatClientAgent intercepts this, executes the tool, appends the result, and re-invokes the model. This loop continues until the model produces a final text response.
How is AIFunctionFactory different from Semantic Kernel's [KernelFunction]?
SK's [KernelFunction] requires annotating methods in a plugin class and registering that class with the kernel. MAF's AIFunctionFactory.Create() wraps any delegate directly -- no attributes, no plugin class, no kernel required. MAF is simpler for the common tool-calling use case. SK provides additional capabilities (OpenAPI import, semantic functions, planners) for more complex orchestration needs.
Can tools take complex object parameters?
Yes, with caveats. AIFunctionFactory.Create() generates JSON Schema from parameter types. Simple types (string, int, double, bool) and records with simple properties work well. Complex nested types can work but produce more elaborate schemas that may be harder for the model to populate correctly. Prefer simple, flat parameter types for best reliability with current LLMs.
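For example, a flat record parameter keeps the generated schema simple. The types and values below are hypothetical, sketched against the schema-generation behavior described above:

```csharp
// A flat record maps to a simple JSON object schema the model can fill reliably.
public record FlightQuery(
    [property: Description("IATA origin code, e.g. 'SEA'")] string Origin,
    [property: Description("IATA destination code, e.g. 'LHR'")] string Destination,
    [property: Description("Departure date, ISO 8601, e.g. '2025-03-01'")] string Date);

[Description("Searches for flights matching the query.")]
static string SearchFlights(FlightQuery query)
    => $"Searching flights {query.Origin} -> {query.Destination} on {query.Date}...";

AIFunction flightTool = AIFunctionFactory.Create(SearchFlights);
```

One flat object with described string properties is usually easier for the model to populate than deeply nested structures.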
What happens if a tool throws an unhandled exception?
The exception propagates from ChatClientAgent.RunAsync() to your caller. You should either handle exceptions inside the tool (catch and return an error string) or wrap the RunAsync call with a try/catch. The safest pattern is to catch exceptions inside the tool itself and return a descriptive error string -- this allows the LLM to relay the error gracefully rather than crashing the agent call.
Can I use async tools with AIFunctionFactory?
Yes. Pass an async method or a method returning Task<T> or ValueTask<T> to AIFunctionFactory.Create(). The tool invocation loop in ChatClientAgent awaits async tools correctly. Use async tools for any I/O-bound work like HTTP requests, database queries, or file operations to avoid blocking threads during execution.
