Build an AI CLI Developer Tool with GitHub Copilot SDK in C#

When you build an AI CLI developer tool with GitHub Copilot SDK in C#, you get the full SDK feature set in a single application. Unlike a simple hello-world example, an AI CLI developer tool requires you to wire up CopilotClient and CopilotSession lifecycle management, streaming responses for real-time output, AIFunctionFactory tools that give the AI access to your local files, and a session reset mechanism for multi-conversation workflows. In this article, I'll walk through every line of a working console app that delivers all of this -- and I'll share the API discoveries I made along the way when the documented surface didn't quite match the package reality.

The full source is available in the devleader/copilot-sdk-examples repository.

What We're Building: an AI CLI Developer Tool with GitHub Copilot SDK

The app is an interactive REPL (Read-Evaluate-Print Loop) that runs in your terminal and lets you ask coding questions, get explanations, and -- crucially -- ask the AI to analyze your actual source files. It looks like this at runtime:

╔══════════════════════════════════════╗
║   AI CLI Developer Tool              ║
║   Powered by GitHub Copilot SDK      ║
╚══════════════════════════════════════╝

Type your coding question or command. Type /help for options.

> What files are in the current directory?
[Tool: list_files({"directory": "."})]
Contents of: C:\dev\MyProject
  [DIR]  src/
  [FILE] MyProject.csproj  (1,234 bytes)
  [FILE] README.md  (512 bytes)

> Review the code in ./src/MyService.cs
[Tool: read_file({"path": "./src/MyService.cs"})]
I can see a few things worth noting in your `MyService.cs`...

The app demonstrates the four patterns that define real GitHub Copilot SDK apps: client/session lifecycle, streaming with AssistantMessageDeltaEvent, tool calls via AIFunctionFactory, and session scoping with /clear.

Project Setup

Start with a .NET 9 console app and add the required packages. The first important lesson here: AIFunction and AIFunctionFactory are NOT in GitHub.Copilot.SDK directly -- they live in Microsoft.Extensions.AI.Abstractions, which the SDK depends on but doesn't automatically expose for your direct using statements:

dotnet add package GitHub.Copilot.SDK --version 0.1.25
dotnet add package Microsoft.Extensions.AI.Abstractions --version 10.2.0
dotnet add package Microsoft.Extensions.Configuration.Json --version 9.0.2
dotnet add package Microsoft.Extensions.Configuration.Binder --version 9.0.2
dotnet add package Microsoft.Extensions.Configuration.EnvironmentVariables --version 9.0.2

Your .csproj should end up with:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="GitHub.Copilot.SDK" Version="0.1.25" />
    <PackageReference Include="Microsoft.Extensions.AI.Abstractions" Version="10.2.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration" Version="9.0.2" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="9.0.2" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Binder" Version="9.0.2" />
    <PackageReference Include="Microsoft.Extensions.Configuration.EnvironmentVariables" Version="9.0.2" />
  </ItemGroup>
</Project>

The SDK requires the GitHub CLI (gh) to be installed and authenticated. If you haven't set it up yet, check the GitHub Copilot SDK installation guide first.

Typed Configuration

A sealed configuration class with init-only properties keeps settings clean and immutable. The GithubToken property maps to an environment variable or appsettings.Development.json so keys never go into source control:

namespace AiCliTool.Configuration;

public sealed class CopilotConfig
{
    public const string SectionName = "Copilot";

    public string Model { get; init; } = "gpt-5";
    public string? GithubToken { get; init; }
    public string SystemPrompt { get; init; } =
        "You are an expert AI coding assistant for .NET developers. " +
        "You help with code reviews, explaining concepts, writing code, " +
        "and debugging. You have tools to read files and list directories " +
        "when the user wants you to analyze their code. " +
        "Be concise but thorough. Use C# code examples where helpful.";
}

And the appsettings.json:

{
  "Copilot": {
    "Model": "gpt-5",
    "GithubToken": "",
    "SystemPrompt": "You are an expert AI coding assistant for .NET developers..."
  }
}

Reading config with Microsoft.Extensions.Configuration:

var configuration = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: false)
    .AddJsonFile("appsettings.Development.json", optional: true)
    .AddEnvironmentVariables("COPILOT_")
    .Build();

var config = configuration
    .GetSection(CopilotConfig.SectionName)
    .Get<CopilotConfig>() ?? new CopilotConfig();
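Because of the "COPILOT_" prefix passed to AddEnvironmentVariables, settings can also come from the environment: the provider strips the prefix and treats a double underscore as the ":" section separator. For example (the token value here is a placeholder):

```shell
# "COPILOT_" is stripped and "__" becomes ":", so these map to
# Copilot:GithubToken and Copilot:Model respectively.
export COPILOT_Copilot__GithubToken="ghp_your_token_here"
export COPILOT_Copilot__Model="gpt-5"
dotnet run
```

Environment variables win over appsettings.json here because AddEnvironmentVariables is registered last -- later providers override earlier ones.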

File System Tools with AIFunctionFactory

The FileSystemTools class exposes three methods as AI tools. The [Description] attributes are critical -- AIFunctionFactory reads them via reflection to generate the JSON schema that the model uses to decide when and how to call each tool:

using System.ComponentModel;

public sealed class FileSystemTools
{
    private const int MaxFileSizeBytes = 50_000;

    [Description("Read the contents of a source code file. Use this when the user asks you " +
                 "to review, explain, or analyze a specific file.")]
    public string ReadFile(
        [Description("The absolute or relative path to the file to read.")]
        string path)
    {
        var fullPath = Path.GetFullPath(path);
        if (!File.Exists(fullPath))
            return $"[Error] File not found: {fullPath}";

        var fileInfo = new FileInfo(fullPath);
        if (fileInfo.Length > MaxFileSizeBytes)
            return $"[Error] File too large ({fileInfo.Length:N0} bytes). Max: {MaxFileSizeBytes:N0}";

        return File.ReadAllText(fullPath);
    }

    [Description("List files in a directory matching an optional glob pattern. " +
                 "Useful for exploring project structure before reading specific files.")]
    public string ListFiles(
        [Description("The directory path to list. Use '.' for the current directory.")]
        string directory,
        [Description("Optional glob pattern, e.g. '*.cs', '*.json'. Leave empty for all files.")]
        string pattern = "*")
    {
        var fullPath = Path.GetFullPath(directory);
        if (!Directory.Exists(fullPath))
            return $"[Error] Directory not found: {fullPath}";

        var files = Directory.GetFiles(fullPath, pattern, SearchOption.TopDirectoryOnly);
        var dirs = Directory.GetDirectories(fullPath);

        var sb = new System.Text.StringBuilder();
        sb.AppendLine($"Contents of: {fullPath}");
        foreach (var dir in dirs.OrderBy(d => d))
            sb.AppendLine($"[DIR]  {Path.GetFileName(dir)}/");
        foreach (var file in files.OrderBy(f => f))
            sb.AppendLine($"[FILE] {Path.GetFileName(file)}  ({new FileInfo(file).Length:N0} bytes)");

        return sb.ToString();
    }

    [Description("Get the current working directory of the CLI tool.")]
    public string GetCurrentDirectory() => Directory.GetCurrentDirectory();
}

The 50KB file size limit is a practical guard -- you don't want to accidentally stuff a 2MB generated file into the AI's context window. The [Description] on each parameter is just as important as the method-level description; the model uses them to understand what values to pass.
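If rejecting large files outright feels too strict, an alternative worth considering (my sketch, not what the sample repo does) is to truncate instead: return the first chunk of the file and tell the model it is looking at a partial view.

```csharp
// Alternative sketch (not from the sample repo): instead of refusing large
// files, return the first MaxFileSizeBytes characters and say so, letting
// the model reason over a partial view of the file.
const int MaxFileSizeBytes = 50_000;

string ReadFileTruncated(string path)
{
    var fullPath = Path.GetFullPath(path);
    if (!File.Exists(fullPath))
        return $"[Error] File not found: {fullPath}";

    var text = File.ReadAllText(fullPath);
    if (text.Length <= MaxFileSizeBytes)
        return text;

    return text[..MaxFileSizeBytes] +
           $"\n[Truncated: showing first {MaxFileSizeBytes:N0} of {text.Length:N0} characters]";
}
```

The trade-off: truncation keeps the conversation going on huge files, but the model may confidently review only half a class -- the explicit truncation notice is what keeps that honest.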

Registering the tools uses AIFunctionFactory.Create with an explicit name for each function. The SDK passes the name to the model, which uses it when reporting tool calls:

var fsTools = new FileSystemTools();
var tools = new List<AIFunction>
{
    AIFunctionFactory.Create(fsTools.ReadFile, name: "read_file"),
    AIFunctionFactory.Create(fsTools.ListFiles, name: "list_files"),
    AIFunctionFactory.Create(fsTools.GetCurrentDirectory, name: "get_current_directory"),
};

You can learn more about custom AI tools with AIFunctionFactory in my dedicated deep-dive.

CopilotClient Startup

The CopilotClient and CopilotSession core concepts article covers the full lifecycle model. For a CLI app, you create one client for the entire process lifetime. StartAsync() spawns the GitHub CLI subprocess that proxies requests to the Copilot service:

var clientOptions = new CopilotClientOptions();
if (!string.IsNullOrWhiteSpace(config.GithubToken))
    clientOptions.GithubToken = config.GithubToken;

await using var client = new CopilotClient(clientOptions);
await client.StartAsync();

CopilotClient implements IAsyncDisposable, so await using ensures the CLI subprocess is cleanly terminated when your app exits. The client is expensive to create, so you want exactly one per app lifetime. Sessions, on the other hand, are cheap and created per conversation.

Session Creation with SystemMessage and Tools

Sessions hold conversation context. You configure a system prompt via SystemMessageConfig and pass the registered tools through SessionConfig.Tools. The SystemMessageMode.Append value adds your system prompt alongside the default Copilot system message rather than replacing it entirely:

async Task StartNewSessionAsync()
{
    if (session is not null)
        await session.DisposeAsync();

    session = await client.CreateSessionAsync(new SessionConfig
    {
        Model = config.Model,
        Streaming = true,
        SystemMessage = new SystemMessageConfig
        {
            Mode = SystemMessageMode.Append,
            Content = config.SystemPrompt
        },
        Tools = tools
    });
}

The managing sessions guide has more on when to create new sessions versus reusing existing ones. For this CLI, we create a new session on /clear to reset conversation context -- the client keeps running but the AI forgets everything from the previous conversation.

Streaming Event Handler

This is the core of the CLI's responsiveness. The event-driven API fires events as the AI generates tokens -- you subscribe with session.On(...) and handle each event type:

static async Task SendMessageAsync(CopilotSession session, string prompt)
{
    var tcs = new TaskCompletionSource();

    session.On(evt =>
    {
        switch (evt)
        {
            case AssistantMessageDeltaEvent delta:
                Console.Write(delta.Data.DeltaContent);
                break;

            case AssistantMessageEvent msg:
                // Non-streaming fallback -- fires if Streaming = false
                Console.Write(msg.Data.Content);
                break;

            case ToolExecutionStartEvent toolStart:
                Console.ForegroundColor = ConsoleColor.Yellow;
                Console.WriteLine($"\n[Tool: {toolStart.Data.ToolName}({toolStart.Data.Arguments})]");
                Console.ResetColor();
                break;

            case SessionIdleEvent:
                // AI has finished responding -- the turn is complete
                Console.WriteLine();
                tcs.TrySetResult();
                break;

            case SessionErrorEvent err:
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine($"\n[Error] {err.Data.ErrorType}: {err.Data.Message}");
                Console.ResetColor();
                tcs.TrySetException(new Exception(err.Data.Message));
                break;
        }
    });

    await session.SendAsync(new MessageOptions { Prompt = prompt });
    await tcs.Task;
}

The TaskCompletionSource bridge is the key pattern here. session.SendAsync() kicks off the request and returns immediately, before the response arrives. Without the TCS, your app would continue to the next iteration of the REPL before the response finished. SessionIdleEvent fires when the model has finished its entire turn (including any tool calls and the final response), so awaiting tcs.Task gives you the natural "wait for the complete response" behavior.
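One thing the sample doesn't guard against: if the idle event never arrives, the await hangs forever. A defensive variant (my addition, not in the sample repo) pairs the TCS with a timeout. The snippet below simulates the event arriving so it runs standalone:

```csharp
// Defensive sketch: pair the TCS with a timeout so a lost
// SessionIdleEvent cannot hang the REPL forever.
var tcs = new TaskCompletionSource(TaskCreationOptions.RunContinuationsAsynchronously);

using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(2));
cts.Token.Register(() =>
    tcs.TrySetException(new TimeoutException("No SessionIdleEvent within 2 minutes.")));

// In the real handler this is the SessionIdleEvent case; here we simulate
// the event arriving after 50 ms so the snippet is self-contained.
_ = Task.Delay(50).ContinueWith(_ => tcs.TrySetResult());

await tcs.Task; // completes normally, or faults with TimeoutException
```

TrySetResult and TrySetException are idempotent-safe here: whichever fires first wins, and the loser is a no-op, so the race between the event and the timer is harmless.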

One API detail worth noting: SessionErrorData.ErrorType is the correct property to identify the kind of error (not .Code, which doesn't exist in v0.1.25).

For a deeper look at streaming patterns including Server-Sent Events for ASP.NET Core, check out the streaming responses guide.

The Interactive REPL Loop

The REPL reads a line, checks for slash commands, and delegates to SendMessageAsync. Slash commands provide the essential developer UX without requiring any special SDK work:

while (true)
{
    Console.ForegroundColor = ConsoleColor.Green;
    Console.Write("> ");
    Console.ResetColor();

    var input = Console.ReadLine()?.Trim();
    if (string.IsNullOrWhiteSpace(input))
        continue;

    if (input.StartsWith('/'))
    {
        switch (input.ToLowerInvariant())
        {
            case "/exit":
            case "/quit":
                if (session is not null)
                    await session.DisposeAsync();
                return;

            case "/clear":
                await StartNewSessionAsync();
                continue;

            case "/help":
                PrintHelp();
                continue;

            default:
                Console.WriteLine($"Unknown command: {input}");
                continue;
        }
    }

    await SendMessageAsync(session!, input);
}

The /clear path deserves attention. It calls StartNewSessionAsync(), which disposes the current session and creates a new one. The new session has no conversation history -- the AI starts fresh. But the client, the tool registrations, and the configuration are all unchanged. This is the intended CopilotClient vs CopilotSession separation in action: the client is the infrastructure, the session is the context.

Running the App

After setting your GitHub token in appsettings.Development.json:

{
  "Copilot": {
    "GithubToken": "ghp_your_token_here"
  }
}

Run it:

dotnet run

Example session:

> What are the main differences between Task and ValueTask in C#?
Task allocates on the heap for every operation. ValueTask is a struct that avoids
allocation when the result is available synchronously -- common in hot paths like
cache hits...

> Review the code in ./src/OrderService.cs
[Tool: read_file({"path": "./src/OrderService.cs"})]
I see a few things in your OrderService. The `ProcessOrder` method has a potential
race condition on line 47 where...

> /clear
[New session started]

> What were we just talking about?
I don't have any previous conversation context -- this is a fresh session.

The last exchange demonstrates why /clear exists: a new session truly has no memory of the previous conversation.

Putting It All Together: Key Patterns

This app demonstrates several patterns from across the GitHub Copilot SDK for .NET series:

Pattern -- Where Used
CopilotClient singleton + StartAsync() -- Program.cs startup, one instance per process
CopilotSession per conversation -- StartNewSessionAsync(), disposed on /clear
Streaming = true + AssistantMessageDeltaEvent -- Real-time token output to console
AIFunctionFactory.Create() -- Registers file system tools with JSON schema
session.On(...) event handler -- Handles all event types in a single switch
TaskCompletionSource bridge -- Converts event-driven API to awaitable
SystemMessageConfig + Append mode -- Injects coding assistant persona
IAsyncDisposable pattern -- await using for client and session cleanup

The getting started guide covers the basics of each of these patterns individually. This app combines all of them in a single coherent workflow.

Frequently Asked Questions

Why does the app need Microsoft.Extensions.AI.Abstractions as a direct package reference?

AIFunction and AIFunctionFactory are defined in Microsoft.Extensions.AI.Abstractions, which GitHub.Copilot.SDK depends on transitively. However, .NET doesn't automatically make transitive packages available for direct using statements in your code. You must add Microsoft.Extensions.AI.Abstractions as an explicit <PackageReference> in your .csproj to use AIFunctionFactory in your own classes.
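You can confirm where a package sits in your dependency graph from the CLI:

```shell
# Lists direct ("Top-level") and transitive packages for the project.
# Microsoft.Extensions.AI.Abstractions appears under "Transitive" until
# you add it as a direct PackageReference.
dotnet list package --include-transitive
```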

Why use TaskCompletionSource instead of just awaiting SendAsync?

session.SendAsync() is not like HttpClient.GetAsync() -- it doesn't return the result. It initiates the request and returns immediately. The SDK delivers results through events. TaskCompletionSource creates an awaitable that completes when SessionIdleEvent fires, bridging the event-driven model into the async/await code you're already using everywhere else.

What does SystemMessageMode.Append do vs Replace?

Append adds your system message alongside the default Copilot system prompt (which includes built-in safety and behavior guidelines). Replace substitutes your message entirely. For coding tools, Append is almost always the right choice -- it adds your persona ("expert C# developer") without stripping the built-in guidelines that make responses safe and sensible.

How do I prevent the AI from reading sensitive files?

The FileSystemTools implementation only reads files the operating system allows -- it runs with your process's user permissions. For extra safety, you can add path allowlist checks in ReadFile. A common pattern is to only allow reads from paths under the current working directory, refusing any path that navigates up (..) or to an absolute location outside your project.
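That check can be sketched as follows (the helper name is mine): canonicalize both paths, then require the target to sit under the base directory. Path.GetFullPath collapses ".." segments, so traversal via relative paths is caught.

```csharp
// Sketch of the allowlist described above: resolve both paths to canonical
// form, then require the target to live under the base directory.
// Use StringComparison.OrdinalIgnoreCase on case-insensitive file systems.
bool IsUnderBaseDirectory(string path, string baseDirectory)
{
    var fullBase = Path.GetFullPath(baseDirectory)
        .TrimEnd(Path.DirectorySeparatorChar) + Path.DirectorySeparatorChar;
    var fullPath = Path.GetFullPath(path, basePath: fullBase);
    return fullPath.StartsWith(fullBase, StringComparison.Ordinal);
}
```

A real guard would call this at the top of ReadFile and return an error message for anything outside Directory.GetCurrentDirectory(). Note the trailing separator appended to the base path -- without it, "/proj-secrets" would wrongly pass a check against base "/proj".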

Can I add async tools with AIFunctionFactory?

Yes. AIFunctionFactory.Create() accepts both synchronous and asynchronous methods. An async tool would look like:

[Description("Search the web for documentation on a topic.")]
public async Task<string> SearchDocsAsync(
    [Description("The search query.")]
    string query)
{
    // Async HTTP call
    return await _httpClient.GetStringAsync($"https://docs.example.com/search?q={query}");
}

// Register the same way:
AIFunctionFactory.Create(myTools.SearchDocsAsync, name: "search_docs")

The SDK handles async tool invocation transparently.

What happens if the GitHub CLI isn't installed?

client.StartAsync() will throw because the SDK cannot find the gh executable to launch. The error message typically includes the CLI path it attempted. Verify installation with gh --version and gh auth status before running the app.

Should I reuse one session for all users in a multi-user app?

No. Each user should get their own session. Sharing a session between users leaks conversation context -- user A can see questions user B asked. For web applications, use one session per user per conversation thread, and dispose it when the conversation ends. The client (infrastructure) can be shared as a singleton, but sessions (context) must be isolated per user.
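The keying pattern can be sketched with a placeholder standing in for the session type (the point here is the per-user isolation, not the SDK call itself):

```csharp
// Shape sketch only: a plain object stands in for CopilotSession. Sessions
// are keyed by user ID so no two users ever share conversation context.
var sessions = new ConcurrentDictionary<string, object>();

// Same user ID -> same session instance; different user IDs -> isolated.
object GetSessionFor(string userId) =>
    sessions.GetOrAdd(userId, _ => new object());
```

For real, disposable sessions, wrap the factory in a Lazy so two racing requests for the same user can't create a session that never gets disposed, and remove-and-dispose the entry when the conversation ends.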
