If you build .NET applications and you want to tap directly into the same AI engine that powers GitHub Copilot, the GitHub Copilot SDK for .NET is worth your attention. Released in early 2026 as a technical preview, this SDK exposes Copilot's full agentic capabilities -- multi-turn conversations, streaming, tool calling, and multi-model routing -- as a first-class .NET library you can embed directly in your own applications.
This guide covers everything you need to know about the GitHub Copilot SDK from the ground up: what it is, how it compares to Semantic Kernel, how to set it up, and the key concepts you need to build production agents in C#.
What Is the GitHub Copilot SDK for .NET?
The GitHub Copilot SDK (NuGet: GitHub.Copilot.SDK) is a multi-platform SDK that exposes the GitHub Copilot CLI's agent runtime to application developers. Instead of using Copilot only through an IDE extension, you can now embed Copilot's intelligence directly into .NET console apps, ASP.NET Core APIs, background services, and CLI tools.
As of February 2026, the latest release is v0.1.25. The SDK source and documentation live in the official GitHub repository, with a getting started guide at docs/getting-started.md.
Key Capabilities
The GitHub Copilot SDK for .NET provides:
- Multi-turn conversations: Stateful sessions with full context management across turns
- Streaming responses: Real-time streaming via AssistantMessageDeltaEvent for low-latency UX
- Tool calling: Register C# methods as AI tools using AIFunctionFactory from Microsoft.Extensions.AI
- Multi-model routing: Choose between GPT-5, Claude Sonnet, and other models per session
- Microsoft Agent Framework integration: SDK sessions can work alongside the Microsoft Agent Framework (integration patterns are evolving -- see the SDK repository for the latest updates)
- Session hooks: Lifecycle events for monitoring, logging, and control
This isn't just a wrapper around a REST API -- it's the same production runtime that powers the Copilot CLI, battle-tested for agentic, multi-step AI workflows.
Prerequisites and Installation
To use the GitHub Copilot SDK for .NET, you need:
- .NET 8.0 or later -- the SDK targets .NET 8+
- GitHub Copilot CLI -- installed and authenticated (install guide)
- GitHub Copilot subscription -- individual, team, or enterprise plan
Install the SDK NuGet package:
dotnet add package GitHub.Copilot.SDK
For tool calling support (recommended):
dotnet add package Microsoft.Extensions.AI
The Copilot CLI acts as a local agent host that the SDK communicates with. This design means your API keys and authentication are managed by the GitHub CLI, not stored in your application code -- a significant security advantage.
Core Architecture: CopilotClient and CopilotSession
The GitHub Copilot SDK is built around two primary classes that you'll use in every application.
CopilotClient
CopilotClient manages the connection to the Copilot CLI agent host. It handles startup, authentication, and the lifecycle of the underlying process. It implements IAsyncDisposable, so you should use await using:
using GitHub.Copilot.SDK;
await using var client = new CopilotClient();
await client.StartAsync();
Console.WriteLine("Connected to GitHub Copilot CLI agent");
The client starts the Copilot CLI process and establishes a local connection. You can pass CopilotClientOptions to customize the CLI path or port if needed.
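As a sketch of that customization, the snippet below passes options to the client constructor. The property names CliPath and Port are illustrative assumptions based on the description above -- check the SDK reference for the actual CopilotClientOptions members:

```csharp
using GitHub.Copilot.SDK;

// Hypothetical option names -- CliPath and Port are assumptions;
// verify against the CopilotClientOptions documentation.
await using var client = new CopilotClient(new CopilotClientOptions
{
    CliPath = "/usr/local/bin/copilot", // explicit path to the Copilot CLI binary
    Port = 8123                          // local port for the agent host connection
});
await client.StartAsync();
```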
CopilotSession
CopilotSession represents a single conversation. Each session has its own model configuration, tool registrations, and conversation history. Sessions also implement IAsyncDisposable:
await using var session = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-5" // or "claude-sonnet-4.5", "gpt-4.1", etc.
});
Multiple sessions can run concurrently from the same client, which enables multi-agent patterns where each agent has its own isolated context.
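A minimal sketch of that pattern, reusing the SessionConfig and MessageOptions shapes shown elsewhere in this guide and assuming SendAsync calls on different sessions can safely run in parallel:

```csharp
// Two isolated sessions from one client -- each keeps its own
// model configuration and conversation history.
await using var reviewerSession = await client.CreateSessionAsync(new SessionConfig
{
    Model = "gpt-5"
});
await using var summarizerSession = await client.CreateSessionAsync(new SessionConfig
{
    Model = "claude-sonnet-4.5"
});

// Drive both sessions concurrently; their contexts never mix.
await Task.WhenAll(
    reviewerSession.SendAsync(new MessageOptions { Prompt = "Review this diff for bugs." }),
    summarizerSession.SendAsync(new MessageOptions { Prompt = "Summarize the changes for release notes." })
);
```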
Sending Messages
Send a message and collect the response using the event-driven API:
var tcs = new TaskCompletionSource();
var responseBuilder = new System.Text.StringBuilder();
session.On(evt =>
{
switch (evt)
{
case AssistantMessageEvent msg:
responseBuilder.Append(msg.Data.Content);
break;
case SessionIdleEvent:
tcs.TrySetResult();
break;
case SessionErrorEvent err:
Console.WriteLine($"Error: {err.Data.Message}");
tcs.TrySetException(new Exception(err.Data.Message));
break;
}
});
await session.SendAsync(new MessageOptions
{
Prompt = "Explain what the GitHub Copilot SDK does in one paragraph."
});
await tcs.Task;
Console.WriteLine(responseBuilder.ToString());
Streaming Responses in Real Time
One of the most compelling features of the GitHub Copilot SDK for .NET is real-time streaming. Instead of waiting for a complete response, you can display tokens as they arrive -- exactly like the Copilot chat experience in VS Code.
Streaming is enabled via AssistantMessageDeltaEvent, which fires for each token chunk as it arrives:
await using var client = new CopilotClient();
await client.StartAsync();
await using var session = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-5",
Streaming = true
});
var tcs = new TaskCompletionSource();
session.On(evt =>
{
switch (evt)
{
case AssistantMessageDeltaEvent delta:
// Print each token as it arrives -- no newline, streaming output
Console.Write(delta.Data.DeltaContent);
break;
case SessionIdleEvent:
Console.WriteLine(); // Newline after full response
tcs.TrySetResult();
break;
case SessionErrorEvent err:
tcs.TrySetException(new Exception(err.Data.Message));
break;
}
});
await session.SendAsync(new MessageOptions
{
Prompt = "Write a C# method that calculates the Fibonacci sequence."
});
await tcs.Task;
This streaming pattern is ideal for any UI where responsiveness matters -- web chat interfaces, CLI tools, and terminal applications all benefit significantly from displaying partial responses instead of making users wait.
Custom AI Tools with AIFunctionFactory
The GitHub Copilot SDK integrates with Microsoft.Extensions.AI for tool registration. This is the same tooling abstraction used by Semantic Kernel and other .NET AI libraries, making your tools portable across AI frameworks.
AIFunctionFactory.Create() wraps any C# method as an AI-callable function:
using Microsoft.Extensions.AI;
using GitHub.Copilot.SDK;
// Define tools as regular C# methods
string GetCurrentTime() => DateTime.UtcNow.ToString("O");
async Task<string> SearchDocumentationAsync(string query)
{
// In production, hit a real search API
await Task.Delay(50);
return $"Found 3 results for '{query}' in the documentation.";
}
// Register tools with the session
var tools = new[]
{
AIFunctionFactory.Create(GetCurrentTime, "get_current_time", "Returns the current UTC time"),
AIFunctionFactory.Create(SearchDocumentationAsync, "search_docs", "Searches the documentation for a given query")
};
await using var session = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-5",
Tools = tools
});
await session.SendAsync(new MessageOptions
{
Prompt = "What time is it, and can you search the docs for 'async patterns'?"
});
When the AI decides a tool is needed, it invokes the function automatically and incorporates the result into its response -- the same function calling pattern you'll see discussed in plugin architecture for .NET.
Multi-Model Support
One of the standout features of the GitHub Copilot SDK for .NET is first-class multi-model support. Rather than being locked to one model, you specify the model per session:
// Use gpt-5 for complex reasoning tasks
await using var reasoningSession = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-5"
});
// Use Claude Sonnet for creative writing and long-form content
await using var creativeSession = await client.CreateSessionAsync(new SessionConfig
{
Model = "claude-sonnet-4.5"
});
// Use GPT-4.1 for fast, lightweight tasks
await using var fastSession = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-4.1"
});
This flexibility is valuable for cost optimization and performance tuning -- routing different task types to the best-suited model without changing your application code.
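One way to centralize that routing is a small helper that maps a task category to a model id. The category names here are illustrative, not part of the SDK:

```csharp
// Hypothetical routing helper -- the task categories are this example's
// own convention; the model ids match those used throughout this guide.
static string SelectModel(string taskKind) => taskKind switch
{
    "reasoning" => "gpt-5",             // complex, multi-step tasks
    "creative"  => "claude-sonnet-4.5", // long-form and creative writing
    _           => "gpt-4.1"            // fast default for lightweight tasks
};

await using var session = await client.CreateSessionAsync(new SessionConfig
{
    Model = SelectModel("reasoning")
});
```

Keeping model selection in one function means swapping models later is a one-line change rather than a hunt through the codebase.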
Microsoft Agent Framework Integration
The GitHub Copilot SDK is designed to interoperate with the Microsoft Agent Framework, the same framework that underpins Semantic Kernel's Agent Framework. This means a Copilot-powered agent will be able to participate in multi-agent workflows alongside Semantic Kernel agents and AutoGen agents.
Note: Microsoft Agent Framework integration via AsAIAgent() is planned for the GitHub Copilot SDK but is not yet documented in the official SDK README at the time of writing. Check the official repository for the latest updates on this integration.
This convergence is documented in the Microsoft Semantic Kernel blog and represents a significant step toward a unified .NET AI agent ecosystem.
GitHub Copilot SDK vs. Semantic Kernel
Both the GitHub Copilot SDK and Semantic Kernel are first-class .NET AI frameworks from Microsoft's ecosystem. Choosing between them -- or deciding to use both -- is a common question.
The key distinction is that the Copilot SDK is optimized for developer workflow scenarios with tight GitHub integration, while Semantic Kernel is a general-purpose AI orchestration framework that is model-agnostic and enterprise-first. If you're building tools that work with code repositories, pull requests, or GitHub data, the Copilot SDK gives you native access to that context. For broader enterprise AI applications, Semantic Kernel's plugin ecosystem and vector store integrations are a better fit.
In practice, many teams use both: the Copilot SDK for developer-facing tools and the Microsoft Agent Framework for connecting them together. The cross-link article GitHub Copilot SDK vs. Semantic Kernel: When to Use Each covers this decision in depth once you've learned both frameworks.
If you're new to AI-powered .NET development, the getting started with AI coding tools guide is a great primer for the broader landscape before diving into either SDK.
Session Lifecycle and Resource Management
Because CopilotClient starts a background process and CopilotSession holds an active connection, proper disposal is critical to avoid leaking processes and connections. The SDK is designed for await using patterns throughout:
// Correct pattern: nested await using ensures proper disposal order
await using var client = new CopilotClient();
await client.StartAsync();
await using var session = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-5"
});
// Use session...
// session.DisposeAsync() called automatically → client.DisposeAsync() called automatically
The async/await patterns in C# you already know apply directly here. The event-driven session.On(...) API is non-blocking, and the TaskCompletionSource pattern for collecting responses is idiomatic .NET async code.
If you're using the SDK in ASP.NET Core, register CopilotClient as a singleton and create sessions per request or per conversation thread -- similar to how you'd scope a database connection. The IServiceCollection dependency injection patterns you're familiar with apply directly.
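A minimal-API sketch of that registration, under the stated assumptions (singleton client, per-request sessions); the /chat endpoint and its shape are illustrative, not part of the SDK:

```csharp
using GitHub.Copilot.SDK;

var builder = WebApplication.CreateBuilder(args);

// One CopilotClient for the app's lifetime -- it owns the CLI process.
builder.Services.AddSingleton(_ => new CopilotClient());

var app = builder.Build();

// Start the client once, before serving requests.
await app.Services.GetRequiredService<CopilotClient>().StartAsync();

app.MapPost("/chat", async (CopilotClient client) =>
{
    // A fresh session per request keeps conversation state isolated.
    await using var session = await client.CreateSessionAsync(new SessionConfig
    {
        Model = "gpt-5"
    });
    // ... send the prompt and collect the response using the
    // event-driven pattern shown earlier ...
    return Results.Ok();
});

app.Run();
```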
FAQ
The following questions are common when developers first encounter the GitHub Copilot SDK for .NET. These cover prerequisites, capabilities, and architectural decisions to help you get started confidently.
What is the GitHub Copilot SDK for .NET?
The GitHub Copilot SDK is a .NET library that exposes GitHub Copilot's AI agent runtime as a programmable API. It lets you build custom AI applications using Copilot's infrastructure -- including multi-turn conversations, tool calling, and streaming -- without going through the Copilot IDE extension.
Does the GitHub Copilot SDK require a Copilot subscription?
Yes -- the SDK requires an active GitHub Copilot subscription (individual, team, or enterprise). It also requires the GitHub Copilot CLI to be installed and authenticated on the machine running your application.
What models does the GitHub Copilot SDK support?
As of early 2026, the SDK confirms support for gpt-5 and claude-sonnet-4.5 in its official documentation. Other models (such as GPT-5.2, GPT-4.1) may be available depending on your Copilot subscription -- check the official repository for the current list. The model is specified per session in SessionConfig { Model = "model-id" }. Model availability depends on your Copilot subscription tier.
How do tools work in the GitHub Copilot SDK?
Tools are registered using AIFunctionFactory.Create() from Microsoft.Extensions.AI. The AI model automatically invokes registered tools when it determines they're needed to answer a user's request -- the same function calling pattern used in Semantic Kernel plugins.
Can I use the GitHub Copilot SDK in ASP.NET Core?
Yes -- register CopilotClient as a singleton in your DI container and create sessions per request or conversation. The SDK is fully compatible with .NET's dependency injection system and works in any .NET 8+ host.
Is the GitHub Copilot SDK production-ready?
As of early 2026, the SDK is labeled as a technical preview, and APIs may change between minor versions. It is suitable for internal tools and early-adopter production scenarios, but check the GitHub releases page for stability status before committing to it in critical production systems.
How does the GitHub Copilot SDK differ from Semantic Kernel?
The GitHub Copilot SDK is optimized for developer-workflow scenarios with deep GitHub integration. Semantic Kernel is a model-agnostic general-purpose AI orchestration framework. Both are from Microsoft's ecosystem and can interoperate via the Microsoft Agent Framework. Use the Copilot SDK for developer tools and GitHub-centric workflows; use Semantic Kernel for enterprise AI applications with custom data, RAG, and broad LLM provider support.
Wrapping Up
The GitHub Copilot SDK for .NET opens up Copilot's full agentic capabilities for .NET developers. With CopilotClient and CopilotSession as your building blocks, you can build streaming AI assistants, custom developer tools, and multi-agent workflows -- all backed by the same runtime that powers GitHub Copilot CLI.
This series dives deep into each capability. Next: a complete guide to getting started with the GitHub Copilot SDK in C#, covering installation, your first conversation, session lifecycle management, and the streaming event model in detail.
