Reducing a sequence down to a single meaningful value is one of the most frequent operations in any data-intensive .NET application. LINQ aggregation in C# covers everything from the mundane (Count, Sum) to the flexible (Aggregate), and .NET 6 added MinBy and MaxBy -- a long-overdue pair of operators that return the element with the minimum or maximum projected value, not just the value itself. This article walks through every aggregation operator, explains the Count vs Any performance trap, and shows how Aggregate can replace multiple passes over a sequence with a single, efficient fold.
The Domain Model
All examples use an Order, Product, and SalesData domain:
namespace DevLeader.LinqAggregation;
public record Order(int Id, string CustomerId, string Status, decimal Total, DateTimeOffset PlacedAt);
public record Product(int Id, string Name, string Category, decimal Price, int StockLevel);
public record SalesData(string Region, string ProductId, int UnitsSold, decimal Revenue, DateTimeOffset Period);
Count and LongCount
Count() returns the number of elements in a sequence as an int. Count(predicate) counts only the elements satisfying the predicate:
namespace DevLeader.LinqAggregation;
IEnumerable<Order> orders = GetOrders();
int total = orders.Count();
int pendingCount = orders.Count(o => o.Status == "Pending");
int paidCount = orders.Count(o => o.Status == "Paid");
Console.WriteLine($"Total: {total}, Pending: {pendingCount}, Paid: {paidCount}");
For sequences with more than int.MaxValue elements (about 2.1 billion), use LongCount(), which returns long:
namespace DevLeader.LinqAggregation;
// For extremely large datasets, e.g. log aggregation
long recordCount = GetAllAuditLogs().LongCount();
In practice, you'll rarely need LongCount -- but if you're processing telemetry or log data at scale, it's there.
Count vs Any -- A Real Performance Trap
This is one of the most common LINQ performance mistakes:
namespace DevLeader.LinqAggregation;
IEnumerable<Order> orders = GetOrders();
// ❌ Count() evaluates the entire sequence -- O(n)
if (orders.Count(o => o.Status == "Pending") > 0)
{
Console.WriteLine("There are pending orders.");
}
// ✅ Any() stops at the first match -- O(1) best case
if (orders.Any(o => o.Status == "Pending"))
{
Console.WriteLine("There are pending orders.");
}
The difference matters most when the sequence is large, when the source is a database query, or when the enumerable is expensive to materialize. Any(predicate) short-circuits on the first matching element. Count(predicate) must evaluate every element to produce an accurate count, even if the answer to "is there at least one?" would be known immediately.
A related trap: checking Count() == 0 to test for an empty sequence:
// ❌ Enumerates the whole sequence
if (orders.Count() == 0) { }
// ✅ Stops at first element
if (!orders.Any()) { }
Use Any() for existence checks, Count() only when you genuinely need the number.
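The short-circuit difference is easy to observe with an instrumented sequence. This is a standalone sketch -- the Numbers iterator and its pull counters are illustrative, not part of the domain model above:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical helper: yields 1..1000 while reporting each element actually pulled.
static IEnumerable<int> Numbers(Action onYield)
{
    for (int i = 1; i <= 1_000; i++)
    {
        onYield();
        yield return i;
    }
}

int pulledByAny = 0;
bool hasEven = Numbers(() => pulledByAny++).Any(n => n % 2 == 0);
Console.WriteLine($"Any pulled {pulledByAny} element(s)");     // stops at 2, the first even number

int pulledByCount = 0;
bool hasEvenViaCount = Numbers(() => pulledByCount++).Count(n => n % 2 == 0) > 0;
Console.WriteLine($"Count pulled {pulledByCount} element(s)"); // enumerates all 1,000
```

Both branches answer the same question, but Any pulls two elements while Count pulls a thousand.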
Sum and Average
Sum(selector) and Average(selector) accept a selector to project each element to a numeric value before aggregating:
namespace DevLeader.LinqAggregation;
IEnumerable<Order> orders = GetOrders();
decimal totalRevenue = orders.Sum(o => o.Total);
decimal averageOrder = orders.Average(o => o.Total);
Console.WriteLine($"Total Revenue: ${totalRevenue:F2}");
Console.WriteLine($"Average Order: ${averageOrder:F2}");
For nullable numeric types, Sum returns 0 if the sequence is empty or all elements are null -- it never returns null. Average throws InvalidOperationException on an empty sequence -- guard with Any() or a null-conditional approach:
namespace DevLeader.LinqAggregation;
IEnumerable<Order> recentOrders = GetRecentOrders();
decimal? safeAverage = recentOrders.Any()
? recentOrders.Average(o => o.Total)
: null;
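The Any() guard works, but it enumerates the source twice. Two single-pass alternatives: project to a nullable type so the nullable Average overload (which returns null on empty) is selected, or supply a fallback element with DefaultIfEmpty. A minimal sketch with plain decimals:

```csharp
using System;
using System.Linq;

var empty = Enumerable.Empty<decimal>();

// Casting to a nullable type selects the nullable overload,
// which returns null for an empty sequence instead of throwing.
decimal? safeAverage = empty.Average(v => (decimal?)v);
Console.WriteLine(safeAverage is null ? "no data" : $"{safeAverage:F2}"); // prints "no data"

// DefaultIfEmpty is another option when a fallback value makes sense.
decimal averageOrZero = empty.DefaultIfEmpty(0m).Average();
Console.WriteLine(averageOrZero); // 0
```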
Practical example -- revenue by region:
namespace DevLeader.LinqAggregation;
IEnumerable<SalesData> sales = GetSalesData();
var revenueByRegion = sales
.GroupBy(s => s.Region)
.Select(g => new
{
Region = g.Key,
TotalRevenue = g.Sum(s => s.Revenue),
AverageUnits = g.Average(s => s.UnitsSold)
})
.OrderByDescending(x => x.TotalRevenue);
foreach (var row in revenueByRegion)
{
Console.WriteLine($"{row.Region}: ${row.TotalRevenue:F0} revenue, {row.AverageUnits:F1} avg units");
}
Min and Max
Min(selector) and Max(selector) return the minimum or maximum projected value -- a scalar, not the element:
namespace DevLeader.LinqAggregation;
IEnumerable<Product> products = GetProducts();
decimal cheapest = products.Min(p => p.Price);
decimal mostExpensive = products.Max(p => p.Price);
Console.WriteLine($"Price range: ${cheapest:F2} -- ${mostExpensive:F2}");
The limitation: if you need the cheapest product (not just its price), you'd previously do this:
// Before .NET 6 -- find the product with the minimum price
decimal minPrice = products.Min(p => p.Price);
Product? cheapestProduct = products.FirstOrDefault(p => p.Price == minPrice);
This enumerates the sequence twice, and if multiple products share the minimum price, FirstOrDefault simply picks whichever happens to come first -- which may or may not be the intended behavior. Worse, if the enumerable is regenerated on each pass (a lazy pipeline or a fresh database query), the second pass may not even contain the element whose price the first pass found.
MinBy and MaxBy (.NET 6): Return the Element, Not the Value
MinBy(keySelector) and MaxBy(keySelector) return the element that has the minimum or maximum key value -- in a single pass, with no double enumeration:
namespace DevLeader.LinqAggregation;
IEnumerable<Product> products = GetProducts();
// Before .NET 6 -- two passes, potential bug if min price is shared
decimal minPrice = products.Min(p => p.Price);
Product? cheapestOld = products.FirstOrDefault(p => p.Price == minPrice);
// .NET 6 -- single pass, returns the element directly
Product? cheapestNew = products.MinBy(p => p.Price);
Product? mostExpensive = products.MaxBy(p => p.Price);
Console.WriteLine($"Cheapest: {cheapestNew?.Name} at ${cheapestNew?.Price:F2}");
Console.WriteLine($"Most expensive: {mostExpensive?.Name} at ${mostExpensive?.Price:F2}");
When multiple elements share the same minimum/maximum key, MinBy and MaxBy return the first one encountered -- consistent with OrderBy().First() semantics.
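The tie-breaking rule is easy to verify in isolation. A minimal sketch using tuples in place of the Product record:

```csharp
using System;
using System.Linq;

var items = new[]
{
    (Name: "Widget", Price: 9.99m),
    (Name: "Gadget", Price: 9.99m),  // same minimum price
    (Name: "Gizmo", Price: 19.99m),
};

// MinBy returns the first element it encounters with the minimum key.
var cheapest = items.MinBy(i => i.Price);
Console.WriteLine(cheapest.Name); // "Widget"
```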
Real-world scenario -- finding the best-performing sales region:
namespace DevLeader.LinqAggregation;
IEnumerable<SalesData> sales = GetSalesData();
// Most revenue in a single period entry
SalesData? topPeriod = sales.MaxBy(s => s.Revenue);
Console.WriteLine($"Top period: {topPeriod?.Region} -- ${topPeriod?.Revenue:F2}");
// Worst-performing region by total revenue
var revenueSummary = sales
.GroupBy(s => s.Region)
.Select(g => (Region: g.Key, Total: g.Sum(s => s.Revenue)));
(string Region, decimal Total) worstRegion = revenueSummary.MinBy(r => r.Total);
Console.WriteLine($"Lowest revenue region: {worstRegion.Region} (${worstRegion.Total:F2})");
MinBy and MaxBy also pair naturally with the C# enum switch pattern when you need to select the "best" option from an enum-keyed result set.
Aggregate: Custom Reduction
Aggregate is the most general aggregation operator. It takes a seed value and an accumulator function, folding each element into the accumulated result:
namespace DevLeader.LinqAggregation;
IEnumerable<Order> orders = GetOrders();
// Custom: concatenate order IDs as a comma-separated string
string orderList = orders.Aggregate(
string.Empty,
(acc, order) => string.IsNullOrEmpty(acc)
? order.Id.ToString()
: $"{acc},{order.Id}");
Console.WriteLine($"Order IDs: {orderList}");
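For plain concatenation like this, note that string.Join is simpler and cheaper -- the Aggregate fold builds a new intermediate string per element, while Join writes into a single buffer. The fold is shown for illustration; in production code you'd likely write:

```csharp
using System;

int[] orderIds = [101, 102, 103];

// Same result as the Aggregate fold, without per-element intermediate strings.
string joined = string.Join(",", orderIds);
Console.WriteLine(joined); // "101,102,103"
```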
Practical use case -- computing a running product discount:
namespace DevLeader.LinqAggregation;
decimal[] discounts = [0.10m, 0.05m, 0.15m]; // 10%, 5%, 15%
// Combined discount factor (multiplicative)
decimal combinedFactor = discounts.Aggregate(
1.0m,
(factor, discount) => factor * (1 - discount));
Console.WriteLine($"Combined factor: {combinedFactor:P2}"); // 0.90 * 0.95 * 0.85 = 72.68%
Aggregate with Result Selector
The three-argument overload adds a final projection step, applied to the accumulated result after all elements have been processed:
namespace DevLeader.LinqAggregation;
IEnumerable<SalesData> sales = GetSalesData();
// Aggregate to (total revenue, count), then project to average
decimal averageRevenue = sales.Aggregate(
seed: (Total: 0m, Count: 0),
func: (acc, s) => (acc.Total + s.Revenue, acc.Count + 1),
resultSelector: acc => acc.Count > 0 ? acc.Total / acc.Count : 0m);
Console.WriteLine($"Average Revenue: ${averageRevenue:F2}");
This is a single-pass average -- useful when you want to avoid iterating the sequence twice (once for Sum, once for Count).
Multiple Aggregations in One Pass
Computing several aggregations over the same sequence naively means iterating it multiple times. For IEnumerable sources backed by expensive computation (network calls, file reads, EF Core queries), this matters. Use Aggregate to fold multiple accumulators together:
namespace DevLeader.LinqAggregation;
public record AggregateSummary(
int Count,
decimal Sum,
decimal Min,
decimal Max);
IEnumerable<Order> orders = GetOrders();
AggregateSummary summary = orders.Aggregate(
new AggregateSummary(0, 0m, decimal.MaxValue, decimal.MinValue),
(acc, order) => new AggregateSummary(
acc.Count + 1,
acc.Sum + order.Total,
Math.Min(acc.Min, order.Total),
Math.Max(acc.Max, order.Total)));
decimal average = summary.Count > 0 ? summary.Sum / summary.Count : 0m;
Console.WriteLine($"Count: {summary.Count}");
Console.WriteLine($"Sum: ${summary.Sum:F2}");
Console.WriteLine($"Min: ${summary.Min:F2}");
Console.WriteLine($"Max: ${summary.Max:F2}");
Console.WriteLine($"Average: ${average:F2}");
One pass, four values. This pattern is particularly valuable in CQRS query handlers where the query result needs multiple statistics and the data source is a database or external service.
TryGetNonEnumeratedCount (.NET 6)
Before computing aggregations that require the count (e.g., computing an average without iterating twice), check whether the count is available without enumerating:
namespace DevLeader.LinqAggregation;
IEnumerable<Order> orders = GetOrders();
// .NET 6: get count without enumerating if the source supports it (List, Array, etc.)
if (orders.TryGetNonEnumeratedCount(out int count))
{
Console.WriteLine($"Fast count: {count}");
// Can now use this count without iterating
}
else
{
// Fallback: must enumerate to count
count = orders.Count();
}
TryGetNonEnumeratedCount returns true for List<T>, arrays, HashSet<T>, Dictionary<TKey, TValue>, and other types that implement ICollection<T>. It returns false for lazy pipelines, GroupBy results, and any IEnumerable that must be evaluated to know its length.
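A quick sketch of both outcomes -- a List&lt;T&gt; answers immediately, while a Where pipeline cannot know its length without running the predicate:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

List<int> list = [1, 2, 3];
IEnumerable<int> lazy = list.Where(n => n > 1);

bool fromList = list.TryGetNonEnumeratedCount(out int listCount); // true, count = 3
bool fromLazy = lazy.TryGetNonEnumeratedCount(out int lazyCount); // false, count = 0

Console.WriteLine($"List: {fromList} ({listCount}), Where: {fromLazy} ({lazyCount})");
```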
Choosing the Right Aggregation Operator
| Scenario | Operator |
|---|---|
| Test for any matching element | Any(predicate) |
| Count elements | Count() / Count(predicate) |
| Sum of projected values | Sum(selector) |
| Average of projected values | Average(selector) |
| Minimum/maximum scalar value | Min(selector) / Max(selector) |
| Element with minimum key (.NET 6+) | MinBy(keySelector) |
| Element with maximum key (.NET 6+) | MaxBy(keySelector) |
| Custom per-key accumulation (.NET 9) | AggregateBy |
| Key frequency count (.NET 9) | CountBy |
| General fold with seed | Aggregate(seed, func) |
| General fold with result projection | Aggregate(seed, func, resultSelector) |
If you're targeting .NET 9, prefer CountBy and AggregateBy over GroupBy for per-key aggregation -- they're cleaner and more expressive for that specific case.
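For reference, here is what those operators look like. This sketch requires .NET 9, and the status strings and order tuples are illustrative:

```csharp
using System;
using System.Linq;

string[] statuses = ["Pending", "Paid", "Pending", "Shipped", "Pending"];

// CountBy: key frequency count without materializing groups.
// Yields KeyValuePair<string, int>.
foreach (var (status, count) in statuses.CountBy(s => s))
{
    Console.WriteLine($"{status}: {count}");
}

(string Status, decimal Total)[] orders =
[
    ("Pending", 10m), ("Paid", 25m), ("Pending", 5m),
];

// AggregateBy: a per-key fold -- each key gets its own accumulator
// starting from the seed.
foreach (var (status, total) in orders.AggregateBy(o => o.Status, 0m, (acc, o) => acc + o.Total))
{
    Console.WriteLine($"{status}: {total}");
}
```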
Architecture Considerations
Aggregation logic tends to accumulate in service layers. A few patterns to keep it maintainable:
- Encapsulate complex aggregations in query objects. The CQRS with feature slices pattern is an excellent fit -- a GetSalesSummaryQuery returns a typed result that already encapsulates the aggregation logic.
- Use the observer pattern for incremental aggregation. Instead of re-aggregating on every read, push new events through an observer that updates a running accumulator.
- Enum-based filtering before aggregation is common. See the C# enum complete guide for clean patterns when filtering on status or category enums before calling Sum or Count.
- For feature-sliced codebases, aggregation queries belong inside the feature folder that owns the data -- not in a shared utility layer.
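The running-accumulator idea behind incremental aggregation can be sketched in a few lines. This is a minimal illustration -- the Observe helper and the sample totals are hypothetical, not a full observer implementation:

```csharp
using System;

// Running aggregate state: each new value updates it in O(1),
// so reads never re-aggregate the full history.
var stats = (Count: 0, Sum: 0m, Min: decimal.MaxValue, Max: decimal.MinValue);

void Observe(decimal value) =>
    stats = (stats.Count + 1, stats.Sum + value,
             Math.Min(stats.Min, value), Math.Max(stats.Max, value));

foreach (var total in new[] { 10m, 25m, 5m })
{
    Observe(total); // e.g., called from an OrderPlaced event handler
}

decimal average = stats.Count > 0 ? stats.Sum / stats.Count : 0m;
Console.WriteLine($"{stats.Count} orders, avg ${average:F2}");
```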
FAQ
Why should I use Any() instead of Count() > 0 in C#?
Any() short-circuits after finding the first matching element, giving O(1) best-case performance. Count() must evaluate every element in the sequence, resulting in O(n) performance regardless of whether one or a million matches exist. For existence checks, always prefer Any().
What is the difference between Min and MinBy in C#?
Min(selector) returns the minimum projected value -- a scalar like decimal or int. MinBy(keySelector) returns the element that has the minimum key value. For example, products.Min(p => p.Price) returns 9.99m, while products.MinBy(p => p.Price) returns the Product record with that price. MinBy was added in .NET 6.
How do I compute multiple aggregations in one pass over a sequence?
Use the Aggregate(seed, func) overload with a tuple or record as the seed, accumulating all values simultaneously. This avoids iterating the source multiple times. See the AggregateSummary example in this article for a complete pattern.
When should I use LongCount instead of Count?
Use LongCount() when your sequence may contain more than int.MaxValue (~2.1 billion) elements. In practice, this is rare and typically only relevant for large log, telemetry, or data warehouse processing scenarios.
Does Average throw an exception on an empty sequence?
Yes. Average() throws InvalidOperationException on an empty sequence. Guard with Any() before calling Average, or use the nullable overload Average(Func<T, decimal?>) with a null-conditional result.
Can I use Aggregate as a replacement for foreach loops?
Yes, when the operation is a reduction to a single result. If you need side effects (e.g., writing to a list or logging), a foreach loop is more readable and idiomatic. Reserve Aggregate for pure transformations where the result is a single accumulated value derived from the input sequence.
What is TryGetNonEnumeratedCount and when does it help?
TryGetNonEnumeratedCount (introduced in .NET 6) attempts to retrieve the element count of a sequence without iterating it. It succeeds for List<T>, arrays, and other ICollection<T> implementations. It is most useful when you need the count for pre-allocation or logging but want to avoid the performance cost of a full enumeration when the source already exposes a fast Count property.
Summary
LINQ aggregation in C# gives you a full toolkit for reducing sequences to meaningful values:
- Count and Any -- use Any for existence checks, Count only when you need the actual number.
- Sum, Average, Min, Max -- the standard numeric aggregators with selector support.
- MinBy and MaxBy (.NET 6) -- return the element, not just the value, in a single pass.
- Aggregate -- the general-purpose fold for custom reductions and multi-value single-pass aggregation.
- TryGetNonEnumeratedCount (.NET 6) -- fast count for collection-backed sequences.
These operators compose cleanly with grouping (see LINQ grouping), joining, and filtering to build complete data pipelines. Pair them with the decorator pattern to add caching or instrumentation on top of aggregation logic, or with the factory method pattern to inject configurable aggregation strategies at runtime.

