LINQ Element Access in C#: First, Last, Single, Chunk, and TryGetNonEnumeratedCount
When you're working with collections in .NET, knowing how to reach into a sequence and pull out exactly what you need is fundamental. LINQ element access in C# gives you a rich toolkit -- First, Last, Single, ElementAt, Chunk, and more -- that goes well beyond a plain [0] index lookup. Pick the wrong operator, though, and you're either silently swallowing bugs or throwing unnecessary exceptions. This guide walks through every major element-access operator, explains when each one is the right tool, and covers the .NET 6 additions that make batching and count checks far more efficient.
First() and FirstOrDefault() -- The Workhorse Pair
First() returns the first element that satisfies a predicate (or the first element of the sequence if no predicate is given). It throws InvalidOperationException if the sequence is empty or no element matches. FirstOrDefault() does the same but returns default(T) instead of throwing. In .NET 6+ you can supply an explicit fallback value.
using System.Linq;
namespace Store.Queries;
public static class ProductQueries
{
public static Product GetCheapestOrFail(IEnumerable<Product> products)
{
// Throws if products is empty
return products.OrderBy(p => p.Price).First();
}
public static Product? GetCheapestOrNull(IEnumerable<Product> products)
{
return products.OrderBy(p => p.Price).FirstOrDefault();
}
// .NET 6 -- explicit fallback value instead of null
public static Product GetCheapestOrSentinel(IEnumerable<Product> products)
{
var sentinel = new Product { Name = "None", Price = 0m };
return products.OrderBy(p => p.Price).FirstOrDefault(sentinel);
}
}
Use First() when an empty result is a programming error. Use FirstOrDefault() when "not found" is a normal, expected outcome -- such as a catalog search that may legitimately return no results.
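A minimal sketch of the behavioral difference (Product here is an assumed record, not from the snippets above):

```csharp
using System;
using System.Linq;

public record Product(string Name, decimal Price);

public static class FirstVsFirstOrDefaultDemo
{
    public static void Main()
    {
        var empty = Array.Empty<Product>();

        // FirstOrDefault: "not found" is a normal outcome -- returns null
        Product? maybe = empty.FirstOrDefault();
        Console.WriteLine(maybe is null); // True

        // First: an empty result is a programming error -- throws
        try
        {
            empty.First();
        }
        catch (InvalidOperationException)
        {
            Console.WriteLine("threw as expected");
        }
    }
}
```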
Last() and LastOrDefault()
Last() and LastOrDefault() mirror First() but return the final element of the sequence. On a plain IEnumerable<T> these are O(n) because the runtime must read every element to reach the tail. On an IList<T> (arrays, List<T>), LINQ short-circuits to a direct index lookup -- O(1).
namespace Orders.Queries;
public static class OrderQueries
{
public static Order? GetMostRecentOrder(IEnumerable<Order> orders)
{
return orders
.OrderBy(o => o.PlacedAt)
.LastOrDefault();
}
public static Order GetMostRecentOrFail(IEnumerable<Order> orders)
{
return orders
.OrderBy(o => o.PlacedAt)
.Last();
}
}
If you find yourself writing OrderBy(...).Last(), consider flipping the sort direction and calling First() instead -- OrderByDescending(...).First() produces the same result, and the intent ("newest first, take one") reads more naturally.
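A sketch of the two equivalent formulations, using an assumed Order record with a PlacedAt timestamp:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public record Order(int Id, DateTime PlacedAt);

public static class LatestOrderDemo
{
    // Same result as orders.OrderBy(o => o.PlacedAt).Last(),
    // but reads as "newest first, take one"
    public static Order GetLatest(IEnumerable<Order> orders)
        => orders.OrderByDescending(o => o.PlacedAt).First();
}
```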
Single() and SingleOrDefault() -- When Exactly One Is Required
Single() throws if the sequence contains more than one matching element or if it contains none. That double-throw behavior is what distinguishes it from First().
namespace Catalog.Queries;
public static class CatalogQueries
{
// Enforces business rule: a SKU must be unique
public static Product GetBySku(IEnumerable<Product> products, string sku)
{
return products.Single(p => p.Sku == sku);
}
// Returns null if the SKU doesn't exist, still throws on duplicates
public static Product? FindBySku(IEnumerable<Product> products, string sku)
{
return products.SingleOrDefault(p => p.Sku == sku);
}
}
When you want to assert a data integrity invariant, Single() is your canary in the coal mine. A duplicate SKU is a bug -- let it throw loudly rather than silently returning an arbitrary match.
This matters when building feature-sliced handlers that look up domain entities by ID: using Single() in your query layer surfaces data-integrity problems at the exact point they occur rather than hours later as a downstream corruption.
ElementAt() and ElementAtOrDefault()
These operators give you positional access into a sequence without materializing the whole thing. When the underlying type is IList<T>, LINQ short-circuits to a direct index access -- O(1); otherwise it enumerates up to the requested position.
namespace Catalog.Queries;
public static class PaginatedCatalog
{
public static Product? GetAtPosition(IEnumerable<Product> products, int index)
{
// Returns default(Product?) if index is out of range
return products.ElementAtOrDefault(index);
}
}
.NET 6+ supports System.Index syntax for indexing from the end:
namespace Catalog.Queries;
public static class IndexExamples
{
public static Product? GetLastProduct(IEnumerable<Product> products)
=> products.ElementAtOrDefault(^1); // same as LastOrDefault() but index-based
public static Product? GetSecondToLast(IEnumerable<Product> products)
=> products.ElementAtOrDefault(^2);
}
Take(), Skip(), and Pagination
Pagination is one of the most common element-access patterns. Take(n) grabs the first n elements; Skip(n) discards the first n. Together they implement offset-based pagination with minimal ceremony.
namespace Store.Queries;
public record PagedQuery(int PageNumber, int PageSize);
public static class PagingExtensions
{
public static IEnumerable<T> Page<T>(this IEnumerable<T> source, PagedQuery query)
{
return source
.Skip((query.PageNumber - 1) * query.PageSize)
.Take(query.PageSize);
}
}
// Usage:
// var page2 = products.OrderBy(p => p.Name).Page(new PagedQuery(2, 20));
For database-backed queries via EF Core, Skip and Take translate directly to OFFSET / FETCH NEXT SQL clauses -- no in-memory buffering required. This connects naturally to CQRS query handlers where each query handler encapsulates one paged read operation.
TakeLast() and SkipLast()
TakeLast(n) returns the last n elements; SkipLast(n) discards the last n. These are useful when you care about the tail of a time-ordered or ranked sequence.
namespace Orders.Reports;
public static class OrderReports
{
public static IEnumerable<Order> GetLastNOrders(IEnumerable<Order> orders, int n)
{
return orders.OrderBy(o => o.PlacedAt).TakeLast(n);
}
public static IEnumerable<Order> ExcludeLatestN(IEnumerable<Order> orders, int n)
{
return orders.OrderBy(o => o.PlacedAt).SkipLast(n);
}
}
TakeLast and SkipLast must enumerate toward the end of the source before producing results, but they buffer only up to n elements internally (a sliding queue), not the whole sequence. TakeLast cannot yield anything until it has seen the end -- only then does it know which elements qualify -- while SkipLast yields each element n positions behind its read cursor.
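A small sketch to observe that behavior. The logging Source iterator is illustrative -- it records each element as it is pulled, showing that TakeLast defers all work until enumeration, then reads the source to its end:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class TakeLastDemo
{
    // Yields 1..5, logging each element into 'pulled' as it is read
    public static IEnumerable<int> Source(List<int> pulled)
    {
        for (int i = 1; i <= 5; i++)
        {
            pulled.Add(i);
            yield return i;
        }
    }

    public static void Main()
    {
        var pulled = new List<int>();
        var lastTwo = Source(pulled).TakeLast(2);

        // Nothing enumerated yet -- TakeLast is deferred
        Console.WriteLine(pulled.Count); // 0

        // Enumerating forces a read to the end of the source
        var result = lastTwo.ToList();
        Console.WriteLine(string.Join(",", result)); // 4,5
        Console.WriteLine(pulled.Count); // 5
    }
}
```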
TakeWhile() and SkipWhile()
Unlike the fixed-count variants, these predicate-based operators stop (or start) based on a condition evaluated against each element.
namespace Inventory.Reports;
public static class StockReport
{
// Take products while in stock -- stops at the first out-of-stock item
public static IEnumerable<Product> TakeWhileInStock(IEnumerable<Product> products)
{
return products.TakeWhile(p => p.StockCount > 0);
}
// Skip over out-of-stock header products then return the rest
public static IEnumerable<Product> SkipOutOfStockPrefix(IEnumerable<Product> products)
{
return products.SkipWhile(p => p.StockCount == 0);
}
}
The key gotcha: TakeWhile and SkipWhile stop evaluating the predicate once it first fails. They do not skip or take elements scattered throughout the sequence -- only those forming a contiguous run at the beginning. If you need to filter elements anywhere in the sequence, use Where instead.
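A side-by-side sketch of the difference on the same sequence:

```csharp
using System;
using System.Linq;

public static class TakeWhileVsWhereDemo
{
    public static void Main()
    {
        int[] stock = { 3, 5, 0, 7, 2 };

        // TakeWhile: contiguous prefix only -- stops at the first 0
        var prefix = stock.TakeWhile(n => n > 0);
        Console.WriteLine(string.Join(",", prefix)); // 3,5

        // Where: filters the entire sequence
        var inStock = stock.Where(n => n > 0);
        Console.WriteLine(string.Join(",", inStock)); // 3,5,7,2
    }
}
```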
Chunk(size) -- .NET 6 Batch Processing
One of the most practically useful additions in .NET 6 is Chunk(size). It splits a sequence into T[] arrays of at most size elements. Before .NET 6, developers often reached for hacky Skip/Take loops or third-party libraries.
// Before .NET 6 -- manual batching with yield
namespace Notifications.Legacy;
public static class LegacyBatchSender
{
public static IEnumerable<IEnumerable<Order>> BatchOrders(IEnumerable<Order> orders, int batchSize)
{
var batch = new List<Order>(batchSize);
foreach (var order in orders)
{
batch.Add(order);
if (batch.Count == batchSize)
{
yield return batch.ToArray();
batch.Clear();
}
}
if (batch.Count > 0)
{
yield return batch.ToArray();
}
}
}
// .NET 6 -- Chunk()
namespace Notifications;
public static class BatchSender
{
public static async Task SendOrderNotificationsAsync(
IEnumerable<Order> orders,
INotificationService service,
CancellationToken ct)
{
foreach (var batch in orders.Chunk(50))
{
// batch is Order[] -- safe to pass to array-expecting APIs
await service.SendBatchAsync(batch, ct);
}
}
}
Chunk returns IEnumerable<T[]> -- each chunk is a materialized array. This is intentional: it ensures you won't accidentally enumerate the source multiple times per batch, and each chunk is safe to hand off to APIs that expect T[] or IReadOnlyList<T>.
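A quick demo of the chunk shapes, including the smaller final chunk when the source length isn't evenly divisible:

```csharp
using System;
using System.Linq;

public static class ChunkDemo
{
    public static void Main()
    {
        var ids = Enumerable.Range(1, 7); // 7 elements, chunk size 3

        foreach (int[] chunk in ids.Chunk(3))
        {
            Console.WriteLine($"[{string.Join(",", chunk)}]");
        }
        // [1,2,3]
        // [4,5,6]
        // [7]  <- the last chunk holds the remainder
    }
}
```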
This is especially handy when working with plugin-style architectures where different processors need to consume work in discrete chunks, or when building handlers that process commands in batches. It also pairs cleanly with the factory method pattern for constructing per-batch processor instances without leaking batch-management logic into the factory itself.
TryGetNonEnumeratedCount() -- .NET 6 Performance Optimization
TryGetNonEnumeratedCount answers the question: "do I know the count of this sequence without having to enumerate it?" It returns true and fills an out int count if the underlying type implements ICollection<T>, IReadOnlyCollection<T>, or similar optimizable interfaces. Otherwise it returns false without touching the sequence at all.
namespace Store.Diagnostics;
public static class CollectionDiagnostics
{
public static void PrintInfo<T>(IEnumerable<T> source)
{
if (source.TryGetNonEnumeratedCount(out int count))
{
Console.WriteLine($"Count (O(1)): {count}");
}
else
{
Console.WriteLine("Count unknown without enumeration.");
}
}
}
Where this really shines is in pre-allocation decisions:
namespace Store.Queries;
public static class MaterializationHelper
{
public static List<T> ToOptimizedList<T>(IEnumerable<T> source)
{
var list = source.TryGetNonEnumeratedCount(out int count)
? new List<T>(count) // pre-allocate exact capacity -- no resizing
: new List<T>();
list.AddRange(source);
return list;
}
}
Instead of allocating a default-capacity list and resizing it multiple times as elements stream in, you allocate exactly the right capacity up front -- cutting GC pressure on large sequences.
Choosing the Right LINQ Element-Access Operator
Here's a quick decision guide:
| Scenario | Operator |
|---|---|
| Need first match, empty = bug | First() |
| Need first match, empty = valid | FirstOrDefault() |
| Exactly one must exist | Single() |
| At most one should exist | SingleOrDefault() |
| Fixed positional access | ElementAt() / ElementAtOrDefault() |
| Offset-based pagination | Skip() + Take() |
| Last N elements | TakeLast(n) |
| Content-based slicing | TakeWhile() / SkipWhile() |
| Batch processing | Chunk(n) (.NET 6) |
| Fast count check | TryGetNonEnumeratedCount() (.NET 6) |
Making smart operator choices becomes especially important in CQRS with feature slices because your query handlers should make their intent explicit -- is this handler expecting exactly one result, or just any result? The operator you choose communicates that invariant directly in code.
Real-World Example -- Paginated Product Catalog
Putting it all together into a realistic handler:
namespace Catalog.Features.Browse;
public record BrowseQuery(string? CategoryFilter, int Page, int PageSize);
public record BrowseResult(IReadOnlyList<Product> Items, int TotalCount, bool HasMore);
public sealed class BrowseProductsHandler
{
private readonly IProductRepository _repository;
public BrowseProductsHandler(IProductRepository repository)
{
_repository = repository;
}
public BrowseResult Handle(BrowseQuery query)
{
var source = _repository.GetAll();
if (query.CategoryFilter is not null)
{
source = source.Where(p => p.Category == query.CategoryFilter);
}
var ordered = source.OrderBy(p => p.Name);
// Use TryGetNonEnumeratedCount to avoid a second full scan when possible
if (!ordered.TryGetNonEnumeratedCount(out int totalCount))
{
totalCount = ordered.Count();
}
var items = ordered
.Skip((query.Page - 1) * query.PageSize)
.Take(query.PageSize)
.ToList();
return new BrowseResult(
items,
totalCount,
HasMore: (query.Page * query.PageSize) < totalCount);
}
}
TryGetNonEnumeratedCount avoids a second full enumeration if the repository returns a materialized list, while gracefully falling back to Count() when the underlying source doesn't support non-enumerated counting.
You can extend this with observer-pattern notifications when browsed results need to trigger downstream events, or connect category-based enum filtering when your product categories are modelled as C# enums.
Frequently Asked Questions
What is the difference between First() and Single() in LINQ?
First() returns the first matching element and ignores any duplicates in the sequence. Single() asserts exactly one match exists and throws InvalidOperationException if zero or more than one element matches. Use Single() when duplicate results would indicate a data integrity problem -- such as two records sharing what should be a unique identifier.
When should I use FirstOrDefault() vs SingleOrDefault()?
Use FirstOrDefault() when you just want one element and don't care if there are more. Use SingleOrDefault() when your domain logic says at most one result should ever exist (like a lookup by unique ID) and you want the runtime to enforce that invariant at query time rather than silently returning a random match.
How does Chunk() work in .NET 6?
Chunk(size) splits a sequence into T[] arrays of at most size elements. The last chunk may be smaller than size if the source length isn't evenly divisible. It returns IEnumerable<T[]>, so each chunk is already materialized and safe to iterate multiple times or pass to array-expecting APIs.
What does TryGetNonEnumeratedCount() do?
It checks whether the element count of a sequence can be determined without iterating over every element. If the underlying collection implements ICollection<T> or IReadOnlyCollection<T>, it returns true and provides the count via an out parameter in O(1). If not, it returns false without touching the sequence. This is useful for pre-allocating buffers and for paged result totals that would otherwise require a second full scan.
Does ElementAt() support indexing from the end?
Yes. In .NET 6+, you can use System.Index syntax: source.ElementAt(^1) returns the last element, source.ElementAt(^2) returns the second-to-last, and so on. For collections that implement IList<T>, this resolves to a direct index access without scanning the whole sequence.
Is it safe to call Last() on a large sequence?
It depends on the source type. On a plain IEnumerable<T>, Last() must scan every element -- O(n). On an IList<T>, LINQ short-circuits to a direct index access -- O(1). For EF Core IQueryable<T>, Last() requires an explicit ordering and translates to a reversed ORDER BY with a single-row fetch, which is efficient when the sort column is indexed.
When should I prefer TakeWhile() over Take()?
Use TakeWhile() when the stopping condition is based on element content rather than a fixed count. A common example is a time-ordered event log where you want events until you hit one older than a threshold: events.TakeWhile(e => e.Timestamp >= cutoff). Just remember it stops at the first failing element -- it does not skip non-matching elements scattered throughout the sequence.