Each new .NET version brings performance enhancements, optimizations, and new features that elevate productivity and application efficiency.
The release of .NET 9 is no exception, bringing a variety of enhancements, including performance improvements, new types and additional methods.
In today’s post, we’ll dive into the key advancements it offers.
LINQ Performance Improvements
When it comes to performance, .NET 9 truly raises the bar.
One of the standout enhancements is the optimization of LINQ methods. Not only have three new methods been introduced (covered in a previous blog post), but existing methods have also been significantly improved.
Fun fact: LINQ used to be significantly slower, which made it less popular. After years of steady improvements, it's impressive to see the team take yet another step forward.
In this blog post, I’ve highlighted a few improvements that I’m excited to share:
[Benchmark]
public bool Any() => _list.Any(i => i == 1000);
[Benchmark]
public bool All() => _list.All(i => i >= 0);
[Benchmark]
public int Count() => _list.Count(i => i == 0);
[Benchmark]
public int First() => _list.First(i => i == 999);
[Benchmark]
public int Single() => _list.Single(i => i == 0);
[Benchmark]
public object Chunk() => _values.Chunk(10);
[Benchmark]
public object Distinct() => _values.Distinct();
[Benchmark]
public object GroupJoin() => _values.GroupJoin(_values, i => i, i => i, (i, j) => i);
[Benchmark]
public object Join() => _values.Join(_values, i => i, i => i, (i, j) => i);
[Benchmark]
public object ToLookup() => _values.ToLookup(i => i);
[Benchmark]
public object Reverse() => _values.Reverse();
[Benchmark]
public object SelectIndex() => _values.Select((s, i) => i);
[Benchmark]
public object SelectMany() => _values.SelectMany(i => i);
[Benchmark]
public object SkipWhile() => _values.SkipWhile(i => true);
[Benchmark]
public object TakeWhile() => _values.TakeWhile(i => true);
[Benchmark]
public object WhereIndex() => _values.Where((s, i) => true);
[Benchmark]
public int DistinctFirst() => _arrayDistinct.First();
[Benchmark]
public int AppendSelectLast() => _appendSelect.Last();
[Benchmark]
public int RangeReverseCount() => _rangeReverse.Count();
[Benchmark]
public int DefaultIfEmptySelectElementAt() => _listDefaultIfEmptySelect.ElementAt(999);
[Benchmark]
public int ListSkipTakeElementAt() => _listSkipTake.ElementAt(99);
[Benchmark]
public int RangeUnionFirst() => _rangeUnion.First();
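The benchmarks above assume a handful of fields initialized in a [GlobalSetup] method. That setup isn't shown here, so below is a minimal sketch of what it might look like; the sizes and exact query compositions are my assumptions, chosen only to match how the fields are used:

private List<int> _list;
private IEnumerable<int>[] _values;
private IEnumerable<int> _arrayDistinct;
private IEnumerable<int> _appendSelect;
private IEnumerable<int> _rangeReverse;
private IEnumerable<int> _listDefaultIfEmptySelect;
private IEnumerable<int> _listSkipTake;
private IEnumerable<int> _rangeUnion;

[GlobalSetup]
public void Setup()
{
    _list = Enumerable.Range(0, 1_000).ToList();

    // Each element is itself a sequence so that SelectMany(i => i) compiles.
    _values = Enumerable.Range(0, 10)
        .Select(_ => (IEnumerable<int>)Enumerable.Range(0, 10).ToArray())
        .ToArray();

    // Pre-composed queries used by the second set of benchmarks.
    _arrayDistinct = Enumerable.Range(0, 1_000).ToArray().Distinct();
    _appendSelect = _list.Append(42).Select(i => i * 2);
    _rangeReverse = Enumerable.Range(0, 1_000).Reverse();
    _listDefaultIfEmptySelect = _list.DefaultIfEmpty().Select(i => i);
    _listSkipTake = _list.Skip(100).Take(100);
    _rangeUnion = Enumerable.Range(0, 1_000).Union(Enumerable.Range(500, 1_000));
}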
Some of the examples and insights were derived from a blog post by Stephen Toub, so be sure to check it out!
The results are truly remarkable:
| Method | Runtime | Mean | Error | Gen0 | Allocated|
|-----------------------------|---------|----------------|-----------|---------|----------|
| Any | .NET 8.0| 1,437.9598 ns | 9.6949 ns | 0.0038 | 40 B |
| Any | .NET 9.0| 230.9345 ns | 0.4150 ns | - | - |
| All | .NET 8.0| 1,477.3997 ns | 9.7939 ns | 0.0038 | 40 B |
| All | .NET 9.0| 232.1564 ns | 1.5826 ns | - | - |
| Count | .NET 8.0| 1,501.5078 ns | 9.1631 ns | 0.0038 | 40 B |
| Count | .NET 9.0| 231.4915 ns | 0.0644 ns | - | - |
| First | .NET 8.0| 1,503.1155 ns | 17.4794 ns| 0.0038 | 40 B |
| First | .NET 9.0| 230.2307 ns | 1.3547 ns | - | - |
| Single | .NET 8.0| 1,451.3925 ns | 9.1552 ns | 0.0038 | 40 B |
| Single | .NET 9.0| 314.4627 ns | 2.0166 ns | - | - |
| Method | Runtime | Mean | Error | Gen0 | Allocated|
|-----------------------------|---------|-----------------|-----------|---------|----------|
| Chunk | .NET 8.0| 8.9276 ns | 0.1921 ns | 0.0086 | 72 B |
| Chunk | .NET 9.0| 4.1036 ns | 0.0094 ns | - | - |
| Distinct | .NET 8.0| 8.0208 ns | 0.1922 ns | 0.0076 | 64 B |
| Distinct | .NET 9.0| 0.6480 ns | 0.0188 ns | - | - |
| GroupJoin | .NET 8.0| 16.3851 ns | 0.3764 ns | 0.0172 | 144 B |
| GroupJoin | .NET 9.0| 1.6817 ns | 0.0213 ns | - | - |
| Join | .NET 8.0| 17.2811 ns | 0.3776 ns | 0.0201 | 168 B |
| Join | .NET 9.0| 1.6912 ns | 0.0040 ns | - | - |
| ToLookup | .NET 8.0| 23.7971 ns | 0.2659 ns | 0.0153 | 128 B |
| ToLookup | .NET 9.0| 0.8833 ns | 0.0142 ns | - | - |
| Reverse | .NET 8.0| 6.9804 ns | 0.0502 ns | 0.0057 | 48 B |
| Reverse | .NET 9.0| 0.8627 ns | 0.0108 ns | - | - |
| SelectIndex | .NET 8.0| 9.9599 ns | 0.2294 ns | 0.0086 | 72 B |
| SelectIndex | .NET 9.0| 0.2757 ns | 0.0181 ns | - | - |
| SelectMany | .NET 8.0| 9.3093 ns | 0.0563 ns | 0.0076 | 64 B |
| SelectMany | .NET 9.0| 0.3501 ns | 0.0231 ns | - | - |
| SkipWhile | .NET 8.0| 9.9857 ns | 0.2089 ns | 0.0086 | 72 B |
| SkipWhile | .NET 9.0| 0.9013 ns | 0.0204 ns | - | - |
| TakeWhile | .NET 8.0| 9.7549 ns | 0.2031 ns | 0.0086 | 72 B |
| TakeWhile | .NET 9.0| 0.8606 ns | 0.0209 ns | - | - |
| WhereIndex | .NET 8.0| 10.6662 ns | 0.2564 ns | 0.0096 | 80 B |
| WhereIndex | .NET 9.0| 0.8848 ns | 0.0181 ns | - | - |
| Method | Runtime | Mean | Error | Gen0 | Allocated|
|-------------------------------|---------|-----------------|-----------|---------|----------|
| DistinctFirst | .NET 8.0| 63.9871 ns | 0.7090 ns | 0.0391 | 328 B |
| DistinctFirst | .NET 9.0| 7.3621 ns | 0.0076 ns | - | - |
| AppendSelectLast | .NET 8.0| 3,676.9429 ns | 27.1412 ns| 0.0153 | 144 B |
| AppendSelectLast | .NET 9.0| 2.0945 ns | 0.0186 ns | - | - |
| RangeReverseCount | .NET 8.0| 8.5494 ns | 0.1063 ns | - | - |
| RangeReverseCount | .NET 9.0| 4.9830 ns | 0.0200 ns | - | - |
| DefaultIfEmptySelectElementAt | .NET 8.0| 3,646.2431 ns | 24.1907 ns| 0.0153 | 144 B |
| DefaultIfEmptySelectElementAt | .NET 9.0| 4.9853 ns | 0.0043 ns | - | - |
| ListSkipTakeElementAt | .NET 8.0| 4.7446 ns | 0.0286 ns | - | - |
| ListSkipTakeElementAt | .NET 9.0| 2.5036 ns | 0.0083 ns | - | - |
| RangeUnionFirst | .NET 8.0| 56.5613 ns | 0.6412 ns | 0.0411 | 344 B |
| RangeUnionFirst | .NET 9.0| 5.4611 ns | 0.0264 ns | - | - |
Interestingly, a 50% performance boost, while substantial, almost feels routine next to optimizations exceeding 1,000x.
The craziest part? No memory allocation at all.
But how were these incredible improvements achieved?
There have been many improvements, but one that stands out is the work on LINQ's internal Iterator<T> type. It's now used more efficiently: in many cases a single iterator instance is created where multiple iterators were previously needed.
Another key improvement in LINQ is the increased use of Spans in internal methods, resulting in better performance without triggering memory allocations.
You can also take advantage of this in your own code with the CollectionsMarshal.AsSpan method, which accesses the internal array backing a List<T> and lets you create a Span<T> over it directly.
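For example, here's a minimal sketch (my own example) of summing a List<int> through CollectionsMarshal.AsSpan; just keep in mind the span must not be used after the list is structurally modified:

using System.Runtime.InteropServices;

var numbers = new List<int> { 1, 2, 3, 4, 5 };

// AsSpan exposes the list's backing array as a Span<int> without copying.
Span<int> span = CollectionsMarshal.AsSpan(numbers);

var sum = 0;
foreach (var number in span)
{
    sum += number;
}

Console.WriteLine(sum); // 15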
Exceptions Performance Improvements
.NET 9 has also made significant performance improvements in handling exceptions.
Since exceptions are frequently used for flow control in large applications, which may throw millions of them, this enhancement can lead to a substantial performance boost.
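The benchmarked method isn't shown in this post; a minimal sketch of the kind of throw/catch it measures could look like this (the name BadMethod and its body are my assumptions):

[Benchmark]
public int BadMethod()
{
    try
    {
        // Report a failure by throwing, the pattern whose cost is being measured.
        throw new InvalidOperationException("Something went wrong");
    }
    catch (InvalidOperationException)
    {
        return -1;
    }
}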
Here are the results I got:
| Method | Runtime | Mean | Error | Gen0 | Allocated |
|------------|-----------|------------|-----------|--------|-----------|
| BadMethod | .NET 8.0 | 4.076 us | 0.0344 us | 0.0229 | 224 B |
| BadMethod | .NET 9.0 | 1.695 us | 0.0128 us | 0.0248 | 216 B |
NOTE: This is not an encouragement to use exceptions more than necessary, but rather a presentation of the improvements that have been made.
For an alternative approach, take a look at my blog on the Result Pattern.
P.S. If you're interested in learning how to conduct your own benchmarks, check out this blog post.
Lock Type
We now have a dedicated type for locking: instead of locking on a plain object, we can use the new System.Threading.Lock type.
private static readonly Lock Lock = new();
public void LockingInDotNet9()
{
    lock (Lock)
    {
        // PS: You can't use await inside!
    }
}
The lock statement now detects when the target is a Lock object. In such cases, it leverages the updated API instead of the traditional System.Threading.Monitor API.
Additionally, the compiler identifies scenarios where a Lock object is converted to another type, generating Monitor-based code accordingly.
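In other words, when the target is a Lock, the compiler lowers the lock statement to roughly the following EnterScope pattern (a simplified sketch, not the exact generated code):

public void LoweredLockingSketch()
{
    // Lock.EnterScope() returns a ref struct whose Dispose releases the lock.
    using (Lock.EnterScope())
    {
        // Critical section.
    }
}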
Time-Ordered GUIDs
Version 7 GUIDs can now be generated using the new Guid.CreateVersion7() and Guid.CreateVersion7(DateTimeOffset) methods.
Additionally, the Version property provides access to the version field of a GUID object.
// To learn more, check out: https://www.nikolatech.net/blogs/guids-vs-ulids
var guidV7 = Guid.CreateVersion7();
var guidV7WithTimestamp = Guid.CreateVersion7(DateTimeOffset.UtcNow);
01934991-207a-7f49-99f7-23fe10eb07d3
01934991-207a-7f5a-a673-71a48445b3ed
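The Version property mentioned above can be used to inspect the version field directly:

Console.WriteLine(guidV7.Version); // 7
Console.WriteLine(Guid.NewGuid().Version); // 4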
I’ve also covered this topic in a dedicated blog post, check it out!
Feature Switches
Feature switches are an excellent addition, enabling conditional inclusion or exclusion of functionality during builds.
This approach enhances app performance and reduces size, especially when leveraging trimming or Native AOT compilation.
- FeatureSwitchDefinitionAttribute: Treats a feature switch property as a constant during trimming. Code guarded by the switch is removed if the feature is disabled.
- FeatureGuardAttribute: Marks a feature-switch property as a safeguard for code annotated with RequiresUnreferencedCodeAttribute, RequiresAssemblyFilesAttribute or RequiresDynamicCodeAttribute. A sketch of this one follows the example below.
To define switches in your project, you need to update your .csproj file:
<ItemGroup>
    <RuntimeHostConfigurationOption Include="Feature.IsSupported" Value="true" Trim="true" />
</ItemGroup>
internal sealed class Feature
{
    [FeatureSwitchDefinition("Feature.IsSupported")]
    internal static bool IsSupported =>
        AppContext.TryGetSwitch("Feature.IsSupported", out bool isEnabled) ? isEnabled : true;

    internal static void Implementation() =>
        Console.WriteLine("Feature is supported");
}
if (Feature.IsSupported)
{
    Feature.Implementation();
}
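FeatureGuardAttribute works in a similar spirit. Here's a minimal sketch based on my reading of the attribute; the guarded API is purely illustrative:

// Attributes live in System.Diagnostics.CodeAnalysis.
internal static class DynamicFeatures
{
    // When dynamic code isn't supported (e.g. Native AOT), the trimmer treats this
    // guard as false and removes the guarded branch below along with its warnings.
    [FeatureGuard(typeof(RequiresDynamicCodeAttribute))]
    internal static bool IsReflectionEmitSupported =>
        System.Runtime.CompilerServices.RuntimeFeature.IsDynamicCodeSupported;

    [RequiresDynamicCode("Generates code at runtime via reflection emit.")]
    internal static void EmitCode()
    {
        // Reflection-emit based implementation...
    }
}

if (DynamicFeatures.IsReflectionEmitSupported)
{
    DynamicFeatures.EmitCode();
}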
This leads to several benefits:
- Reduce Application Size: Exclude unused code, resulting in smaller binaries.
- Enhance Performance: Minimize the application's footprint for faster load times and improved efficiency.
- Customize Builds: Tailor applications to include only necessary features for specific deployment scenarios.
Task.WhenEach
Task.WhenEach is definitely one of my favorite new features.
Now, we can conveniently handle tasks as soon as they finish. When tasks run independently, the most efficient approach is to start processing them immediately after completion.
WhenEach returns an IAsyncEnumerable of tasks, enabling you to use await foreach to handle each task as it completes.
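For the snippets below, assume a handful of independent tasks along these lines (the exact shape is my own example):

// Five tasks that finish at different, random times and return their index.
var tasks = Enumerable.Range(1, 5)
    .Select(i => Task.Run(async () =>
    {
        await Task.Delay(Random.Shared.Next(100, 1_000));
        return i;
    }))
    .ToList();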
Using WhenEach for this scenario is as simple as it gets:
// With .NET 9 utilizing Task.WhenEach really simplifies logic
await foreach (var task in Task.WhenEach(tasks))
{
    Console.WriteLine(await task);
}
Before .NET 9, you had to repeatedly call Task.WhenAny in a loop to retrieve the next completed task, or rely on third-party libraries. Constantly removing tasks and re-scanning the remaining ones, as you can imagine, wasn't an optimal approach for performance.
// Equivalent code before Task.WhenEach
while (tasks.Any())
{
    var readyTask = await Task.WhenAny(tasks);
    tasks.Remove(readyTask);

    Console.WriteLine(await readyTask);
}
Json Schema Exporter
The JsonSchemaExporter class enables you to generate JSON schema documents from .NET types using a JsonSerializerOptions or JsonTypeInfo instance.
The resulting schema specifies the JSON serialization contract for the .NET type, detailing the structure of the data that can be serialized and deserialized.
You can customize the schema output by configuring the JsonSerializerOptions or JsonTypeInfo instance passed to the GetJsonSchemaAsNode method.
var schema = JsonSchemaExporter.GetJsonSchemaAsNode(
    JsonSerializerOptions.Web, typeof(Order));

Console.WriteLine(schema.ToJsonString());
public class Order
{
    public Guid Id { get; set; }
    public string SerialNumber { get; set; }
    public DateTime CreatedAt { get; set; }
    public DateTime ModifiedAt { get; set; }
}
{
  "type": [
    "object",
    "null"
  ],
  "properties": {
    "id": {
      "type": "string",
      "format": "uuid"
    },
    "serialNumber": {
      "type": "string"
    },
    "createdAt": {
      "type": "string",
      "format": "date-time"
    },
    "modifiedAt": {
      "type": "string",
      "format": "date-time"
    }
  }
}
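The exporter also accepts a JsonSchemaExporterOptions instance. As far as I know, setting TreatNullObliviousAsNonNullable removes the "null" entry from the root type above; a quick sketch:

var exporterOptions = new JsonSchemaExporterOptions
{
    TreatNullObliviousAsNonNullable = true
};

var schema = JsonSchemaExporter.GetJsonSchemaAsNode(
    JsonSerializerOptions.Web, typeof(Order), exporterOptions);

// The root "type" is now just "object" instead of ["object", "null"].
Console.WriteLine(schema.ToJsonString());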
Params
Starting with .NET 9 and C# 13, the params modifier is no longer limited to array types.
void ProcessOrders(params IEnumerable<Order> orders)
{
    // Processing orders...
}
You can now use params with any recognized collection type, including Span<T>, ReadOnlySpan<T>, interfaces such as IEnumerable<T>, and concrete collection types that implement IEnumerable<T> and expose an Add method.
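For example, here's a small sketch of my own using params ReadOnlySpan<T>, which keeps the familiar call-site syntax while avoiding the params array allocation:

static int SumAll(params ReadOnlySpan<int> values)
{
    var sum = 0;
    foreach (var value in values)
    {
        sum += value;
    }

    return sum;
}

// Call sites look exactly like they did with params int[]:
Console.WriteLine(SumAll(1, 2, 3)); // 6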
Conclusion
In conclusion, .NET 9 introduces significant improvements in both performance and functionality.
The optimizations in LINQ, such as faster iteration and reduced memory allocations, greatly enhance the performance of common operations.
Additionally, improvements in exception handling and the introduction of a new Lock type further enhance efficiency.
The ability to generate time-ordered GUIDs, use feature switches, leverage Task.WhenEach and more certainly makes migrating to .NET 9 worthwhile, even though it's an STS release.
If you want to conduct additional testing or check out other enhancements, you can find the source code here:
Source Code

I hope you enjoyed it. Subscribe and get a notification when a new blog is up!
