
CLR · Mono · IL2CPP · NativeAOT — Comparing the Runtime Branches

TL;DR — Key Takeaways
  • .NET runtimes broadly split into JIT (CLR · CoreCLR · Mono) and AOT (IL2CPP · NativeAOT · Mono Full AOT) families
  • AOT runtimes have advantages in startup time and deployment size, but break features like Reflection.Emit, dynamic generic instantiation, and Expression.Compile
  • The constraints game developers hit with IL2CPP stem from deliberate runtime design decisions — they are not Unity-specific problems

Introduction: Same IL, Different Fate

The previous two episodes established that the Runtime layer of the .NET stack is divided into multiple implementations. This episode examines those implementations one by one.

At this point in time, the five runtimes with practical relevance are:

| Runtime | Belongs to | Introduced | Status |
| --- | --- | --- | --- |
| CLR | .NET Framework | 2002 | Frozen (4.8.1) |
| CoreCLR | .NET 5+ | 2016 (Core 1.0) | Active |
| Mono | Xamarin · Unity | 2004 | Active (Unity fork) |
| IL2CPP | Unity | 2014 | Active |
| NativeAOT | .NET 7+ | 2022 | Active |

Even when you write the same C# code, which runtime it runs on dramatically affects performance, memory, available APIs, and deployment size. The goal of this episode is to distill those differences into practical selection criteria.

Five runtimes may look complex, but a single axis brings most of it into focus: JIT vs AOT.


Part 1. The Single Axis — JIT vs AOT

Episode 1 mentioned that there are two points in time when IL can be translated into native code. This translation timing is the decisive factor that splits the five runtimes into two camps.

[Figure: 5 .NET Runtimes — JIT/AOT classification. IL (intermediate bytecode) feeds two families. JIT family (translate at runtime): CLR (.NET Framework), CoreCLR (.NET 5+), Mono (Unity default). AOT family (translate at build time): IL2CPP (Unity iOS/WebGL), NativeAOT (.NET 7+), Mono Full AOT (Xamarin iOS).]

JIT Family — CLR · CoreCLR · Mono

JIT (Just-In-Time) translates IL to native code on the machine the app is running on, at the moment it runs. The trade-offs are as follows.

Advantages

  • Can pull hardware information from the actual machine and optimize accordingly
  • Supports follow-up re-optimization using runtime execution statistics (Tiered Compilation, PGO)
  • Runtime code generation APIs like Reflection.Emit·Expression.Compile work

Disadvantages

  • JIT cost is paid at startup (slow Cold Start)
  • The runtime must be installed on the target machine
  • The JIT itself consumes memory and CPU

AOT Family — IL2CPP · NativeAOT · Mono Full AOT

AOT (Ahead-Of-Time) translates IL to native code before the app is deployed, on the developer’s build machine.

Advantages

  • Extremely fast Cold Start — the translation cost has already been paid
  • The only option on platforms that do not allow JIT (iOS, consoles, WebAssembly)
  • No runtime installation required at deployment (NativeAOT ships as a single binary)

Disadvantages

  • Cannot create new code at runtime → Reflection.Emit breaks
  • Dynamic generic instantiation is restricted → cannot create a new List<MyRuntimeType> at runtime
  • Longer build times — all IL is translated upfront
  • All generic instantiations must be pre-generated → larger deployment binary
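A minimal sketch of the kind of runtime code generation that the JIT family supports and the AOT family cannot: emitting raw IL into a `DynamicMethod` and asking the runtime to turn it into callable native code (the method name `Add` is illustrative).

```csharp
using System;
using System.Reflection.Emit;

// Build an 'int Add(int, int)' method at runtime by emitting raw IL.
// On CLR/CoreCLR/Mono the JIT translates this IL on the spot; under
// IL2CPP or NativeAOT there is no JIT to receive it, so this fails.
var add = new DynamicMethod("Add", typeof(int), new[] { typeof(int), typeof(int) });
var il = add.GetILGenerator();
il.Emit(OpCodes.Ldarg_0);
il.Emit(OpCodes.Ldarg_1);
il.Emit(OpCodes.Add);
il.Emit(OpCodes.Ret);

var addDelegate = (Func<int, int, int>)add.CreateDelegate(typeof(Func<int, int, int>));
Console.WriteLine(addDelegate(2, 3)); // prints 5 on a JIT runtime
```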

This single table is the foundation for every comparison that follows.


Part 2. The Five Runtimes

CLR — The .NET Framework Runtime

  • Released: 2002
  • Platform: Windows only
  • Compilation: JIT
  • Status: Frozen. .NET Framework 4.8.1 (2022) was the final release
  • Notable: Tightly coupled with Windows-only upper frameworks such as WPF, WinForms, and WCF

There is no reason to choose CLR for new development. It is only relevant for maintaining legacy systems.

CoreCLR — The Main Runtime of Modern .NET

  • Released: 2016 (.NET Core 1.0), absorbed into .NET 5+ from 2020
  • Platform: Windows · Linux · macOS · FreeBSD
  • Compilation: Tiered JIT (Tier 0 fast initial translation → Tier 1 optimized recompilation)
  • Notable: Supports PGO (Profile-Guided Optimization), aggressively optimizing hot code using execution statistics

CoreCLR is a runtime that mitigates the JIT disadvantage (startup cost) with Tiered Compilation. It performs only a fast Tier 0 translation at startup, then later recompiles frequently called hot code to Tier 1. (Microsoft Learn — CLR overview)
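Tiered Compilation and dynamic PGO are on by default in modern .NET, but they can be tuned per project. A sketch of the relevant MSBuild properties (an illustrative csproj fragment; the values shown match the defaults as I understand them):

```xml
<!-- Illustrative csproj fragment: MSBuild properties controlling the
     Tiered JIT behavior described above. -->
<PropertyGroup>
  <!-- Master switch for Tier 0 → Tier 1 recompilation -->
  <TieredCompilation>true</TieredCompilation>
  <!-- Dynamic PGO: feed runtime execution statistics into Tier 1 optimization -->
  <TieredPGO>true</TieredPGO>
</PropertyGroup>
```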

It is the default runtime for server, web, and desktop workloads — and the most actively evolving one.

Mono — The Original Cross-Platform Runtime

  • Released: 2004
  • Platform: Windows · Linux · macOS · iOS · Android · WebAssembly
  • Compilation: JIT by default, Full AOT mode also available (for environments where JIT is forbidden, such as iOS)
  • Notable: Small footprint. Well-suited for mobile, embedded, and game engines

As seen in episode 2, Mono started as an external open-source project and eventually became a Microsoft-official implementation. In 2024, Microsoft transferred ownership to WineHQ and the upstream moved into maintenance mode, but Unity runs its own fork.

In Unity, selecting the Mono scripting backend runs the editor and desktop builds on this runtime.

IL2CPP — Unity’s AOT Pipeline

  • Released: 2014
  • Platform: iOS · WebGL · consoles (PS5 · Xbox · Switch) · Android · Windows · macOS
  • Compilation: AOT only. Converts IL to C++ code, then produces a native binary using the platform’s C++ toolchain (Xcode · Emscripten · console SDK)
  • Notable: Reflection.Emit is forbidden, generic instantiation is restricted, build times increase

The reason IL2CPP exists can be summarized in one sentence: “Because iOS, WebGL, and consoles do not allow JIT, and Unity chose to solve the performance and constraint problems that Mono Full AOT could not address with its own AOT pipeline.” (Unity Manual — IL2CPP overview) The internal workings are explained with real examples in Unity’s own “An introduction to IL2CPP internals” blog series.

NativeAOT — Microsoft’s Server and Cloud AOT

  • Released: 2022 (.NET 7: console apps and library support — .NET Blog, “Announcing .NET 7”, 2022.11.08); expanded in 2023 (.NET 8: broader ASP.NET Core support)
  • Platform: Windows · Linux · macOS · iOS (experimental) · Android (experimental)
  • Compilation: AOT only. Compiles IL directly to native code (no C++ intermediary)
  • Notable: Deployable as a single native binary, no runtime installation required, extremely fast startup

NativeAOT targets containers, serverless, and CLI tools. The motivation differs from why game developers use IL2CPP (because the platform forbids JIT). The journey from experimental to official release is detailed in “Announcing .NET 7 Preview 3”. (Microsoft Learn — Native AOT deployment)


Part 3. Runtime Comparison Matrix

The five runtimes compared side by side along the same axes.

| Axis | CLR | CoreCLR | Mono | IL2CPP | NativeAOT |
| --- | --- | --- | --- | --- | --- |
| Compilation | JIT | Tiered JIT | JIT (+Full AOT option) | AOT only | AOT only |
| Cross-platform | Windows | Win/Lin/Mac | Wide | All Unity-supported platforms | Win/Lin/Mac |
| Cold Start | Slow | Medium (Tier 0 is fast) | Medium | Fast | Fastest |
| Re-optimization at runtime | None | Yes (PGO) | Limited | None | None |
| Reflection.Emit | O | O | O | X | X |
| Expression.Compile | O | O | O | Interpreted mode | Interpreted mode |
| Dynamic generic instantiation | O | O | O | Restricted | Restricted |
| Runtime install required | O | O (or self-contained) | O | X (bundled in engine) | X |
| Deployment size | Small (runtime separate) | Medium | Medium | Large (includes engine) | Medium |
| Build time | Fast | Fast | Fast | Very slow | Slow |
| Primary use | Legacy Windows | Server · web · desktop | Unity editor · desktop | Unity mobile · console | Serverless · CLI |

Three Things to Take Away from This Table

① The two AOT runtimes (IL2CPP and NativeAOT) share the same constraints. Reflection.Emit, Expression.Compile, and dynamic generics are all features that depend on JIT. In an AOT environment there is fundamentally no compiler present to translate newly generated IL at runtime.

② Cold Start overwhelmingly favors AOT. JIT is forbidden on iOS for security reasons (the W^X memory principle), but AOT’s fast startup is also a decisive advantage for serverless and CLI tools: an AOT-compiled tool no longer pays hundreds of milliseconds of JIT cost on every invocation.

③ CoreCLR’s Tiered JIT is a middle ground. It cannot eliminate JIT cost entirely, but it “avoids the worst and pursues the best” by translating quickly at Tier 0 and only re-optimizing frequently called code at Tier 1. This is why CoreCLR remains the default for server and web workloads.


Part 4. The IL2CPP Build Pipeline in Detail

The description “IL2CPP converts IL to C++ and then compiles to native” can sound abstract. Here is the actual build pipeline visualized.

[Figure: IL2CPP build pipeline — from IL to native binary. 1. Authoring: C# source (.cs) → 2. IL compile: Roslyn (C# → IL) → 3. IL artifact: .NET assemblies → 4. C++ conversion (Unity): il2cpp.exe (IL → C++) → 5. Native build: platform toolchain (Xcode / Emscripten / console SDK).]

Why Insert C++ in the Middle

A compiler that goes directly from IL to native code is theoretically possible — that is exactly what NativeAOT does. Yet Unity chose the two-step IL → C++ → native path. The rationale is explained with real generated C++ examples in Unity’s own “IL2CPP Internals: A tour of generated code” blog post. The summary is as follows.

① Reuse of platform-specific C++ toolchains. iOS uses Xcode’s LLVM, WebGL uses Emscripten, consoles use each manufacturer’s SDK, and Android uses the NDK — every platform already has a top-tier, highly optimized C++ compiler. By converting IL only as far as C++, the remaining optimization is handled by the platform toolchain. Achieving the same quality without this approach would have required Unity to develop and maintain a separate backend for every platform.

② Access to platform-specific features. The C++ intermediary integrates naturally with each platform’s native libraries and SDKs. Building a direct-to-native AOT compiler would have made this integration far more complex.

③ Debuggability. When a runtime crash occurs in an IL2CPP build, the generated C++ code can be read, which is far easier to trace than pure binary output.


Part 5. Five Constraints of AOT Environments

These are the key constraints of NativeAOT as stated in the official Microsoft documentation. IL2CPP shares most of the same constraints. (Microsoft Learn — Native AOT limitations)

① Reflection.Emit Is Forbidden

Symptom: Code that dynamically creates methods or types at runtime using System.Reflection.Emit does not execute.

Cause: An AOT environment has no JIT to receive IL and translate it to native code at runtime. Emit is an API for producing IL, but without a translator to receive it, it cannot function.

Impact: Many serialization libraries (some paths in legacy Newtonsoft.Json), fast proxy generation (Castle DynamicProxy), and dynamic constructor injection in DI containers either break or slow down.

Alternative: Source Generator. Generating the required code at compile time eliminates the need for runtime Emit. System.Text.Json has moved in this direction and is AOT-friendly.
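As a sketch of that direction, the System.Text.Json source generator routes serialization through a context class whose code is generated at compile time, so no Reflection.Emit is needed at runtime (`Person` and `PersonContext` are illustrative names; this requires the source generator to run at build time):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Serialize through the compile-time-generated metadata instead of
// runtime reflection — the AOT-friendly path.
var json = JsonSerializer.Serialize(new Person("Ada", 36), PersonContext.Default.Person);
Console.WriteLine(json);

public record Person(string Name, int Age);

// The [JsonSerializable] attribute tells the source generator which types
// to pre-generate serialization code for.
[JsonSerializable(typeof(Person))]
public partial class PersonContext : JsonSerializerContext { }
```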

Expression.Compile Falls Back to Interpreted Mode

Symptom: LINQ queries or Expression<Func<T>>.Compile() execute in interpreter mode. They are not as fast as compiled native code.

Cause: Expression compilation works by generating IL at runtime and JIT-compiling it — impossible in an AOT environment.

Impact: Performance may degrade for ORM (some EF Core paths) and repeatedly invoked LINQ-to-Expression code.

Alternative: Pre-convert frequently executed expressions to delegates ahead of time. Alternatively, evaluate Source Generator-based replacement libraries.
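The “compile once, reuse the delegate” pattern looks like this minimal sketch — the point is that `Compile()` runs once, not per call, regardless of whether it produced JIT-compiled code or an interpreter-backed delegate:

```csharp
using System;
using System.Linq.Expressions;

// On a JIT runtime, Compile() generates IL and JIT-compiles it to native
// code. Under AOT the same call falls back to a slower interpreter, so
// caching the resulting delegate matters even more there.
Expression<Func<int, int>> square = x => x * x;
Func<int, int> compiled = square.Compile(); // cache this; never recompile per call

Console.WriteLine(compiled(7)); // prints 49
```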

③ Dynamic Generic Instantiation Is Restricted

Symptom: Constructing generic combinations at runtime that do not appear in the source code — such as typeof(List<>).MakeGenericType(runtimeType) — fails at runtime.

Cause: The AOT compiler pre-generates all generic instances at build time. Combinations that did not exist at build time have no corresponding native code.

Impact: The common pattern of optimizing a Dictionary<string, object> to Dictionary<string, RuntimeType> based on a runtime type breaks.

Alternative: Use the generic combination explicitly at build time at least once (a “hint” such as _ = new List<MyType>()) or work around it with a non-generic version.
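A minimal sketch of both halves — the dynamic construction that AOT restricts, and the “hint” workaround (the `DateTime` combination here is illustrative):

```csharp
using System;
using System.Collections.Generic;

// Close the open generic List<> over a Type discovered at runtime.
// Works on JIT runtimes; under IL2CPP/NativeAOT it only works if native
// code for this exact combination was generated at build time.
Type open = typeof(List<>);
Type closed = open.MakeGenericType(typeof(DateTime));
object list = Activator.CreateInstance(closed)!;
Console.WriteLine(list.GetType()); // List<DateTime>

// The "hint" workaround: mention the combination once in compiled code so
// the AOT compiler sees it and pre-generates the instantiation.
_ = new List<DateTime>();
```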

④ Reflection Interacts with the Trimmer

Symptom: String-based reflection such as Type.GetMethod("SomeMethod") fails unexpectedly — because the trimmer determined that the method was unused and removed it.

Cause: AOT deployment requires trimming: unused code is removed from the build output to reduce binary size, but string-based references cannot be statically analyzed, so the trimmer cannot see them.

Impact: Many older libraries throw runtime errors in AOT builds.

Alternative: Use the DynamicDependency attribute to hint the trimmer, or remove the reflection with a Source Generator.
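A sketch of the attribute-based hint (the `Helper`/`Invoker` names are illustrative; the attribute’s effect is only observable in a trimmed publish, where it keeps `Helper.Greet` alive despite there being no statically visible call site):

```csharp
using System;
using System.Diagnostics.CodeAnalysis;

// String-based reflection the trimmer cannot see through. The
// [DynamicDependency] attribute on the calling member declares the hidden
// dependency explicitly, so the trimmer preserves Helper.Greet.
Console.WriteLine(Invoker.CallGreet());

static class Invoker
{
    [DynamicDependency("Greet", typeof(Helper))]
    public static object? CallGreet() =>
        typeof(Helper).GetMethod("Greet")!.Invoke(null, null);
}

static class Helper
{
    public static string Greet() => "hello";
}
```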

⑤ Deployment Binary Size Increases

Symptom: An AOT build bundles all generic instances, runtime libraries, and dependencies into a single binary, making the file larger than a framework-dependent JIT build.

Cause: “Self-contained” is the default. Instead of requiring a runtime installation, the app carries everything with it.

Impact: Larger mobile app install size, larger container image size, longer deployment times.

Alternative: Aggressive trimming, PublishTrimmed=true, and disabling unnecessary feature flags.
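A sketch of what those knobs look like in a csproj for a NativeAOT publish (an illustrative fragment; `OptimizationPreference` and `StackTraceSupport` are size-oriented NativeAOT settings, and `InvariantGlobalization` drops ICU data):

```xml
<!-- Illustrative csproj fragment for a size-conscious NativeAOT publish -->
<PropertyGroup>
  <PublishAot>true</PublishAot>
  <PublishTrimmed>true</PublishTrimmed>
  <!-- Prefer smaller code over faster code in the AOT compiler -->
  <OptimizationPreference>Size</OptimizationPreference>
  <!-- Drop ICU globalization data from the binary -->
  <InvariantGlobalization>true</InvariantGlobalization>
  <!-- Give up rich stack traces in exchange for size -->
  <StackTraceSupport>false</StackTraceSupport>
</PropertyGroup>
```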


Part 6. Runtime Decision Guide

A concise decision tree for which runtime to choose by project type.

Building a server or web API → CoreCLR (.NET 8+). For high-load, low-latency, or fast-deployment requirements, consider NativeAOT — but always verify AOT constraints first.

Building a CLI tool or serverless function → NativeAOT. Cold Start is critical and the dependency footprint is small enough to accept AOT constraints.

Building a game with Unity → Editor and desktop builds use Mono. iOS, WebGL, and console builds use IL2CPP (mandatory). Desktop builds can also switch to IL2CPP for a performance boost.

Starting a new Windows desktop app → CoreCLR + WPF/WinForms on .NET 8+. Avoid CLR (.NET Framework).

Maintaining a legacy .NET Framework system → CLR. Plan a gradual migration of new feature development to .NET 8+.

Building a mobile app (non-Unity) → After Xamarin’s end of support in 2024, .NET MAUI is the official path. Internally it uses a mix of Mono and NativeAOT.


Summary

Four lines that capture the core of this episode.

  1. .NET runtimes split into JIT and AOT families, and this single axis determines most of the performance characteristics, constraints, and deployment size.
  2. AOT constraints are consequences of runtime design, not engine quirks. The breakage of Reflection.Emit, dynamic generics, and Expression.Compile follows from having no JIT at runtime — and it is common to both IL2CPP and NativeAOT.
  3. The reason IL2CPP takes the IL → C++ → native two-step is to reuse the high-level optimization already built into each platform’s C++ toolchain.
  4. The constraints game programmers encounter in Unity (Reflection.Emit, generic pitfalls, trimmer issues) are the inevitable consequence of runtime design, and compile-time metaprogramming such as Source Generators is the modern solution.

Closing the Foundation Series

Across three episodes we explored .NET’s map (ep. 1) → history (ep. 2) → runtime branches (ep. 3). These three episodes serve as the common coordinate system for every C# series that follows.

The next series is the async series (6 episodes). The JIT/AOT context covered today connects naturally to why UniTask is better suited to Unity than Task, how async/await is transformed under IL2CPP, and why Source Generators that avoid Reflection.Emit matter.



This post is licensed under CC BY 4.0 by the author.