Unreal Horde vs. Traditional CI/CD: Optimizing for Unreal Engine Development
By Maxime Roth Fessler
As game development scales in complexity, Continuous Integration and Continuous Delivery (CI/CD) pipelines have become central to maintaining productivity. Conventional platforms such as Jenkins, GitHub Actions, GitLab CI, or CircleCI offer flexibility and maturity for typical software delivery. However, their architectures are not optimized for the specific computational and data demands of Unreal Engine projects. Compiling millions of lines of C++ code and processing terabytes of binary assets requires an orchestration layer purpose-built for this environment.
Unreal Horde, developed by Epic Games, extends CI/CD beyond the general-purpose model. It introduces Granular Parallelization, Native Caching, and Deep Build Graph integration, three design principles that align with the technical and operational realities of Unreal Engine development. The following sections examine each of these principles in detail, highlighting how they address the performance, scalability, and maintenance challenges that traditional CI/CD tools face in large Unreal Engine environments.
Granular Parallelization for Build Speed
The primary technical challenge in Unreal Engine development is the sheer computational intensity of the build process, particularly the enormous C++ compilation phase. The rate at which the pipeline completes this process directly governs iteration speed and developer efficiency. Traditional CI tools eventually hit a hard performance ceiling, while Horde scales out across additional hardware, turning added capacity into near-linear reductions in build time.
The Bottleneck of Job-Level Parallelization
Traditional CI/CD systems such as Jenkins are built around Job-Level Parallelization. They can effectively distribute independent, self-contained tasks (for example, Run Unit Tests or Build iOS Client) to separate Build Agents. However, when confronted with a single, monolithic process like compiling the engine and game code, a generic tool can only assign the job to one high-spec machine. That machine quickly becomes a bottleneck, leaving hundreds of available CPU cores idle and developers waiting for builds to finish.

Jenkins uses job-level parallelization, where each agent runs an entire stage (Build, Test, or Deploy).
Horde’s Breakthrough: Task-Level Distribution
Horde eliminates this inefficiency by operating at a Task-Level granularity, made possible through its native understanding of Unreal Engine’s Build Graph system.
- Build Graph Decomposition: When a build begins, Horde reads the Build Graph script and decomposes it into thousands of small, interdependent tasks whose dependency order is explicit, so every task without unmet dependencies can run concurrently.
- Distributed Execution: Instead of assigning an entire job to a single agent, Horde distributes these atomic tasks across a scalable pool of Horde Agents, leveraging the elastic capacity of AWS. Independent C++ modules, for example, can be compiled simultaneously across hundreds of machines.
Rather than one server performing the entire build, a hundred servers can each process a fraction of the workload in parallel. This granular parallelization fully utilizes available compute power, reducing multi-hour builds to a fraction of their former duration. For large Unreal Engine projects, it becomes a decisive architectural advantage by removing wait times, shortening feedback loops, and sustaining the development team’s momentum.
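The decomposition described above can be illustrated with a toy scheduler. The task names and dependency graph below are hypothetical, and the wave-based scheduling is a simplification of what Horde actually does, but it shows the core idea: once dependencies are explicit, every task in a "wave" can run on a different agent at the same time.

```python
from graphlib import TopologicalSorter

# Hypothetical build graph: each task maps to the tasks it depends on.
# Horde derives the real graph from the project's BuildGraph XML script.
build_graph = {
    "CompileCore": [],
    "CompileRenderer": [],
    "CompileGameplay": ["CompileCore"],
    "LinkEditor": ["CompileCore", "CompileRenderer", "CompileGameplay"],
    "CookAssets": ["LinkEditor"],
    "PackageClient": ["CookAssets"],
}

def schedule_waves(graph):
    """Group tasks into waves; every task within a wave has all of its
    dependencies satisfied and could run on a separate agent in parallel."""
    ts = TopologicalSorter(graph)
    ts.prepare()
    waves = []
    while ts.is_active():
        ready = sorted(ts.get_ready())  # all tasks whose deps are complete
        waves.append(ready)
        for task in ready:
            ts.done(task)
    return waves

for i, wave in enumerate(schedule_waves(build_graph), 1):
    print(f"Wave {i}: {wave}")
```

With this graph, `CompileCore` and `CompileRenderer` land in the first wave and run simultaneously, whereas a job-level system would serialize the whole sequence on one agent.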

Unreal Horde distributes a single large job into many smaller tasks. These tasks are dynamically assigned across multiple agents to maximize efficiency and reduce build times.
Managing Large Assets Efficiently (Caching and Monorepos)
A major challenge for general-purpose CI/CD systems in game development is managing data volume. Modern Unreal Engine projects operate with multi-terabyte monorepos containing extensive binary assets, textures, and cache files. In such environments, maintaining efficient data synchronization and artifact management is essential to sustain fast build times.
The Challenge of Redundant Processing
Traditional CI/CD systems such as Jenkins often process the same data multiple times across different build agents. Each agent independently downloads, compiles, and processes the same files, lacking a unified caching layer. This redundancy increases network traffic, storage I/O, and compute usage (particularly problematic when handling gigabytes of cooked game assets).
Horde’s Integrated Approach: Content Addressable Storage (CAS) / Zen
Horde addresses this limitation through its native Content Addressable Storage (CAS), also known as Horde Storage or Zen, a system designed specifically for Unreal Engine’s scale and data characteristics.
- Centralized Artifact Repository: Each compiled code object, cooked asset, or cache file is assigned a unique cryptographic hash and stored in the central CAS repository.
- Elimination of Redundant Work: Once an artifact has been built by any Horde Agent, it becomes instantly available to all others. Subsequent builds simply fetch the cached version instead of reprocessing it.
This model minimizes unnecessary network and compute activity, ensuring consistent performance even as project size grows. When deployed on AWS, CAS benefits from services like Amazon S3 for durable, cost-effective storage, providing a native caching framework that general-purpose CI tools would require significant custom engineering to approximate.
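A minimal in-memory sketch of this caching model is shown below. The `BuildCache` class and its stand-in "compilation" step are illustrative only; Horde's CAS/Zen applies the same principle, keying artifacts by a hash of their inputs so that work done by one agent is reusable by all others, with durable backends such as Amazon S3.

```python
import hashlib

class BuildCache:
    """Toy content-addressed build cache: the key is a hash of a task's
    inputs, so identical work is performed exactly once."""

    def __init__(self):
        self._artifacts = {}
        self.builds = 0  # counts actual (non-cached) compilations

    @staticmethod
    def key_for(*inputs: bytes) -> str:
        h = hashlib.sha256()
        for blob in inputs:
            h.update(hashlib.sha256(blob).digest())
        return h.hexdigest()

    def compile(self, source: bytes, flags: bytes) -> bytes:
        key = self.key_for(source, flags)
        if key not in self._artifacts:
            self.builds += 1                        # cache miss: do the work
            self._artifacts[key] = b"obj:" + source  # stand-in for compilation
        return self._artifacts[key]                  # cache hit: just fetch

cache = BuildCache()
cache.compile(b"int main(){}", b"-O2")  # first agent builds the artifact
cache.compile(b"int main(){}", b"-O2")  # second agent reuses it
print(cache.builds)  # 1
```

Because the key covers every input (source and flags here), changing any input produces a new key and forces a rebuild, while unchanged inputs always hit the cache.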
Maintenance Overhead
Replicating Horde’s caching efficiency in other CI/CD platforms typically requires a complex stack of additional components: network file shares (such as NFS), custom caching proxy servers, or paid third-party plugins. These integrations add latency, introduce potential points of failure, and require ongoing maintenance. Over time, the cumulative complexity and operational cost make it difficult for a general-purpose CI tool to manage the scale and data patterns of modern Unreal Engine projects effectively.
Native Integration for Reduced Maintenance and Greater Control
One of Horde’s most significant advantages is the reduction in ongoing maintenance effort required to support Unreal Engine development. By aligning natively with Unreal’s ecosystem, Horde simplifies operations and allows engineering teams to focus on building features rather than maintaining infrastructure.
The Build Graph Translation Overhead
CI/CD platforms such as Jenkins have no native understanding of Unreal’s Build Graph scripts, the XML-based recipes that define how projects are compiled, packaged, and deployed. To bridge the gap, engineers often create intermediary wrapper scripts that translate between the CI tool’s general-purpose command model and Unreal Automation Tool (UAT) and Unreal Build Tool (UBT) operations. This middle layer is fragile, requires updates whenever the engine version changes, and adds ongoing maintenance overhead.
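As a sketch of what such a wrapper layer looks like, the helper below assembles the `RunUAT.bat BuildGraph` command line a generic CI job would shell out to. The engine path, script, and target shown are illustrative examples, and a real wrapper would also handle logging, retries, and error parsing, which is precisely the fragile glue code the text describes.

```python
import subprocess
from pathlib import Path

def buildgraph_command(engine_root, script, target, options):
    """Assemble the RunUAT command line a generic CI job would execute
    to run a BuildGraph target. Paths and option names are illustrative."""
    uat = Path(engine_root) / "Engine" / "Build" / "BatchFiles" / "RunUAT.bat"
    cmd = [str(uat), "BuildGraph", f"-Script={script}", f"-Target={target}"]
    # BuildGraph options are passed as -set:Name=Value pairs.
    cmd += [f"-set:{name}={value}" for name, value in options.items()]
    return cmd

cmd = buildgraph_command(
    engine_root=r"C:\UE",
    script=r"Engine\Build\InstalledEngineBuild.xml",
    target="Make Installed Build Win64",
    options={"WithDDC": "false"},
)
# A Jenkins or GitLab job stage would then invoke:
# subprocess.run(cmd, check=True)
```

Every engine upgrade that changes script paths, targets, or option names forces a matching update to wrappers like this one; Horde avoids the layer entirely by executing Build Graph scripts itself.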
Horde’s Native Execution Advantage
Horde eliminates the need for this translation entirely. Developed by Epic Games, it interprets and executes Build Graph scripts directly, understanding all dependencies, relationships, and execution rules without intermediary scripting.
- Direct Execution: Horde runs Build Graph instructions natively, ensuring full compatibility and minimal setup.
- Minimal Maintenance Debt: Its tight integration with Unreal keeps pipelines stable across engine updates, largely removing the need for continuous adjustments.
This native alignment shifts engineering effort away from maintaining CI/CD plumbing toward optimizing build performance and scalability. The result is a cleaner, more predictable pipeline that supports faster iteration and consistent delivery across projects.
Summary Comparison Table
Traditional CI platforms remain highly effective for standard software delivery pipelines. However, the computational and data intensity of Unreal Engine development benefits from a system architected around the engine itself. Horde’s native integration, distributed architecture, and caching capabilities deliver measurable gains in build performance, maintainability, and scalability, particularly when deployed on AWS.
| Category | General-Purpose CI/CD (Jenkins, GitLab CI, CircleCI) | Unreal Horde (Purpose-Built for UE) |
| --- | --- | --- |
| Parallelization Model | Job-level (entire jobs per agent) | Task-level (fine-grained Build Graph tasks) |
| Scalability | Limited by available agent capacity | Horizontally scales across hundreds of agents |
| Caching & Artifact Management | Requires external plugins or shared drives | Native Content Addressable Storage (CAS/Zen) |
| Unreal Build Graph Support | Requires wrapper scripts and manual integration | Native interpretation and execution |
| Maintenance Overhead | High (due to scripting and plugin dependencies) | Minimal (engine-native compatibility) |
| Build Speed | Slower for large C++ and asset builds | Significantly faster through distributed execution |
| Best Suited For | General software development | Large-scale Unreal Engine projects |
Conclusion
Selecting the right CI/CD platform depends on the nature of the workload. Jenkins offers proven versatility for multi-language, multi-stack projects. For Unreal Engine, however, where build complexity, asset volume, and iteration speed define competitiveness, Unreal Horde provides a specialized, scalable, and maintainable foundation.
When combined with the elasticity of AWS infrastructure, Horde enables game studios to maintain continuous delivery performance at scale, supporting larger teams, faster feedback loops, and smoother release cycles for ambitious Unreal Engine titles.
To explore the financial and operational advantages of running Horde on AWS, refer to the companion article Modernizing Unreal Horde CI/CD: Moving from On-Premise Infrastructure to AWS.