Producing Events & Videos in Virtual Sets

With the combination of Unreal Engine 5 and Aximmetry, we can revolutionize the way we create and present virtual content. In this article, we explain the key terms, show you how we approach productions and which tools we use.


Preparing a Production in Virtual or Mixed Reality Sets

We want to inspire our clients and their audiences with unique and impressive content. And with Unreal Engine 5 and Aximmetry, we can continually push the boundaries of what’s possible in virtual production. Virtual production is more than just a trend - in many ways, it’s the future of media production, and we’re ready to shape that future.

Once the conception phase of an event or film production is complete, we proceed in the creation phase as follows:

Step 1: Modeling and Texturing in Blender

Our process begins in Blender, an open-source 3D software where we create our 3D models and textures. Blender gives us the freedom and flexibility we need to bring our ideas to life. We can create shapes and structures that match our vision exactly. After creating our model, we bring it to life with textures that add detail. These textures can simulate colors, patterns and even the feel of a specific material.

An important aspect of working with Blender is optimizing models for real-time use. Unlike offline renderings, where computation time doesn’t matter, virtual sets must be displayed in real time. This means the polygon count must strike a balance between detail and performance. We use techniques like Level of Detail (LOD), where distant objects are rendered with fewer polygons while foreground elements retain full fidelity.
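The LOD principle can be sketched as a simple distance-based lookup. This is an illustrative sketch, not Blender or Unreal Engine API code, and the distance thresholds are hypothetical example values:

```python
def select_lod(distance_m: float, thresholds=(10.0, 30.0, 80.0)) -> int:
    """Pick a LOD index from the camera distance in metres.

    Returns 0 for full detail, rising to len(thresholds) for the
    coarsest mesh. Real engines blend between levels; this sketch
    only shows the selection logic.
    """
    for lod, limit in enumerate(thresholds):
        if distance_m < limit:
            return lod
    return len(thresholds)
```

With the example thresholds, an object 5 m from the camera uses the full-detail mesh (LOD 0), while one 200 m away drops to the coarsest level (LOD 3).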

Step 2: Creating Virtual Spaces in Unreal Engine

After creating our models and textures, we transfer them to Unreal Engine. Unreal Engine 5 is one of the most powerful game engine technologies currently available, enabling us to create stunning virtual spaces and animations. We can add lighting effects, weather conditions and much more to create an immersive environment. Thanks to Unreal Engine 5’s Nanite technology, we can even render scenes containing billions of polygons while maintaining real-time frame rates.

Combined with our modeling and texturing in Blender, we can create a nearly unlimited variety of scenes and environments.

Unreal Engine also offers Lumen, a revolutionary global illumination system that calculates light reflections and refractions in real time. For virtual productions, this means: lighting changes in the scene - for example when a spotlight is panned or a virtual door is opened - look physically correct and natural. This is a massive quality leap compared to older rendering methods where lighting changes had to be pre-calculated.

Step 3: Camera Tracking - Matching Virtual and Physical Cameras

A crucial component of our process is camera tracking. By capturing the movement of our real camera and transferring this data to the virtual camera in Unreal Engine 5, we can establish a seamless connection between our real world and the virtual environment.

This process not only increases the realism of our scenes but also provides us with the flexibility to realize a wide variety of camera movements. Furthermore, camera tracking opens up new creative possibilities that we wouldn’t have otherwise.
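At its core, transferring tracking data means mapping each tracked physical-camera pose into the virtual scene’s coordinate space through a calibration offset. This is a simplified sketch of that idea; the function name, pose representation, and calibration values are hypothetical examples, not the API of any tracking system:

```python
def tracked_to_virtual(position, yaw_deg,
                       calib_offset=(0.0, 1.2, 0.0), calib_yaw_deg=0.0):
    """Map a tracked physical-camera pose into virtual-scene space.

    position: (x, y, z) in metres as reported by the tracking system
    yaw_deg:  pan angle reported by the tracker
    calib_offset / calib_yaw_deg: studio-to-scene calibration
                                  (example values, measured once per setup)
    """
    x, y, z = position
    ox, oy, oz = calib_offset
    virtual_pos = (x + ox, y + oy, z + oz)
    virtual_yaw = (yaw_deg + calib_yaw_deg) % 360.0
    return virtual_pos, virtual_yaw
```

A real pipeline also carries pitch, roll, lens focal length and distortion per frame, but the principle is the same: every physical pose sample is re-expressed in the virtual camera’s frame of reference.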

Step 4: Final Composite in Aximmetry

Finally, we use Aximmetry to combine our virtual elements with real footage. Aximmetry is a powerful software for live production that helps us create a final composition that does justice to both our virtual and real elements.

Key Technologies in Our Virtual Productions

We’d like to go into more detail about two technologies and their advantages for our creation and production workflows.

Camera Tracking: Matching Real and Virtual Cameras

Camera tracking, also known as Match Moving, is a process where the movement of a real camera is captured in 3D and transferred to our virtual camera in Unreal Engine. This allows us to ensure a coherent perspective between our real footage and the virtual environment.

Why invest significant time and effort in camera tracking? Because camera tracking enables us to create a seamless connection between the real world and our virtual scene. This in turn gives us creative freedom in terms of image composition and camera movements.

Advantages: Realism and Absolute Flexibility in Scene Composition

Camera tracking creates realism. When our real camera moves, the virtual camera moves too, causing the perspective in our virtual scene to constantly adjust. This ensures that the interaction between our real elements and the virtual environment looks realistic and convincing.

Camera tracking also offers us more flexibility. Whether we’re performing a simple pan or a complex camera move, camera tracking ensures that our virtual scene is always correctly aligned. This allows us to realize a wide variety of shots and movements without having to worry about whether they match our virtual scene.

Aximmetry Brings All Components Together

Aximmetry is a crucial tool in our virtual production process. With Aximmetry, we can combine our scenes created in Unreal Engine 5 with real video footage. Additionally, it offers us the ability to create real-time graphics and animations and insert them into our live videos. Thanks to the integrated keyer, we can process greenscreen footage in real time and replace the background with our virtual scene. We can also simulate camera movements and adjust the perspective of our virtual scene accordingly to create a truly immersive experience.

Aximmetry is more than just software. It’s a powerful ally that enables us to go beyond the limits of the physical world and create impressive virtual scenarios. But why is Aximmetry so important for our productions?

Advantages: Costs Under Control and Absolute Flexibility, Even During Production

On one hand, Aximmetry enables us to achieve impressive visual effects that would be difficult or impossible to achieve with traditional methods. It also helps reduce costs for physical sets and travel. We can create virtually any environment we can imagine without leaving the studio.

On the other hand, Aximmetry offers us enormous flexibility. We can make changes to our scene while we’re live, and we can use different scenarios for different parts of our production.

Technology Comparison: Which Tools Are Best Suited for What?

The landscape of virtual production technologies is diverse, and not every solution is suitable for every use case. Here’s an overview of the most important platforms and their respective strengths:

Unreal Engine 5 vs. Unity: Both are powerful real-time render engines, but Unreal Engine has established itself as the industry standard for virtual production. The reason: Nanite and Lumen deliver image quality sufficient for professional video productions without requiring extensive optimization. Unity has its strengths more in interactive applications and projects with lower hardware requirements.

Aximmetry vs. Disguise vs. Zero Density: All three are professional virtual production platforms. Aximmetry stands out for its price-to-performance ratio and direct Unreal Engine integration. Disguise is the market leader for large-scale productions and LED walls, while Zero Density is particularly strong in the broadcast sector. For our greenscreen-based setup, Aximmetry offers the optimal balance of performance, flexibility and usability.

Blender vs. Cinema 4D vs. Maya: For 3D modeling, we use Blender - not just because of its open-source license, but primarily because of the seamless FBX export to Unreal Engine and the active community that constantly develops new plugins and workflows. Cinema 4D has strengths in motion design, Maya in character animation - but for architectural virtual sets, Blender offers the best overall package.

Use Cases for Virtual Productions

The technology we use is versatile. Here are the key application areas where we successfully deploy virtual sets:

Corporate events and conferences: For company events, virtual sets offer the ability to create an impressive stage environment that matches the brand identity - without the logistical and financial effort of physical scenery. Especially for hybrid events, where part of the audience is watching remotely, virtual backgrounds ensure a visually consistent and high-quality experience.

Product launches and brand events: A new product deserves a presentation as innovative as the product itself. In virtual sets, we can present the product in any conceivable environment - from a futuristic showroom to a natural landscape to abstract brand worlds.

Livestreams and webinars: For regular live formats, virtual sets offer the advantage of being quickly adapted and reused. A set created once can be used for different topics and occasions with minimal adjustments.

Film productions and commercials: Virtual sets also offer significant advantages for pre-produced content. Scenes that would otherwise require expensive location shoots or elaborate set construction can be realized in the studio - with full control over lighting, weather and perspective.

The Advantages of Virtual Production for Our Clients

Why should companies choose virtual production? The advantages go far beyond mere cost savings:

Creative boundlessness: In virtual sets, there are no physical limitations. Whether an underwater scenario, a space environment or a photorealistic recreation of a historic building - everything is possible without leaving the studio.

Time savings: Building a physical set can take days or weeks. A virtual set can be created in far less time and adjusted in minutes if needed. Last-minute change requests from clients can also be implemented without problems.

Sustainability: Virtual productions significantly reduce material and energy consumption. No physical sets that need to be disposed of after shooting, no transport trips to external locations, no material wear and tear.

Consistency: A virtual set looks exactly the same every time it’s used. This is particularly important for series productions or recurring formats where a consistent visual appearance is required.

Scalability: From a simple background for a webinar to a complex, animated world for a product show - the effort can be flexibly adapted to budget and requirements.

The Future of Virtual Production

Development in virtual production is advancing rapidly. Some trends we’re observing and actively integrating into our workflows:

LED walls and in-camera VFX: While we currently work primarily with greenscreen and real-time compositing, LED volumes are gaining increasing importance. They offer the advantage that reflections and lighting effects already affect the protagonist correctly on set. For certain productions, we plan to use this technology as a complement in the future.

AI-powered content creation: Artificial intelligence is increasingly being used in texture creation, 3D object generation and animation automation. This significantly accelerates the creation process and enables us to deliver high-quality virtual sets even with tight timelines.

Interactive and data-driven sets: Virtual environments that respond to data in real time - such as live poll results, social media feeds or stock prices integrated directly into the set. This form of dynamic visualization will become particularly important for conferences and news formats.
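As a concrete illustration of a data-driven set element, live poll results can be mapped directly to the heights of bars in a virtual infographic. This is a hypothetical sketch of the mapping step only; feeding the values into the engine would happen through the platform’s own data interfaces:

```python
def poll_to_bar_heights(votes, max_height=2.0):
    """Scale live poll counts to bar heights in scene units.

    votes: mapping of option name -> current vote count
    max_height: height (example value) a 100%-share bar would reach
    """
    total = sum(votes.values())
    if total == 0:
        return {option: 0.0 for option in votes}
    return {option: max_height * count / total
            for option, count in votes.items()}
```

Each time a new poll update arrives, the recomputed heights drive the animated bars in the virtual set, so the graphic stays in sync with the live audience.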

Virtual production is here to stay. It’s changing not only how we produce content, but also what’s actually possible. As a team with deep technical expertise and creative passion, we at Jakobs Medien are ready to actively shape this development - for our clients and for the industry as a whole.

Get in touch

Tell us about your next project.