This is a real-time digital human test exploring what is possible in a single day.
Using Mesh to MetaHuman, we took a scan of a face and uploaded the result to MetaHuman Creator.
Hair was groomed in Ornatrix. Body tracking, hand tracking and face tracking were recorded using the Rokoko suit and Live Link, straight into Unreal Engine 5.
We couldn't resist: an Iron Man + HoloLens prototype experience, it just had to be done!
Zebrar was tasked with developing a full content creation pipeline for fast, iterative character animation and rendering. We built an advanced facial and body animation system designed for quick, high-quality stylised character production, using UE5's easily extensible Blueprint system and leveraging the engine's high-fidelity, fast rendering capabilities. The system is designed to incorporate data from different sources, enabling more comprehensive and versatile animation sequences.
Zebrar's character blueprint process involves several pipelines.
The character blueprint is extensive and complex, yet flexible, with a significant amount of adjustment available per sequence and face capture. The system allows real-time animations to run through an editor, replacing the need for previz and streamlining the process by removing the necessity of multiple runs.
The bone data fed into the system can come either from hand-animated curve data or from a live or pre-recorded capture setup. This data is then remapped to a predetermined range on a material that drives the eye movement, facial blendshapes, and neck rotations.
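The core of that remapping step can be sketched as a simple linear range conversion. The function and ranges below are illustrative assumptions, not Zebrar's actual Blueprint logic: capture values (e.g. a neck pitch in degrees) are clamped and mapped onto the normalised range a material parameter expects.

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Clamp a capture value to its expected input range, then
    linearly remap it onto the material parameter's output range."""
    clamped = min(max(value, in_min), in_max)
    t = (clamped - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Hypothetical example: map a neck pitch of -45..45 degrees
# onto a 0..1 material parameter driving a blendshape.
print(remap(0.0, -45.0, 45.0, 0.0, 1.0))   # centred pose -> 0.5
print(remap(90.0, -45.0, 45.0, 0.0, 1.0))  # out-of-range value is clamped -> 1.0
```

A per-sequence offset (as mentioned below for the sequencer) would simply be added to `value` before remapping.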
The system also provides ways to adjust and clean up unsatisfactory captured sections of animation, such as adding offsets in the sequencer, which gives additional customisation for eye movement.
This real-time method creates a more efficient and comprehensive system for producing content for offline production.
The flexibility of multiple pipelines and the possibility of in-engine animation timeline adjustments allow for nuanced control and believable animations.
Despite the complexity and initial setup challenges, the system enhances the animation and rendering process, providing a solution to the costly and lengthy process of traditional content creation.
After seeing the recent evolution of QR code art using Stable Diffusion, we couldn't help ourselves. QR codes are a very prominent aspect of our interactive engagements and digital activations, but let's be honest, they don't exactly scream "brand experience". So here we are: our first experimentation with AI-generated QR art.
Made to showcase web augmented reality technology, see how easy it is to have turtles floating in your living room!
Hidden Valley is an immersive web experience that showcases our real-time 3D workflow for high fidelity virtual worlds and metaverse creation tools.
Try on Android, iPhone, PC, Mac, and Meta Quest 2.
For iPhone users: this game supports Apple devices from iPhone X onwards running iOS 15.0 or later.
An exploration of what an AR Portal world could look like for a premium fashion brand and how audiences can immerse themselves and engage with the brand and products in a unique and innovative way.
Try on Android WebXR-enabled devices by scanning the QR code or clicking the link.
AR not available on iPhone.
Photo-real cities directly in Unreal Engine 5!
The partnership between Cesium and the Google Maps Platform has opened up so much for Unreal Engine 5 users.
This is a super quick UE5 R&D test that took 15 minutes to set up and is playing in real time on an RTX 3060.
Once you work out how the Google API key system works, plug in your latitude and longitude coordinates and transport to anywhere in the world. The Google tiles just stream in. Some tiles still pop in, but I'm sure there will be more features to come.
Great for real-time experiences and 3D visualisations on that next job!
Unreal Engine 5 - Better, faster, real-time! “This demo test drives UE5 and Lumen with digital humans. The real-time character was built with Metahuman Creator. Facial animation was achieved via the Live Link App capture tool on my iPhone 11. One performance take and zero animation cleanup. I also used my iPhone 11 for the camera moves. The environment was built with Unreal Marketplace assets and Megascans for greater detail. Birds were built with Niagara particles. The demo plays at 60fps on RTX 2070.” Zebrar VFX Supervisor, Andrew Lodge
Say hello to MetaJohn! This is John Doolan, Lead Software Engineer at Zebrar, aka "Aussie John".
We took a 3D scan of John's head a few years ago and thought we would upgrade it in Unreal Engine 5. Using UE5's Mesh to MetaHuman, we were super surprised by the results. John is now fully digital and real-time ready after a couple of hours' work. Super excited to control this character in our motion capture suit.