Unreal Engine 5 Tech Demo on PS5 Shows Where Next-gen Graphics are Headed

Epic Games today revealed a PS5 tech demo built with Unreal Engine 5, the next-gen version of the company’s game engine. With new features for advanced lighting and unprecedented geometric detail, Unreal Engine 5 aims to enable a generational leap in real-time graphics.

Unreal Engine is one of the two most popular game engines for creating VR content. While UE 4.25 launched just last week with improvements to its AR and VR support, Epic Games today showed off a tech demo built on new foundational capabilities of Unreal Engine 5, which the company plans to launch in 2021.

“One of our goals in this next generation is to achieve photorealism on par with movie CG and real life, and put it within practical reach of development teams of all sizes through highly productive tools and content libraries,” Epic says.

Running on PS5 developer hardware, the aptly-named tech demo ‘Lumen in the Land of Nanite’ shows off UE5’s Lumen global illumination system and Nanite micro-geometry system. Here’s how Epic describes the features:

Lumen is a fully dynamic global illumination solution that immediately reacts to scene and light changes. The system renders diffuse interreflection with infinite bounces and indirect specular reflections in huge, detailed environments, at scales ranging from kilometers to millimeters. Artists and designers can create more dynamic scenes using Lumen, for example, changing the sun angle for time of day, turning on a flashlight, or blowing a hole in the ceiling, and indirect lighting will adapt accordingly. Lumen erases the need to wait for lightmap bakes to finish and to author light map UVs—a huge time savings when an artist can move a light inside the Unreal Editor and lighting looks the same as when the game is run on console.

Nanite virtualized micropolygon geometry frees artists to create as much geometric detail as the eye can see. Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine—anything from ZBrush sculpts to photogrammetry scans to CAD data—and it just works. Nanite geometry is streamed and scaled in real time so there are no more polygon count budgets, polygon memory budgets, or draw count budgets; there is no need to bake details to normal maps or manually author LODs; and there is no loss in quality.

In its UE5 tech demo reveal today, Epic didn’t mention PSVR 2 (which is expected to be announced after the launch of PS5), but the company did confirm that VR and AR content creation for console, PC, and standalone will continue to be supported in UE5.

SEE ALSO
PSVR 2 Unlikely to Launch at the Same Time as PS5

While this demo is extremely impressive and significant for the future of real-time graphics, it will likely be a while yet before we see this level of graphical fidelity in VR games and content.

Image courtesy Epic Games

While Unreal Engine 5 will continue to support VR development, the demo shown today was running at 2560×1440 at 30 FPS, which falls well short of what high-end VR headsets require: most demand a minimum of 80 or 90 FPS, and many render at even higher resolutions.
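
For a rough sense of that gap, here is a back-of-the-envelope pixel-throughput comparison; the per-eye resolution and refresh rate on the VR side are illustrative assumptions rather than figures from Epic or any particular headset maker.

```python
# Back-of-the-envelope pixel-throughput comparison (illustrative assumptions).
demo_pixels_per_second = 2560 * 1440 * 30        # the UE5 demo: 1440p at 30 FPS

# Hypothetical high-end VR target: two views of roughly 1600x1440 each at 90 Hz.
vr_pixels_per_second = 2 * 1600 * 1440 * 90

print(f"Demo: {demo_pixels_per_second / 1e6:.0f} Mpix/s")
print(f"VR:   {vr_pixels_per_second / 1e6:.0f} Mpix/s")
print(f"VR needs roughly {vr_pixels_per_second / demo_pixels_per_second:.1f}x the fill rate")
```

And that is before accounting for the extra headroom VR rendering typically reserves for supersampling and reprojection.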

VR developers will likely be able to make use of Lumen, Nanite, and other advanced UE5 features, but perhaps not at the same scale seen in the ‘Lumen in the Land of Nanite’ tech demo—at least not until next-gen graphics hardware is far more widespread.

Image courtesy Epic Games

Outside of VR games, UE5 is likely to be especially useful for enterprise VR and AR use-cases involving visualization, which typically require that detailed computer models be reduced in complexity in order to run in real-time. With Nanite, Epic is promising that UE5 will be able to ingest huge models like photogrammetry scans and CAD data, then display them natively without needing to create decimated versions with reduced geometric accuracy.
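
To illustrate why that decimation step has been standard practice, here is a rough, hypothetical estimate of the memory a single undecimated scan can occupy when loaded naively; the triangle count and per-vertex layout are assumptions for illustration, not numbers from Epic.

```python
# Rough memory estimate for an undecimated photogrammetry scan (illustrative assumptions).
triangles = 100_000_000            # a dense scan; Epic cites "hundreds of millions or billions"
vertices = triangles // 2          # rule-of-thumb vertex count for a closed triangle mesh
bytes_per_vertex = 12 + 12 + 8     # float3 position + float3 normal + float2 UV
bytes_per_index = 4                # 32-bit index, three per triangle

total_bytes = vertices * bytes_per_vertex + triangles * 3 * bytes_per_index
print(f"~{total_bytes / 2**30:.1f} GiB of raw geometry for a single asset")  # ~2.6 GiB
```

Figures like these are why such assets are normally decimated before they ever reach a real-time application; Nanite’s pitch is that streaming only the detail actually visible on screen makes the full-resolution source practical at runtime.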

– – — – –

Epic says that Unreal Engine 5 will be available in a preview version in early 2021 and launch in full later that year. The engine will support next-gen consoles and all existing platforms including tethered and standalone AR and VR headsets.

The company also says that UE5 is being designed with forward-compatibility in mind, so developers working with UE4 now can expect to migrate their projects to UE5 when the time comes. To lead by example (and sort out the kinks along the way), Epic plans to launch its battle royale hit Fortnite on next-gen consoles in its current UE4 version, and then move the game over to UE5 in mid-2021.



Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • MatBrady

    I want to believe… I really do. But over the years I’ve learnt to manage my expectations when I see a tech demo from an unreleased Playstation.

    • kontis

      UE3 tech demo for PS3 looked much worse than many games later released for PS3.

      UE4 tech demo for PS4 looked much worse than Paragon later released by Epic on PS4.

      • Sven Viking

        So UE version numbers tend to coincide with PlayStation numbers :).

  • Alex

    jawdropping

  • kontis

    Yeah, it barely runs on PS5, so Lumen is not gonna be used in VR on next gen consoles.

    However, Nanite with simpler materials (maybe not PBR shaders) could be really awesome for VR, because geometry is what shines with stereoscopic 3D and parallax.
    They may even try to optimize the meshlets in a similar way to VRS and render more triangles in the center of each eye.

    The biggest problem with Unreal Engine is the fact Epic rendering guys don’t like forward rendering. It’s a second class citizen hidden deep in settings with many features unavailable, so devs are basically punished for using it for VR.
    I doubt Lumen will even work with it. Maybe even Nanite won’t… They already said they have no plans to support any kind of raytracing, despite it being fundamentally better suited for forward shading.

    It’s really a shame that the best engine on the market kinda discourages you from supporting VR in your game…

    Unity on the other hand treats forward shading as a core, first-class feature of all their renderers, which will benefit VR devs greatly, so don’t expect to see a grand migration in this field to UE.

    Hopefully, with the progress in temporal accumulation and deep learning upscaling like DLSS, deferred may one day be as good as, if not better than, forward shading for VR.
    For now, Valve has proved that forward is king for VR.

    • Andrew Jakobs

      Why wouldn’t Lumen be used in VR on next-gen consoles? It doesn’t mean you HAVE to use the ultra-high-polygon-count models; you can still use lower-count models, which would still look so much better than what we use today.
      And what do you mean ‘they have no plans to support any kind of raytracing’? The current 4.25 supports raytracing…
      And sorry to burst your bubble, but A LOT of VR games are built with Unreal Engine…

      • James Cobalt

        You misunderstand. Raytracing is supported in the deferred renderer only – not the forward renderer that’s greatly preferred in VR development. Epic stated they have no plans to support it either. When @kontis says they treat it like a second-class citizen, he’s not inaccurate: forward rendering in Unreal does not support raytracing, SSR, SSAO, GBuffer scene textures, many types of shadows and translucency effects…

  • Jarom Madsen

    Very cool tech, but honestly not sure how I feel about encouraging artists to keep full cinematic polycounts of 3D models in final production builds. You’re talking about tripling asset sizes, which is worse for download size and asset loading times. Not super excited about the 200GB+ download sizes coming soon to an Unreal game near you.

    • Miqa

      How often do they have access to that, though? Seems pretty great to me that you wouldn’t need to craft multiple LODs for the assets. Spending the resources elsewhere should open up other possibilities.
      Not sure how that works, but if you have to keep separate files for different LODs, then this could even save download size.

      • Jarom Madsen

        I’m skeptical that this will end up being the case the majority of the time. IMO, encouraging artists to think even less about optimization is a dangerous direction to go, albeit the natural progression we’re drifting toward and, I’m sure, the ideal for artists. I’m prepping myself for ever-increasing load times and download sizes, since it seems less and less effort is being put into reducing that facet while we keep pushing for marginally improved real-time graphic fidelity with larger and larger assets.

        It still boggles my mind that 100GB+ download sizes for PCVR titles like Asgard’s Wrath are acceptable as it is, and we’re opening the floodgates for it to soar past that.

        Meanwhile, mobile VR on Quest at least is motivated to reduce asset sizes to fit on a standalone headset, which I appreciate, but something tells me we’re going to start pushing that envelope soon too.

        • realist

          Why would anyone advocate sacrificing a significant upgrade in immersion potential and graphic quality out of impatience over download speed and, I guess, load time? Give the artists their opportunity to create something awesome, and the devs the budget saved from LOD creation and optimisation to deliver better, more complex gameplay, or maybe even more levels. Personally I would play the long game, pour a beer whilst I wait, then enjoy an experience with less compromise.

          • Jarom Madsen

            But I mean, longer load times are a compromise. Whether that’s more or less of a compromise than not having full-res assets is going to vary from consumer to consumer. Plenty of games get their ratings tanked because of long load screens, and you do block accessibility for users with limited storage space or slow download speeds. I’m not saying that the tech isn’t cool or useful, just that their pitch that artists can ignore asset optimization now is not my favorite.

          • James Cobalt

            Case in point – I never, ever play Dance Central. I like it, but I play OhShape, Beat Saber, Audica, et al because DC takes way too damn long to load. I often don’t even start Alyx because sometimes I just have 20 minutes and I don’t want to spend 4 of those waiting for the engine to load… and then my save file to load. Many people are operating with sporadic and limited free time these days – being able to just drop into a game quickly has real appeal.

            Teenager James, 80 hours deep into FFVII for the 2nd time, wouldn’t have fathomed this. Adulting James is thankful for Nintendo Switch.

          • Octo

            It was inevitable. Eventually geometric detail would reach this level, or we’d never reach truly photo-realistic environments.
            There is no way, or need, to go higher than sub-pixel geometric detail… so if it’s of any consolation, this is as far as it’ll go; it’s done.

    • Charles

      Larger downloads could be annoying, but if they improve the visuals then it’s worth it.

      Historically, game size almost always increases greatly at each new console generation (PS3 to PS4 being an exception). We’re overdue for a major game size increase.

    • Andrew Jakobs

      200GB+ for a beautiful game won’t be a problem in 2-3 years…

    • Sven Viking

      I was thinking about that too. I wonder if UE5 could use similar tech to export a cut-down version of the insane-quality source model for distribution?

      • Jarom Madsen

        I mean, yeah, probably, but it seems to be based on camera perspective, which is not as useful for "baked decimation" since what looks good from one angle won’t necessarily look good from another. That said, the tech is cool and I am excited about it; I’m just not completely on board with throwing out general asset optimization altogether just because this *can* handle it.

  • Andrew Jakobs

    Man, when I saw that demo I almost pissed myself from excitement. This looks so cool and will make it even easier to develop great-looking games. Even with toned-down settings this will look better than what we have today, so seeing that in VR would really be wonderful.

  • Am I the only person that finds it a bit hard to see how much better this looks over the PS4’s graphics… or even the PS3’s? A well-made, well-optimized game looks just as good. It seems like they are up-selling sloppy models, i.e. models that haven’t been well planned or optimized. Fine detail is the realm of normal maps and parallax maps, not abusively high polygon counts.

    You don’t know just how much can be squeezed out of well-made, well-optimized models until you see something like Half-Life: Alyx running on a low-end machine vs a high-end machine… and you see how little different they look… and you realize… *THIS* is what quality artwork made by masterful people looks like.

    • James Cobalt

      The video isn’t so much about how it looks but how much easier it is to achieve those looks. You can do all of this in UE4, but it’s more time and money (and probably heavier on the CPU/GPU).

    • Octo

      Normal maps are an effort to make a low-poly model look high-poly. Parallax is a step above that, and tessellation a step above that, but in the end they are just approximations of the heavy model… so obviously nothing is gonna beat using the heavy model directly.
      On the surface a PS4 game could look comparable to this… but on closer inspection it’s not close, and doing all the optimization and hacks to get this level of quality on a PS4 is time-consuming and frankly boring work.

  • James Cobalt

    Graphically it doesn’t look better than what UE4 is technically capable of displaying, but from a development standpoint it seems like a big leap forward. To pull off this look in the current version you’d need to do serious LOD work for models and textures, bake a bunch of lightmaps, animate some stuff by keyframe… here the engine handles much of that dynamically and automatically. Relatively little work to achieve what in UE4 would be too much of a burden for many game studios.

    But this end result, graphically, shouldn’t be that mindblowing to end users. We should also take it with a grain of salt – we don’t know enough about these new technologies to know their relevance for VR. Not every feature in Unreal is VR compatible (like screen space reflections) and many aren’t performant enough for use in VR regardless (even if you have an excellent rig).

    UE4 demo with an end result that looks just as impressive: https://www.youtube.com/watch?v=zKu1Y-LlfNQ

    • david vincent

      Well, you know, the closer you get to photorealism, the less progress you see…

  • As a developer, I can tell you that while Lumen is great, Unreal already had great lighting. IMHO, Nanite is the star of the show: it is very hard to create something that has the right poly count; maybe you’re on a budget and you look for online models on sites like TurboSquid, and they’re always too big… now you just use whatever you want and the system adapts at runtime depending on the platform. This is huge.

  • david vincent

    Great, the “Virtual Set” technology will be even better with UE5.
    https://www.youtube.com/watch?v=bErPsq5kPzE
    https://www.youtube.com/watch?v=gUnxzVOs3rk