Epic’s Stunning RTX-Powered Ray-Tracing Demo Wows GDC

(blogs.nvidia.com)

95 points | by bcaulfield 1857 days ago

17 comments

  • westoncb 1857 days ago
    I am actually underwhelmed by this demo... Not because it doesn't look great—it does—but because this is about par for the course for a next-gen GPU demo. They always look fantastic, and for definite reasons that aren't strictly hardware improvement.

    Remember nvidia's 'Dawn' demo from 2003? https://youtu.be/4D2meIv08rQ?t=48

    They put together super high quality demos like that for every new generation of card they released. The games that followed never looked quite as good since there are far more demands put on one's computational resources to generate a playable game world vs. a single scene that needs to look great.

    What would be potentially very interesting is to see the difference in work put into producing this with raytracing vs. traditional techniques. Being able to displace the unholy nightmare hodgepodge that comprises contemporary real-time rendering algorithms is the most exciting thing about raytracing. At least that's potentially the case; not clear to me how much improvement has been made possible by the RTX.

    EDIT: and, making me think this new one is part of a lineage of sorts, here's a later Dawn demo from 2012 (I think): https://www.youtube.com/watch?v=bI1_quVr_3w

    EDIT2: rewatching, other aspects of this demo (i.e. not rendering specifically) are beyond 'par for the course', imo. But it's just generally higher production values, indicating more time/money/attention was put into producing the short itself, independent of the hardware backing it.

    • _bxg1 1857 days ago
      There's truth to this, although I find the demo easier to appreciate when I focus specifically on the things that benefit from RTX. That said... I think they could've picked a better scene to that end. The original stormtrooper one was great, but an outdoor scene (fewer surfaces reflecting indirect light) with a body of water (a flat reflection which can be easily achieved using non-RT techniques) doesn't really play to the technology's strengths.

      Side note: Geez, that linked video says so much about how much gaming culture has evolved in a decade and a half.

  • zaroth 1857 days ago
    So much for the uncanny valley; to me that just looked fabulous.

    I don’t quite comprehend the development process that goes into building a short like this. Everything from the surroundings and environment, to the humanoid model and its natural movement, to the facial modeling and expressions, to the water effects.

    The soundtrack is probably the only part where I feel like I have a decent mental model of how it comes together.

    I took one 3D graphics class in undergrad and it was all absurdly low-level basics, mostly just mathematics; it never remotely showed how something like this could actually be put together.

    • dman 1857 days ago
      I used to work for a game studio a long time ago. Here is a rough sketch of what the process looked like there:

      a. Concept artists will create multiple possible worlds, drawing either in a physical medium or on a tablet.

      b. The Art Director will critique the concept art and iterate with the concept artists until he finds a world he approves of. The concept artists will then draw many scenes in this imaginary world to confirm that there is no miscommunication between the Art Director and the concept artists.

      c. Some of the key characters and scenery from this world will be given to Modelers who will create 3d models of those assets.

      d. Initial goal will be to bring one or two scenes to life.

      e. Once the initial few scenes and main characters are fully approved, more headcount will be added and the creation process becomes more parallel.

      f. Everyone involved watches the short multiple times a day to monitor progress. There is accountability for who owns a particular character for a given frame.

      • newnewpdro 1857 days ago
        I got the impression the parent was referring to the technical side of rendering such photo-realistic visuals real-time.
        • dman 1857 days ago
          Ah, sorry, I misinterpreted the question.
          • rayval 1857 days ago
            Regardless, I found your description of the design process & workflow extremely illuminating (no pun intended). I have not seen this succinct and informative a description anywhere else.
            • ct520 1857 days ago
              Yep me 2. Thx dman
    • VectorLock 1857 days ago
      Blog posts that dissect the rendering of things like GTA 5 really show how much goes into making scenes like this. http://www.adriancourreges.com/blog/2015/11/02/gta-v-graphic...
      • ska 1857 days ago
        Wouldn't part of the point of realtime raytracing be to avoid a lot of this multipass trickery in the first place and just have a more sophisticated light transport model?
        • yuhe00 1856 days ago
          Realtime raytracing hooks into an existing multipass pipeline like this by replacing select traditional techniques. The easiest to replace are shadowmaps and AO. You can also make use of radiance (diffuse GI) and reflections with one or two bounces. Traditional techniques are currently faster and give better results for just about everything else. Also, you can only afford maybe 1-2 raytrace samples per pixel at the moment, even with RTX, so they make heavy use of AI denoising techniques.
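
          In pseudo-C++, one frame of such a hybrid pipeline might be structured like this (a stubbed-out sketch of the idea, not any engine's actual API):

              // One frame in a hybrid raster + ray-tracing pipeline: raster still
              // does most of the work, RT replaces shadow maps / AO, and a
              // denoiser hides the very low sample count.
              #include <cstdio>

              struct GBuffer {};  // depth / normals / albedo from the raster pass
              struct Image {};

              GBuffer rasterizeScene() { return {}; }              // traditional pass
              Image traceShadowsAndAO(const GBuffer&) {            // ~1-2 rays per pixel
                  std::puts("tracing 1-2 samples per pixel");
                  return {};
              }
              Image denoise(const Image& noisy) { return noisy; }  // AI / temporal filter
              Image shadeAndComposite(const GBuffer&, const Image&) { return {}; }

              int main() {
                  GBuffer g = rasterizeScene();
                  Image noisy = traceShadowsAndAO(g);  // replaces shadowmaps + SSAO
                  Image clean = denoise(noisy);        // mandatory at these sample counts
                  Image frame = shadeAndComposite(g, clean);
                  (void)frame;                         // would be presented to screen
              }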
        • zamalek 1857 days ago
          I have no idea how RTX actually works, but based on the fact that it exploits the existing rasterization architecture, I assume it adds at least one (very fast) pass.
          • ska 1857 days ago
            Yes, that seems likely. But can you use it to drop some of the others?
            • fivefive55 1857 days ago
              Probably, yes. The only issue I see with that is that raytracing is currently only supplementing existing rendering methods, not replacing them; the performance is not there yet. In a few generations we might see fully raytraced games, but not for a little while, I would guess.
        • foota 1857 days ago
          Could be wrong here, but I think most uses of raytracing like this limit it to shading and use traditional methods for shadows. (Shading being the way that light "appears" on surfaces, shadows determining roughly how well illuminated a surface is. In theory you could use raytracing for both, but there may be limitations?)
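
          A toy CPU sketch of that split (a purely illustrative scene, not how any engine wires it up): the Lambert term below is the "shading", and one ray cast toward the light is the "shadow" test.

              // A sphere hangs above a ground plane; for a few points on the
              // plane we compute the Lambert "shading" term and cast a single
              // "shadow" ray toward an overhead point light.
              #include <cmath>
              #include <cstdio>

              struct Vec { float x, y, z; };
              Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
              float dot(Vec a, Vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
              Vec norm(Vec v) { float l = std::sqrt(dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }

              // Does the ray from `o` along unit direction `d` hit the sphere?
              bool hitsSphere(Vec o, Vec d, Vec center, float radius) {
                  Vec oc = sub(o, center);
                  float b = dot(oc, d), c = dot(oc, oc) - radius*radius;
                  return b*b - c >= 0.0f && -b > 0.0f;  // real root in front of origin
              }

              int main() {
                  Vec light = {0, 10, 0};    // point light overhead
                  Vec occluder = {0, 2, 0};  // sphere of radius 1 above the plane
                  for (float x = -3; x <= 3; x += 1) {
                      Vec p = {x, 0, 0};     // point on the ground plane, normal (0,1,0)
                      Vec toLight = norm(sub(light, p));
                      float shading = std::fmax(toLight.y, 0.0f);  // Lambert N.L
                      bool shadowed = hitsSphere(p, toLight, occluder, 1.0f);
                      std::printf("x=%+.0f  shading=%.2f  %s\n", x, shading,
                                  shadowed ? "in shadow" : "lit");
                  }
              }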
    • enraged_camel 1857 days ago
      >>I took one 3D graphics class in undergrad and it was all so absurdly low level basics, most just mathematics, it never remotely showed how something like this could actually be put together.

      I feel like this is a huge problem in higher education especially: bottom-up learning leaves students bored, confused and demotivated because they don’t understand the purpose of what they are learning and have no immediate use for it.

      Whereas top-down learning, where you start with the end result and work backwards/downwards from it, can provide the enjoyment and motivation necessary to persevere, especially when the subject matter is challenging and the road long and arduous.

    • sorenjan 1857 days ago
      They're gonna hold a "Making of Troll" tech talk this week, hopefully they release video from it.

      > NVIDIA will be hosting a demo and “Making of Troll” tech talk with Epic’s Marcus Wassmer and Juan Cañada at this week’s GPU Technology Conference in San Jose.

      https://www.unrealengine.com/en-US/blog/troll-showcases-unre...

    • stronglikedan 1857 days ago
      > So much for uncanny valley

      To me, the mouth always seems to give it away. Fabulous nonetheless though.

      • noneeeed 1857 days ago
        Yeah, that's always been my issue too. There seems to be something about the elasticity of the human face, and the interactions of all the small muscles, that is especially difficult to model.

        This demo does at least seem to deal better with the darkness in the mouth. That has also always been problematic and made a lot of characters look positively demonic :)

        • noir_lord 1857 days ago
          Micro expressions are a pain to animate, and yet pretty much every human we interact with has them. Our brains are very effective at reading them as well, since for our ancestors they were literally life and death.

          Without them, I think, our brain rings all the alarm bells; a human face without them is odd and possibly threatening.

      • apk-d 1857 days ago
        I liked the way mouth movement was captured in LA Noire. It was a bit low-res, but looked way more organic than any other technique I've seen. I'm guessing the production is complicated and the cost prohibitive though.
    • twic 1857 days ago
      > making natural movement

      To me, the movement didn't seem natural. But what it did seem like is classic Disney animation!

  • jeswin 1857 days ago
    One of the things I hate about modern games is that while the environment gets closer to realism, humans (face, skin) still look like corpses. I prefer cartoonish characters instead of being in uncanny valley.

    The woman in the demo looked somewhat real; if this is real time and reproducible in games then it's a huge step forward.

    • dragontamer 1857 days ago
      > One of the things I hate about modern games is that while the environment gets closer to realism, humans (face, skin) still look like corpses. I prefer cartoonish characters instead of being in uncanny valley.

      Subsurface scattering is necessary for face / skin to work with light. You get good subsurface scattering in movies these days, but not really in video games yet. It's far too costly to make realistic skin renders.

      Skin is partially transparent: you can see the veins, blood, and even muscles through people's skin. Blushing is simply the act of blood rushing to the face; it's immediately visible and instantly causes a reddish hue due to the skin's transparency.

      Simulating that outside of movie settings is going to take far more GPU power than what is available even in the next 5 years. We're still working on basic raytracing for video games, not even the advanced stuff yet.

      ------------

      In effect, the "location" of where light is reflected on skin is "inside" the skin. Light penetrates into your skin and illuminates the stuff underneath. But traditional raytracing assumes that the reflection occurs on the "surface" of the skin, resulting in a "waxy" look. The reflection is subtly in the wrong place.

      https://en.wikipedia.org/wiki/Subsurface_scattering

      Without subsurface scattering, you get the "barbie doll" look, even in raytracing. "Barbie doll" works fine for cartoons, but it's always a killer for me personally when I'm playing video games. (See the toy sketch at the end of this comment.)

      EDIT: Example image with SSS (SubSurface Scattering): http://2.bp.blogspot.com/-ahUMngIOPqU/T85dlEU9I7I/AAAAAAAAA5...

      ------------

      Gems, Marble, etc. etc. are also partially translucent, and look awful in traditional video games. Subsurface Scattering makes these partially transparent rocks look more correct.
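
      To make the plastic-vs-skin difference concrete, here is a toy sketch of "wrap" diffuse lighting, one of the cheapest fakes for subsurface scattering (my own illustration; real engines use far more elaborate techniques):

          // Plain Lambert diffuse cuts off hard at N.L = 0 -> the "barbie doll"
          // look. Wrap lighting remaps N.L so surfaces facing slightly away from
          // the light still receive energy, crudely faking light scattered
          // underneath the skin.
          #include <algorithm>
          #include <cstdio>

          float lambert(float ndotl) { return std::max(ndotl, 0.0f); }

          // wrap = 0 gives plain Lambert; ~0.5 is a common skin-ish fudge factor.
          float wrapDiffuse(float ndotl, float wrap) {
              return std::max((ndotl + wrap) / (1.0f + wrap), 0.0f);
          }

          int main() {
              for (float ndotl = -0.4f; ndotl <= 1.001f; ndotl += 0.2f)
                  std::printf("N.L=%+.1f  lambert=%.2f  wrap=%.2f\n",
                              ndotl, lambert(ndotl), wrapDiffuse(ndotl, 0.5f));
          }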

    • worldsayshi 1857 days ago
      This is real time. It's a bit annoying that the article doesn't highlight that, since it's the most impressive part. Then again, unless you include interactive aspects, I guess there's no telling what is actually real time and what is "baked".
      • JohnBooty 1857 days ago

            there's no telling what is actually real time and what is "baked".
        
        Yea, that distinction is... fuzzy, and a little frustrating.

        Artistically, this demo is wonderful!

        Technically, it's somewhere between "yawn" and game-changing, depending on how much of the lighting and animation is pre-baked. Are the deformations of her skin, clothing, the water, etc. being calculated in real time, or was all the hard stuff precalculated and baked in?

        The reason this matters is that the more dynamic this is, the closer we are to creating actual emergent/interactive experiences with this kind of technology, rather than just a way to render static cutscenes better.

      • bcaulfield 1857 days ago
        You're right, we've updated the post to reflect this.
    • JohnBooty 1857 days ago
      Street Fighter IV and V have my favorite implementation so far. The characters, to me, look like realistic physical objects with a good sense of weight to them.

      But instead of resembling humans and therefore falling into the uncanny valley, they look like realistic lifesize action figures like the ones I played with as a kid. Like my childhood play scenes come to life.

      (I'm not sure if the creators intentionally wanted the characters to look like action figures per se, but clearly they made the choice to go for a stylized rather than photorealistic look and I love what they achieved)

      https://www.pcmag.com/review/341914/street-fighter-v-for-pc

      • dragontamer 1857 days ago
        The stylized "comic book" feel of Marvel vs Capcom 3, and the "DBZ Feel" of Dragonball FighterZ are the looks that I personally like the most.

        Dragonball FighterZ is nothing close to reality, but whatever they did to that game makes it look just like hand-drawn 2d anime.

        Here is some random tournament footage: https://www.youtube.com/watch?v=6XOBrRhgv0I

        --------

        Whatever Blue Sky did for the "Peanuts" 3d movie was utterly amazing too. It was the first major 3d movie I've seen that faithfully captured the comic style. It's not a video game, but it's a similar "art-form translation" issue.

        https://www.youtube.com/watch?v=zQpUQPrAfQM

        • JohnBooty 1851 days ago
          Yeah, FighterZ is just incredible!

          The Cyber Connect 2-developed games, mostly in the Naruto universe, have been making great progress along those lines for years. But DB FighterZ takes the cake.

          One thing that's "special" about Capcom's recent SF games is that I find the art style unique!

          FighterZ is an absolutely incredible technical and artistic achievement; I would call it the more impressive one, probably more so than SF IV/V.

          On the other hand, FighterZ replicates an existing look, whereas SF IV/V veers into some new territory IMO so I give it credit there.

    • ksec 1857 days ago
      >The woman in the demo looked somewhat real; if this is real time and reproducible in games then it's a huge step forward.

      Agreed. This is the first time ever that, for a few seconds, I had a question in my mind: was this really animated, or a graphics overlay on an actual human face? I think we are still far away, say at least another 10 years, from it being technically possible to replace human actors. Not saying it will happen, just that the technicalities of it will be solved.

      But I do wish more games were cartoonish, or used animation-style cel shading. The recent Naruto and Dragon Ball Z games look as if they were OVA-quality animation. I am not sure if the market wasn't buying it, or if that technique and level of quality are still hard / expensive to achieve.

    • tracker1 1857 days ago
      I have no knowledge of what's considered state of the art... but I'm curious if we are at a point where we could get "good enough" animation in real time, from South Park to Toy Story level. I know there's a lot of time that goes into building sets/worlds and other models, but I would love to see the option become more common.

      Local TV stations with animation shows, much like local entertainment used to be, but animated instead of fully acted live.

    • _bxg1 1857 days ago
      The parts of our brains that identify human forms have been finely honed over thousands of years; humans have always been the hardest subject to get right in any artistic medium. I think we're getting close here, though.
  • AWildC182 1857 days ago
    Can someone explain why the motion capture for this animation looked so much more realistic than it usually does? Character animation always looks over-smoothed or something to me but this is probably the first time I've seen where it doesn't have a stilted appearance.
    • baddox 1857 days ago
      It could just be that they spent an inordinate amount of time and money cleaning up the motion capture data for this demo. It's my understanding that the mo-cap in big-budget films is heavily modified by professional animators. Most hours-long video games probably don't have the budget for that.
      • AWildC182 1857 days ago
        What about the source data makes it so difficult to clean up? It feels like nothing has progressed since the days of the first ping pong ball suit even though cameras and image processing have become infinitely more capable.
        • blihp 1857 days ago
          The fact that there's so much of it and the sources are indirect and imprecise. All motion capture with actors is from the surface of their body, and then you have to reverse-engineer all that data into skeletal and facial muscle positions.[1] The techniques are very cool but not perfect, and there is error at every step of the process.[2] So then you still need tools and people to go back through the data to clean it up / smooth it out... and even then, it's still not going to be perfect. (A toy sketch of the smoothing step follows the footnotes.)

          [1] For some strange reason surgical procedures to implant hundreds of hyper-accurate mo-cap sensors in the actors or otherwise hack into their nervous system hasn't become a thing. ;-)

          [2] Camera lenses aren't perfect, dot placement (for marker based systems) isn't exact, tracking software slips since the markers aren't infinitely small and can move around a bit etc.
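
          As a flavor of that "smooth it out" step, here is a minimal sketch using made-up 1-D marker samples (real cleanup also involves outlier rejection, gap filling, and solving to a skeleton):

              // Smooth a jittery marker track with an exponential moving average.
              #include <cstdio>
              #include <vector>

              int main() {
                  // Hypothetical marker positions with capture jitter.
                  std::vector<double> raw = {0.00, 0.11, 0.19, 0.35, 0.28,
                                             0.52, 0.61, 0.58, 0.80};
                  double alpha = 0.4;   // lower = smoother but laggier
                  double smoothed = raw[0];
                  for (double sample : raw) {
                      smoothed = alpha * sample + (1.0 - alpha) * smoothed;
                      std::printf("raw=%.2f  smoothed=%.2f\n", sample, smoothed);
                  }
              }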

          • danmaz74 1857 days ago
            This looks like an area where machine learning could really help... Have you got any idea if there is progress here?
        • baddox 1857 days ago
          I can only speculate. For faces, I'm sure there's tons of manual work to be done to reduce the uncanny valley, especially when it's being mapped onto a different facial geometry (like King Kong). For body movements, perhaps the collisions and interactions with the virtual world require a lot of manual tweaking to get them looking realistic.
  • ripsawridge 1856 days ago
    "Tell us how you really feel," asked no one, ever, but it's a free country so--

    Yet another way to draw us away from actual connection with a painfully and joyously real world outside.

    Better to long to just once, enumerate and follow the subsurface scatter of light on the face of your beloved than to coo and preen over mere mechanism.

    A dying culture turns away from what's out there. Rolls over and wishes to slumber in dreams where everything is controlled and safe. Hence, the computer.

    I finally figured out what I hate about CGI the most: the animated figures convey no dismay at the weight of their bodies. It's how that dismay is coped with that injects grace into movement. The princess here looks nice enough until she moves. Then it becomes clear there is no soul within the frame that first suffers the pain of an ungainly body coping with gravity, then masters itself, and resolves to move as best it/she can anyway.

    That is beauty: the acceptance of limits. The carrying on despite them. You can see this in real people when they walk.

    Until the entire inner world is modeled, these constructions have less life than a good puppet. A puppet at least telegraphs the spark of life from the hand and soul that control its strings. That spark is constant, microsecond communication and feedback.

    Stop trying to create simulacrums. Live the life you've been given.

    [Returns to computer job. Resolves nonetheless to walk under the moon tonight].

  • mikepurvis 1857 days ago
    Alicia Vikander credited at the end, presumably for the mocap and vocal performance? It's cool how much cross-over there is these days between high performers in different disciplines of media. How many Hollywood triple-A stars would have been down to do mocap for a 90-second gaming tech demo video even ten years ago?
    • sorenjan 1857 days ago
      Also has music by Ludwig Göransson, who won an Oscar for the music in Black Panther.

      The video is made by a Swedish production company (Goodbye Kansas), and it's inspired by the art of John Bauer, one of the most recognized artists in Sweden, so maybe that helped get the A-listers interested.

  • kowdermeister 1857 days ago
    Another awesome real-time ray-tracing demo, from Crytek:

    https://www.youtube.com/watch?v=1nqhkDm2_Tw

  • JabavuAdams 1857 days ago
    And then we're going to use these awesome visuals to run around burning the forest, murdering the creatures, tea-bagging the photo-real corpses of our fallen foes, all while unable to have a good in-character conversation with an NPC, and being forced to listen to teenage voices hurling racial and sexist slurs.

    Plus ça change.

  • sigi45 1857 days ago
    The first time I read about rtx and raytracing I thought "holy shit so that's how it will happen and I will see it happening".

    The transition has started, and now it's just a question of a little bit of time.

    Really looking forward to everything happening now on that level :).

  • kraig 1857 days ago
    This looks great.

    In a few short years I'll be able to get a VR headset and install a computer inside of one of these games so I can work remote from basically any virtual world. Then i'll only need to unplug and face reality to get the Postmates.

  • m0zg 1857 days ago
    Somehow the mouth of the model still looks "weird". Maybe because humans are hardwired to pay more attention to it?
  • shmerl 1857 days ago
    DXR and Nvidia sound way too locked-in. There should be a more collaborative effort to do this through Vulkan and on all GPUs.
    • skrowl 1857 days ago
      • shmerl 1857 days ago
        It's still MS only, so a bad option.
        • esyir 1857 days ago
          Gaming on non-MS PC platforms is a pretty small niche anyway, though. I can see the appeal, but my guess is that the ratio of work to newly available market here isn't all that great.
          • shmerl 1857 days ago
            So, good effort would be to help it grow, instead of perpetuating MS lock-in.
          • saati 1856 days ago
            PlayStation is niche?
    • jimminy 1857 days ago
      AMD actually has released a real-time ray tracing library that's included in Unity, and is Vulkan compatible, called Radeon Rays. [0]

      And Crytek released a demo last week showing real-time ray traced reflections rendered on a Vega 56. [1]

      [0]: https://gpuopen.com/gaming-product/radeon-rays/

      [1]: https://www.youtube.com/watch?v=1nqhkDm2_Tw

      • shmerl 1857 days ago
        Yep, that's more like it.
    • TomVDB 1857 days ago
      If your concern is lock-in, then DXR is currently your best option: DXR works for both AMD (with Microsoft's fallback path) and Nvidia, while the only existing Vulkan extension is Nvidia proprietary.
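
      For concreteness, this is roughly what probing for that Nvidia-proprietary Vulkan extension looks like (a sketch assuming the Vulkan SDK headers and loader are present; error handling is mostly omitted and only the first GPU is checked):

          #include <vulkan/vulkan.h>
          #include <cstdio>
          #include <cstring>
          #include <vector>

          int main() {
              VkInstanceCreateInfo ici = {VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
              VkInstance instance;
              if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

              uint32_t count = 1;
              VkPhysicalDevice gpu;
              if (vkEnumeratePhysicalDevices(instance, &count, &gpu) < 0 || count == 0)
                  return 1;  // no Vulkan-capable GPU

              uint32_t n = 0;
              vkEnumerateDeviceExtensionProperties(gpu, nullptr, &n, nullptr);
              std::vector<VkExtensionProperties> exts(n);
              vkEnumerateDeviceExtensionProperties(gpu, nullptr, &n, exts.data());

              for (const auto& e : exts)  // vendor-specific, as noted above
                  if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0)
                      std::puts("VK_NV_ray_tracing is available on this GPU");
              vkDestroyInstance(instance, nullptr);
          }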

      What kind of collaboration do you expect? AMD hasn't been paying a lot of attention to its GPU business in the past few years.

      • shmerl 1857 days ago
        DXR is far from a good option, since it's a proprietary MS only API.

        > What kind of collaboration do you expect?

        Common Vulkan implementation focused on ray tracing that isn't tied to GPU (and OS naturally, since it's Vulkan).

    • tntn 1857 days ago
      How is DXR any more locked in than DX-anything else?
      • shmerl 1857 days ago
        It is pretty much the same lock-in as DX-anything, so it's just as bad as the latter.
  • aresant 1857 days ago
    Do they release these as executables?
  • craftyguy 1856 days ago
    How is this article praising nvidia products on nvidia's website not spam?
  • sagebird 1857 days ago
    The smoke that comes from the fairies is a bit deterministic for my taste -- it looks like it is being driven by a trigonometric function and doesn't have much turbulence or interaction with the local environment/surfaces.
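
    For reference, the difference I mean, as a made-up minimal example: the "sine" column below is perfectly periodic and ignores everything around it, while even a crude random kick (the simplest stand-in for turbulence) breaks that regularity.

        #include <cmath>
        #include <cstdio>
        #include <cstdlib>

        int main() {
            float turb = 0.0f;
            for (int frame = 0; frame < 10; ++frame) {
                float t = frame * 0.1f;
                float sine = 0.5f * std::sin(6.28f * t);  // deterministic wisp
                // Random-walk kick: the cheapest possible "turbulence".
                turb += 0.1f * (std::rand() / (float)RAND_MAX - 0.5f);
                std::printf("t=%.1f  sine=%+.2f  sine+turb=%+.2f\n",
                            t, sine, sine + turb);
            }
        }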
    • _bxg1 1857 days ago
      The main goal was to show off raytracing; advanced particle systems aren't a new thing.
    • eps 1857 days ago
      No wireless. Less space than Nomad. Lame.

      :)

      • darkpuma 1857 days ago
        cmdrtaco is wrongly maligned; the ipod would not become massively successful until further revisions were released.
        • cmdrtaco 1856 days ago
          Thank you for setting the record straight!
        • 0-_-0 1857 days ago
          Did they have wireless and more space than a Nomad?
          • darkpuma 1857 days ago
            Connectivity (USB, rather than firewire) and more storage were certainly among the facets improved in subsequent versions. USB isn't wireless, but with USB rather than just firewire, the lack of wireless is less of a problem for mass market adoption, isn't it? I never had a nomad, I couldn't tell you how much storage they had, but I do know that my first ipod had a lot more than a measly 5GB; it was a 5th generation with 30GB.

            ipod sales were basically fuck-all until Gen 4/5, meaning cmdrtaco was basically right for three or so years: https://en.wikipedia.org/wiki/File:Ipod_sales_per_quarter.sv...

            I don't think it a coincidence that Gen 4 and 5 were the first to fully support USB (gen 4 also supported firewire, gen 5 did not.)

            The iphone took 6 quarters to breach 5 million sales per quarter (remember that as it was originally released, the iphone was an impressive touch screen web browser with subpar data and no appstore!) The ipod took 14 quarters to breach 5 million sales per quarter. After that, both took off. My point here, is the ipod was much slower off the line than the iphone. Simply put, it was lame (until it wasn't.)

  • izzydata 1857 days ago
    Maybe I'm crazy, but I don't feel the need for such realism in video games. Ray tracing is an awesome technology for things like pre-rendered video, but real time raytracing that takes up a huge portion of the graphics card computational power seems like a waste. I'd rather see more frames per second at higher resolutions.
    • Crespyl 1857 days ago
      From discussions I've seen elsewhere and my own limited knowledge, I think a lot of the potential benefit in the near future is likely for the artists/designers more than the end users.

      Being able to quickly place lights and design materials for a scene with realistic ray-traced rendering gets believable results in simpler fashion than by having to painstakingly recreate all the details of natural lighting with existing conventional methods.

      It requires a lot more of the end-user hardware, but overall gives the artists more room to focus on things other than making sure the lighting and reflections aren't distractingly wrong.

      • fsloth 1857 days ago
        Actually, empowering artists will benefit the end users with better visuals. Some people are happy with textured polygons, but personally I enjoy cinematic art as much as any other type, and cinematic storytelling with all of its subtleties will definitely benefit from this.
      • izzydata 1857 days ago
        That sounds like a good application of this technology, if Nvidia had some kind of split product line for professionals and gamers. Kind of like what they already did with the Quadro cards.
        • mattnewport 1857 days ago
          You're missing the point: for realtime rendering it's not enough for the content creators to have this technology to get the benefits, it has to be used on the target end-user hardware too. Getting things to look good with traditional rasterization-based pipelines involves a lot of special-case hacks and tricks and workarounds for situations where they break down or don't compose. That is a lot of work for the artists to accommodate the limitations of the target platform.

          The promise of raytracing is that you can get great results without a lot of tricks and special casing which frees up the content creators to focus on the visuals they want to create without worrying so much about technical implementation details. That only really works if the target platform supports performant ray tracing as well as the content creation tools.

    • dragontamer 1857 days ago
      There's nothing "realistic" about a crown erupting in magic flames and dancing on the water on its own, however.

      But making that look good is the question: the human eye has expectations for how the fire effect is supposed to look when reflected on the water, and if the fire is to be "believable", it needs to light up the area around it.

      It's clearly a demo explicitly designed to take advantage of raytracing (a dynamically moving source of light), which is hard to emulate with traditional effects.

      --------

      The idea is that artists can build new worlds, possibly with less effort, thanks to this technology.

    • LifeLiverTransp 1857 days ago
      No... no... no... can you imagine how many tricks and hacks can be tossed out with this? Finally free. Finally not months spent on a shader for each model. Rendering does not have to be realistic, but even non-realistic rendering gets easier with raytracing.
      • izzydata 1857 days ago
        So an artist gets to spend far less time creating content by letting raytracing do a lot of the lighting work for them, and in order for the end user to benefit from this they have to sacrifice a ton of performance and drop their FPS.

        We haven't even gotten to a 4K/60fps standard in modern titles, and raytraced content will drop people to console-like frame rates.

        • LifeLiverTransp 1856 days ago
          Have you any idea what a byzantine nightmare we can cut loose with this? Just look at your download of the Nvidia drivers: every moderately popular game has a complete rewrite of its shaders in there. Some are even delivered defunct and repaired by Nvidia, the repairs added layer by layer to this gothic cathedral mothership of code.
    • truckerbill 1857 days ago
      '640kB ought to be enough for anyone.' This step towards real-time raytracing is going to happen sooner or later. This kind of progression is why we aren't stuck with Wolfenstein 3D.
      • dragontamer 1857 days ago
        I disagree.

        Raytracing has always been possible. There are GPU demos of real-time raytracing back in 2010. But gamers typically care more about higher-FPS / higher-resolutions rather than more realistic shadows and reflections.

        I think a "killer app" needs to be done. Ex: a game designed around raytracing: a stealth game using shadows to gain information around a corner (and worrying about the placement of your own shadow). That sort of thing...

        For now, game developers have this cool visual technology, but no real way to integrate it into a gameplay mechanic that players really care about.

        • tntn 1857 days ago
          > gamers typically care more about higher-FPS / higher-resolutions rather than more realistic shadows and reflections.

          It's a balance though, right?

          Programmable shading took off because lots of people wanted the better shading that it enabled (at the cost of lower framerates/resolution). So it seems plausible that people might again choose quality improvements through raytracing over framerates/resolution (especially as 4k 120fps becomes more achievable).

          • dragontamer 1857 days ago
            The majority of the demo was probably traditional rasterized graphics. Raytracing isn't "replacing" anything outside of reflections and global illumination in these realtime demos.

            Will video gamers be willing to pay $600+ for GPUs like the 2070, 2080, or 2080 Ti, for a few improved shadows and reflections / refractions?

            I think yes, but ONLY if you get a game to actually make shadows and reflections important. RTX benefits are a very subtle shift that costs a LOT of money right now.

            • blihp 1857 days ago
              Thief: The Ray-Traced Age?
        • thfuran 1857 days ago
          A fully path traced engine basically is its own killer app as far as the game design process goes.
          • dragontamer 1857 days ago
            No one is selling fully path traced engines.

            We're talking about adding RTX features on top of existing engines for a subset of calculations (reflections and global illumination).

            It would be absolutely a killer app to have a fully path-traced engine. But that's 10 years away at the minimum. A $1200 GPU can only do ~1-sample-per-pixel at video game speeds (~60fps).

            The heavy denoising and post-processing means that the majority of your video game graphics will remain rasterized. "RTX" is added on top for specific surfaces (ie: reflections on the water), or specific lights. There's simply not enough GPU power to get anything more than that right now.
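
            To see why so few samples forces all that denoising, here is a toy Monte Carlo illustration (not tied to any real renderer; the estimate's error shrinks roughly as 1/sqrt(N)):

                // "Render" one value (the mean of a random signal, true value 0.5)
                // with 1 sample vs. 64 samples per trial.
                #include <cstdio>
                #include <cstdlib>

                float sample() { return std::rand() / (float)RAND_MAX; }

                float estimate(int spp) {
                    float sum = 0.0f;
                    for (int i = 0; i < spp; ++i) sum += sample();
                    return sum / spp;
                }

                int main() {
                    std::srand(42);
                    for (int trial = 0; trial < 5; ++trial)
                        std::printf("1 spp: %.3f   64 spp: %.3f   (true 0.500)\n",
                                    estimate(1), estimate(64));
                }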

    • JabavuAdams 1857 days ago
      Or decent AI?

      EDIT> No point having high-resolution if all it does is show you the puppet strings. NTSC was pretty real, back in the day.

  • devilmoon 1857 days ago
    Am I missing something? This doesn't really seem all that great compared to what we have been shown with Ray Tracing. The fact that it's not even real time but pre-rendered makes it even less wow-ing for me
    • zaroth 1857 days ago
      A visually stunning ray-tracing demo using Unreal Engine 4.22 and powered by a single GeForce RTX 2080 Ti...

      Does this not mean it’s rendered in real-time? Otherwise what’s the point in mentioning it rendered with a single card [in an unspecified time frame]?

      • skrowl 1857 days ago
        I'm pretty sure they don't mean "Playback of this pre-rendered MP4 was powered by a single GeForce RTX 2080 Ti!"
      • MrGilbert 1857 days ago
        "a real-time technology demonstration using Unreal Engine 4.22 ray tracing"

        Directly from the description of the video at Unreal's yt channel.

        Source: https://www.youtube.com/watch?v=Qjt_MqEOcGM

      • devilmoon 1857 days ago
        I don't know; there doesn't seem to be any mention either way. If it's not pre-rendered, it is slightly better, but still, I don't understand why a packed room at GDC of all places would be wowed by this.
        • VectorLock 1857 days ago
          I think the fact that it uses Unreal Engine entails that it's rendered in real time.
    • 013a 1857 days ago
      Yeah, I could use some clarification on whether this is real-time or not. If it's real-time, then it's among the most impressive tech demos ever showcased. If it's pre-rendered, then how is it different from the films Pixar and Dreamworks put out every year? It's visually stunning, but ultimately you can make computers do anything if you're willing to sit around for a 2-week render process.
      • ska 1857 days ago
        The text suggests it is realtime; beyond that, though, it's hard to see what novelty would lead you to showcase this as a demo if it were offline.
      • sorenjan 1857 days ago
        > To drive a new generation of content creation for film, TV and games, Goodbye Kansas built “Troll” -- using no custom plugins or code -- to showcase the slate of ray-tracing features coming in Unreal Engine 4.22, demonstrating how they can be used to deliver stunning, Hollywood-quality results in real time.

        https://www.unrealengine.com/en-US/blog/troll-showcases-unre...

      • devilmoon 1857 days ago
        Even if not pre-rendered, this doesn't seem _that_ good to me, to be honest. Some of the things I've seen in BF5 with ray tracing were more impressive to me than this whole demo. E.g. when the fairies fly away, the lighting on the water seems really off to me, and even when they fly around the crown/princess it looks more weird than stunning.
      • bcaulfield 1857 days ago
        Yes, it is real time. Original post will be updated to include that information.
    • tntn 1857 days ago
    • bcaulfield 1857 days ago
      Actually, yes, it's real time
    • make3 1857 days ago
      It is supposed to be real time. For real time, it's really good.