Unreal Engine 5 fixed

holy shit tim I fricking kneel

  1. 2 weeks ago
    Anonymous

    >fricking AMD super resolution upscaled from 50% res
    what a fricking joke.

  2. 2 weeks ago
    Anonymous

    I don't know what it is about Unreal Engine, but it always looks bad to me

    • 2 weeks ago
      Anonymous

      You'll take generations of shiny plastic garbage and you'll like it

      • 2 weeks ago
        Anonymous

        This is the opposite of plastic though, the concrete looks like concrete, water looks like water, the metal looks like metal, and only the painted plastic body of the truck looks like plastic

    • 2 weeks ago
      Anonymous

      PBR has been a disaster for video game graphics. Makes everything look the god damn same.

      • 2 weeks ago
        Anonymous

        I hate it: it's so easy to swap out something like PBR in Godot through its environment and material settings, but devs with marvel/consoom brainrot don't give a shit, every fricking rock must have full Principled BSDF settings with absolutely zero art direction to actually make a pretty picture

      • 2 weeks ago
        Anonymous

        It looks like shit now and still runs terribly. The lighting instability was bad before but now it's unacceptable.

        Why is PBR the bogeyman and not the counterproductive pursuit of global illumination solutions that have led to massive regressions in shit people actually care about? PBR as a paradigm was what stopped every game from having the plastic UE3 look. It's a net positive.

        • 2 weeks ago
          Anonymous

          Because PBR is how materials react to any lighting. All the other stuff is just extra bells and whistles stacked on top of it.

          • 2 weeks ago
            Anonymous

            Exactly why it fixed the clay/plastic problem. So why do people dislike it? PBR was good. It was the pursuit of the "extra bells and whistles" that came with serious drawbacks.

            • 2 weeks ago
              Anonymous

              Because PBR is the CGI of video games.
              When used lazily it looks awful, which is 99% of use cases. The other 1% still doesn't look any better than stylized approaches anyway; it's just used because it's cheap and easy.

      • 2 weeks ago
        Anonymous

        PBR looks fine in every other engine.
        Unreal is doing something weird with the lighting and motion blur.
        I just don't understand why people love it.

        • 2 weeks ago
          Anonymous

          PBR looks bad in Unity too, because PBR is just a term for a set of specific rendering rules.
          The reason Unreal is so popular is that it's geared toward non-programmers, which is like 95% of a game's workforce nowadays. You can make an entire game in it without ever seeing a line of code.
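
          To be concrete about the "set of specific rendering rules" part: PBR just means every material is described by the same few inputs (base color, roughness, metallic) run through an energy-conserving BRDF, and the lighting does the rest. Rough single-light sketch of the textbook Lambert + GGX/Cook-Torrance form, not Unreal's or Unity's actual shader code:

          #include <algorithm>
          #include <cmath>
          #include <cstdio>

          struct Vec3 { float x, y, z; };
          static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
          static Vec3  norm(Vec3 v) { float l = std::sqrt(dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }
          static Vec3  add(Vec3 a, Vec3 b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
          static Vec3  mul(Vec3 v, float s) { return {v.x*s, v.y*s, v.z*s}; }

          // Textbook Cook-Torrance terms: GGX distribution, Smith geometry, Schlick Fresnel.
          static float D_GGX(float NoH, float rough) {
              float a2 = rough * rough * rough * rough;
              float d = NoH * NoH * (a2 - 1.0f) + 1.0f;
              return a2 / (3.14159265f * d * d);
          }
          static float G_Smith(float NoV, float NoL, float rough) {
              float k = (rough + 1.0f) * (rough + 1.0f) / 8.0f;
              auto g1 = [k](float n) { return n / (n * (1.0f - k) + k); };
              return g1(NoV) * g1(NoL);
          }
          static Vec3 F_Schlick(float VoH, Vec3 f0) {
              float f = std::pow(1.0f - VoH, 5.0f);
              return add(mul(f0, 1.0f - f), {f, f, f});
          }

          // Same formula for every material; only albedo/roughness/metallic change.
          static Vec3 shade(Vec3 N, Vec3 V, Vec3 L, Vec3 albedo, float rough, float metallic) {
              Vec3 H = norm(add(V, L));
              float NoL = std::max(dot(N, L), 0.0f), NoV = std::max(dot(N, V), 1e-4f);
              float NoH = std::max(dot(N, H), 0.0f), VoH = std::max(dot(V, H), 0.0f);
              Vec3 f0 = add(mul({0.04f, 0.04f, 0.04f}, 1.0f - metallic), mul(albedo, metallic));
              float spec = D_GGX(NoH, rough) * G_Smith(NoV, NoL, rough) / (4.0f * NoV * NoL + 1e-4f);
              Vec3 diffuse = mul(albedo, (1.0f - metallic) / 3.14159265f);
              return mul(add(diffuse, mul(F_Schlick(VoH, f0), spec)), NoL);
          }

          int main() {
              // Rough dielectric (concrete-ish) vs. smooth metal, identical light and view.
              Vec3 c1 = shade({0,0,1}, norm({0,0.5f,1}), norm({0.3f,0.3f,1}), {0.6f,0.55f,0.5f}, 0.8f, 0.0f);
              Vec3 c2 = shade({0,0,1}, norm({0,0.5f,1}), norm({0.3f,0.3f,1}), {0.9f,0.6f,0.2f}, 0.2f, 1.0f);
              std::printf("concrete %.3f %.3f %.3f | metal %.3f %.3f %.3f\n",
                          c1.x, c1.y, c1.z, c2.x, c2.y, c2.z);
          }

          Whether the result reads as concrete or as greased plastic is entirely down to what you feed it, which is why the same math looks fine in one game and awful in another.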

    • 2 weeks ago
      Anonymous

      TAA + upscaling makes games look more blurry than they need to be
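
      The blur is inherent to how temporal AA accumulates: every output pixel is a blend of the current frame (jittered, and often rendered at lower res) with reprojected history, so any mismatch gets smeared across several frames. 1-D caricature of the accumulation, with a made-up blend weight rather than any engine's real tuning:

      #include <cstdio>
      #include <vector>

      // 1-D caricature of temporal accumulation: out = lerp(history, current, alpha).
      // alpha ~0.1 is a ballpark guess here, not any particular engine's value.
      int main() {
          const int   width = 16;
          const float alpha = 0.1f;
          std::vector<float> history(width, 0.0f);

          for (int frame = 0; frame < 8; ++frame) {
              int edge = 4 + frame;                       // a hard edge moving 1 px per frame
              for (int x = 0; x < width; ++x) {
                  float current = (x < edge) ? 1.0f : 0.0f;
                  // Real TAA reprojects history with motion vectors first; skipping that
                  // here exaggerates the effect, but the smearing mechanism is the same.
                  history[x] = history[x] + alpha * (current - history[x]);
              }
          }
          // The edge that should be a 1-pixel step is now spread over several pixels.
          for (int x = 0; x < width; ++x) std::printf("%.2f ", history[x]);
          std::printf("\n");
      }

      Real implementations reproject with motion vectors and clamp or reject stale history, which reduces the smear but doesn't eliminate it, and upscaling from a lower internal res gives the filter even less detail to work with.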

    • 2 weeks ago
      Anonymous

      that's because "devs" and "artists" are lazy nerds. You can turn UE into something incredible if you stop using the dumb Unreal store assets.

    • 2 weeks ago
      Anonymous

      I agree, it looks like shit. Even worse, it plays like absolute garbage. Every single shooter made in Unreal 4 and 5 feels awful to play.

      • 2 weeks ago
        Anonymous

        >Every single shooter made in Unreal 4 and 5 feels awful to play.
        Amid Evil is fricking great.

      • 2 weeks ago
        Anonymous

        You're delusional or haven't tried any games. Even Fortnite feels better than its AAA peers like Cod, Battlefield 5, and Halo Infinite, all built on different in-house engines made for shooters.

    • 2 weeks ago
      Anonymous

      It's people who don't know how to work with it.
      Kingdom Hearts 3 was made in UE4 and looks completely different from everything else

      • 2 weeks ago
        Anonymous

        KH3 looks awful in Unreal, the Organization look like literal plastic. The Luminous engine looked infinitely better.

  3. 2 weeks ago
    Anonymous

    >4090
    >needs HALF RES TSR to hit >50
    high end PC gaming is bullshit man, I'll stick to my 1060

    • 2 weeks ago
      Anonymous

      This is why I'll never take PC gaming seriously ever again. Man, do I miss the old days.

  4. 2 weeks ago
    Anonymous

    wow that grimy ass bridge looks slightly better

  5. 2 weeks ago
    Anonymous

    i will kneel with you as well OP.

    • 2 weeks ago
      Anonymous

      FRICK OFF YOU TOOL!

  6. 2 weeks ago
    Anonymous

    Engines don't look bad, games that lazily use all the defaults for shaders and lighting and don't have good texture work do.

    • 2 weeks ago
      Anonymous

      meant for:

      >I don't know what it is about Unreal Engine, but it always looks bad to me

    • 2 weeks ago
      Anonymous

      the defaults look bad

      >You'll take generations of shiny plastic garbage and you'll like it
      it used to look far worse in the UE2-3 days
      UE4 and 5 look better, but it still doesn't look as natural as Frostbite or CryEngine 2/3 did

  7. 2 weeks ago
    Anonymous

    Fricking Fortnite, the real benchmark game for UE5, still needs multiple matches to compile its shader cache every time you update drivers, with the very first match being completely unplayable.
    I can't believe this kind of thing is still a problem, holy shit.

    • 2 weeks ago
      Anonymous

      Shader compilation stutter is the most dogshit part of every modern game nowadays, especially Japanese games that forget about it. Fricking DirectX 12.
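
      The known fix is boring: enumerate the shader/material permutations the game actually ships and build the pipeline state objects up front, on a loading screen or background threads, instead of letting the driver compile them the first time something gets drawn. Rough shape of it, with a stand-in for the real compile call (CreateGraphicsPipelineState on D3D12, vkCreateGraphicsPipelines on Vulkan):

      #include <cstdio>
      #include <future>
      #include <string>
      #include <unordered_map>
      #include <vector>

      // Stand-in for the expensive driver-side compile; assume it burns tens of ms per variant.
      struct Pipeline { std::string variant; };
      static Pipeline CompilePipeline(const std::string& variant) {
          return Pipeline{variant};
      }

      class PipelineCache {
      public:
          // Kick off every known permutation during the loading screen, off the game thread.
          void Precompile(const std::vector<std::string>& variants) {
              for (const auto& v : variants)
                  pending_[v] = std::async(std::launch::async, CompilePipeline, v);
          }
          // At draw time this is a lookup (or at worst a wait), not a fresh compile mid-frame.
          const Pipeline& Get(const std::string& variant) {
              auto it = ready_.find(variant);
              if (it == ready_.end())
                  it = ready_.emplace(variant, pending_.at(variant).get()).first;
              return it->second;
          }
      private:
          std::unordered_map<std::string, std::future<Pipeline>> pending_;
          std::unordered_map<std::string, Pipeline> ready_;
      };

      int main() {
          PipelineCache cache;
          cache.Precompile({"rock_opaque", "water_translucent", "truck_masked"});
          std::printf("drawing with %s\n", cache.Get("water_translucent").variant.c_str());
      }

      The code isn't the hard part; knowing the full permutation list ahead of time is, which is presumably why Fortnite falls back to warming the cache over a few matches instead.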

  8. 2 weeks ago
    Anonymous

    Still stutters worse than G-Man.

  9. 2 weeks ago
    Anonymous

    >UE5.4 is still fricked
    >look guise it's fixed
    ???

  10. 2 weeks ago
    Anonymous

    This means nothing because devs won't optimize for shit. Newest AAA slop will run like the left still, because devs now think they can shove more random bullshit in.

  11. 2 weeks ago
    Anonymous

    The comparison matrix section claims it's the same settings, yet the overhead lighting has weird low-resolution blooming artifacts and the character model isn't receiving any indirect lighting.

    • 2 weeks ago
      Anonymous

      Watch the whole video, it's explained.

  12. 2 weeks ago
    Anonymous

    >Foregone Destruction playing
    I'll never forgive Epic for backstabbing Unreal.

  13. 2 weeks ago
    Anonymous

    what's TSR? Is it AMD's DLSS or something?

    • 2 weeks ago
      Anonymous

      It's Temporal Super Resolution, Unreal's built-in temporal upscaler, in the same vein as DLSS but not tied to any vendor's hardware.
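
      For the numbers being thrown around: "50% res" is screen percentage, which is per axis, so the GPU is actually shading a quarter of the output pixels and TSR reconstructs the rest. Quick sanity check:

      #include <cstdio>

      int main() {
          const int   out_w = 3840, out_h = 2160;   // 4K output
          const float screen_pct = 0.50f;           // "50% res" = 50% per axis
          int in_w = int(out_w * screen_pct);
          int in_h = int(out_h * screen_pct);
          std::printf("internal: %dx%d (%.0f%% of the output pixels)\n",
                      in_w, in_h, 100.0f * (in_w * (long long)in_h) / (out_w * (long long)out_h));
          // -> internal: 1920x1080 (25% of the output pixels); TSR then reconstructs 3840x2160.
      }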

  14. 2 weeks ago
    Anonymous

    >53 FPS at 50% of 4k on a fricking 4090
    >improved to 82 FPS still 50% of 4k on a 4090
    This thing is a fricking abomination, this game engine is an utter disaster.

    • 2 weeks ago
      Anonymous

      it's ray traced

    • 2 weeks ago
      Anonymous

      M8, there are UE5 games out right now that run much better than that; they are specifically picking an unoptimized demo as an example.

      • 2 weeks ago
        Anonymous

        Which are these UE5 games that run well? I remember that shitty fantasy shooter which ran on UE5 and was an utter fricking disaster, pretty sure consoles dropped below 720p even. There was also Remnant 2 and that got like 40-something FPS at 4k with a 4090 and it didn't even use RT. I'm sure UE5 runs great if you're going to use it for a retro-inspired pixel graphics 2D game or something.

        • 2 weeks ago
          Anonymous

          Robocop.

          • 2 weeks ago
            Anonymous

            I mean, it's better than the utter disasters that other games are, but barely scraping over 60 FPS on a 4090 while still having lows under 50 FPS, and a 4080 or a 7900 XTX dropping into the 30s, still isn't particularly impressive, all without RT once again.

            • 2 weeks ago
              Anonymous

              It's 60 FPS at 4K with no upscaling in a game running on latest-gen tech though. Here's another one.

              • 2 weeks ago
                Anonymous

                Drops into the 40s on a 4090 aren't impressive performance if it's not even using "latest gen" features like RT.

                >Bullshit, that's not an Unreal-only problem, the entire generation seems to shit itself on CPU and VRAM utilization, from games running on Unreal to games running on in-house engines like the Last of Us port or Flopspoken.
                >It's like the entire industry failed to predict how much faster GPUs would develop than CPUs, but we are so far into it, there is no excuse now.
                Pretty much everything I play is absolutely murder on the GPU at 4K. In some games I could probably use 3x or even more performance than what a 4090 provides to run them at good FPS and native 4K without upscalers. I don't know how the frick you ended up with the conclusion that games are now CPU limited when we're at peak popularity of upscalers on PC, precisely because games are so GPU heavy you can no longer run them at native res. This IS often correlated with RT use, which is just extremely heavy; shit like UE5 running so poorly without RT isn't that common, at least among the games I play. I don't touch shit like Last of Us or whatever the frick, so maybe that's why.

              • 2 weeks ago
                Anonymous

                >it's not even using "latest gen" features like RT.
                It does use RT though, through Lumen.

              • 2 weeks ago
                Anonymous

                Lumen can run in a shittier, software-only mode with no actual RT and no RT hardware acceleration. According to articles I've read on those games mentioned, that is exactly what is happening and no actual RT is being used.

                >I have a 4K and a 1440p monitor so I can run really demanding games at native 1440p without stupid scaling bullshit at reasonable frames

                You will likely still need some upscaling or frame gen for the most demanding RT games, like Cyberpunk path tracing and so on even at 2560x1440.

              • 2 weeks ago
                Anonymous

                I have a 4K and a 1440p monitor so I can run really demanding games at native 1440p without stupid scaling bullshit at reasonable frames

              • 2 weeks ago
                Anonymous

                >Pretty much everything I play is absolutely murder on the GPU at 4K.
                I got a 4080 Super, which isn't that much slower, and I can't lock 60 fps due to stutter and frame drops; I get 120 fps one moment then 40 another and it pisses me off to no end.
                Upscalers can be used for a lot of things: they lower VRAM usage, so if you're bottlenecked by Nvidia cucking you with low VRAM you can try upscaling.
                And yeah, some games run great. I played Dead Island 2 recently, one of the best games ever made, and it runs great (ironically it's on Unreal Engine...), but a lot of AAA games run like ass.

  15. 2 weeks ago
    Anonymous

    >still has shader comp stutter and frametime spikes on PC
    lmaoooooooooooooo

  16. 2 weeks ago
    Anonymous

    The best thing about UE5 is having software RT to somewhat counteract Nvidia's homosexualry.
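
    Software Lumen gets away without RT cores because it doesn't trace triangle BVHs at all; it marches rays through signed distance fields on the ordinary shader ALUs. The core loop is just sphere tracing; a toy single-sphere version (not engine code, and real Lumen samples per-mesh and global distance field volumes rather than an analytic function):

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    static Vec3  add(Vec3 a, Vec3 b)    { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
    static Vec3  scale(Vec3 v, float s) { return {v.x*s, v.y*s, v.z*s}; }
    static float length(Vec3 v)         { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }

    // Toy scene: one sphere of radius 1 at z = 5. A real renderer would sample
    // distance field volume textures here instead of an analytic function.
    static float sceneSDF(Vec3 p) { return length({p.x, p.y, p.z - 5.0f}) - 1.0f; }

    // Sphere tracing: step along the ray by the distance to the nearest surface.
    // No triangles, no BVH, no RT hardware -- just ALU work, which is why it runs
    // anywhere but gives coarser results than hardware ray tracing.
    static bool traceRay(Vec3 origin, Vec3 dir, float maxDist, float* hitT) {
        float t = 0.0f;
        for (int i = 0; i < 64 && t < maxDist; ++i) {
            float d = sceneSDF(add(origin, scale(dir, t)));
            if (d < 1e-3f) { *hitT = t; return true; }
            t += d;
        }
        return false;
    }

    int main() {
        float t = 0.0f;
        if (traceRay({0, 0, 0}, {0, 0, 1}, 100.0f, &t)) std::printf("hit at t=%.2f\n", t);  // expect t ~ 4.00
        else                                            std::printf("miss\n");
    }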

  17. 2 weeks ago
    Anonymous

    >Unreal Engine 5 fixed
    Not even close.
    It's still CPU limited; you have games running at 30 fps when set to 60 fps, with 30% GPU utilization.
    It's insane. What's the point of getting a card like an RTX 4080+ when games can't even use it?

    They have no excuse. CPU clocks stopped going up and the gains moved to more cores and threads a decade ago, so why are games still limited by one thread?
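
    That 30% GPU utilization falls straight out of the frame-time math: a frame takes as long as its slower half, and when that half is one overloaded CPU thread the GPU just waits. Toy model with made-up, illustrative numbers:

    #include <algorithm>
    #include <cstdio>

    int main() {
        // Assumed, illustrative numbers: one overloaded game thread vs. a fast GPU.
        float cpu_ms = 33.0f;   // single-threaded game/render thread work per frame
        float gpu_ms = 10.0f;   // actual GPU work per frame

        float frame_ms = std::max(cpu_ms, gpu_ms);          // whichever side is slower wins
        std::printf("fps: %.0f, GPU busy: %.0f%%\n",
                    1000.0f / frame_ms, 100.0f * gpu_ms / frame_ms);
        // -> fps: 30, GPU busy: 30%. Dropping resolution only shrinks gpu_ms, so a
        //    4080/4090 changes nothing until the CPU-side work is spread across cores.
    }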

    • 2 weeks ago
      Anonymous

      >what's the point of getting a card like an RTX 4080+ when games can't even use it?
      I mean, some games certainly can; not everything is running on Unreal Shitshow 5.

      • 2 weeks ago
        Anonymous

        Bullshit, that's not an Unreal-only problem, the entire generation seems to shit itself on CPU and VRAM utilization, from games running on Unreal to games running on in-house engines like the Last of Us port or Flopspoken.
        It's like the entire industry failed to predict how much faster GPUs would develop than CPUs, but we are so far into it, there is no excuse now.

        • 2 weeks ago
          Anonymous

          Games relying more on the CPU than the GPU is a good thing because the GPU market is complete horseshit whereas a 7800X3D is perfectly affordable.

          • 2 weeks ago
            Anonymous

            Not him, but it's the complete opposite.
            You can lower the resolution or other settings and you still won't improve the game's performance if you are CPU limited. You will still get major stutters. CPUs barely get better every gen, 10-15% at best. If a game runs like shit on a 7800X3D, imagine how bad it is on an average PC.

            • 2 weeks ago
              Anonymous

              >You can lower the resolution or other settings and you still won't improve the game's performance if you are CPU limited.
              True for resolution, but not for all settings; you just can't use the same techniques you would for a GPU bottleneck.
              >CPUs barely get better every gen, 10-15% at best
              Completely wrong, Zen4 was a massive IPC improvement over Zen3. How's the GPU market in the meantime? Oh, the 4060 and the 7600 don't even beat the 3060 Ti and the 6700? The 4060 Ti and the 7700 XT are overpriced as shit? The 7800 XT is literally just a 6800 XT refresh? The 4070 Ti, 4080, 7900 XT and 7900 XTX were such dogshit value that they needed Super refreshes and price drops? The 4090 can't even be found for the fricktarded $1600 MSRP?

              Meanwhile I got my sweet ass 7800X3D on the cheap microcenter bundle and it crushes any game I throw at it. This chip sells for $350. That's the single best gaming CPU, for a price that doesn't even get you more than mediocre garbage on the GPU market.

              >stutters
              In this instance it's specifically a shader compilation issue which isn't CPU related btw.

              • 2 weeks ago
                Anonymous

                Zen4 X3D vs Zen3 X3D isn't a huge jump at all. IPC improvement is just a part of the equation; you also got higher frequency and better memory. And normal Zen4 is not even worth considering, it was barely any faster than the 5800X3D.
                Just because both AyyMD and nV fricked us over with how they named their products doesn't mean there is no jump in performance.
                Demand for ML raises the price of the whole GPU stack. 4090 vs 3090 is still a huge jump even if both were overpriced as frick.

                >In this instance it's specifically a shader compilation issue which isn't CPU related btw.
                Except there are other kinds of stutter as well, mostly associated with traversal/asset streaming and decompression. RT also puts more load on the CPU.
                Some games are fricked even beyond that.
                Average framerate isn't worth much if it's not stable.
                Nearly every new AAA open world game suffers from stuttering, and it's not only a UE issue.

              • 2 weeks ago
                Anonymous

                >Zen4 X3D vs Zen3 X3D isn't a huge jump at all. IPC improvement is just a part of the equation; you also got higher frequency and better memory.
                X3D chips respond much less to faster RAM than regular Zen does, actually.
                >normal Zen4 is not even worth considering, it was barely any faster than the 5800X3D.
                How is that not "worth considering" when the midrange 7600 is matching the 5800X3D which was the top CPU of the previous gen? Contrast that to the 4060 barely even budging performance compared to the 3060.
                >Demand for ML raises the price of whole GPU stack. 4090 vs 3090 is still huge jump even if both were overpriced as frick.
                And thus my point that the GPU market is complete bullshit right now and I'd rather have games that care more about the CPU because getting a good CPU for gaming is easy whereas the GPU market is an upsell-focused hell comparable to whale targeting in live service games.

              • 2 weeks ago
                Anonymous

                >X3D chips respond much less to faster RAM than usual Zen does actually.
                This depends on the game used to test. Some scale better with faster memory or even work better on non-X3D Zen4 due to higher frequency; it's rare, but it happens.
                Doesn't matter much because you won't be buying high-end RAM for Zen4 anyway, decent DDR5-6000 is all you need.
                >How is that not "worth considering" when the midrange 7600 is matching the 5800X3D which was the top CPU of the previous gen? Contrast that to the 4060 barely even budging performance compared to the 3060.
                It wasn't worth considering because the cost of a whole new platform wasn't worth the performance gain. At least now you can get cheaper DDR5, but good motherboards are still expensive. Whereas you could simply buy a 5800X3D and have a beast CPU even with your old-ass B350 board.
                Low-end GPUs have been a scam since the dawn of GPUs. The "4060" is trash and shouldn't even be called that to begin with. The Radeon 7600 is also a cashgrab.
                >And thus my point that the GPU market is complete bullshit right now and I'd rather have games that care more about the CPU because getting a good CPU for gaming is easy whereas the GPU market is an upsell-focused hell comparable to whale targeting in live service games.
                You are still missing the point I was making: you can easily lower graphics settings and improve your FPS in a GPU-bound scenario. There is no fix for bad CPU performance in games beyond the rare "crowd density" or draw distance setting.

              • 2 weeks ago
                Anonymous

                >It wasn't worth considering because the cost of a whole new platform wasn't worth the performance gain.
                Obviously no one is buying an AM5 mobo + RAM + 7600 to replace a fricking 5800X3D. But if you're on an older chip, you can choose to either upgrade to the 5800X3D OR go AM5, which actually isn't even that expensive right now. Microcenter has a 7700X bundle for $342. You pay a bit more, but get DDR5 RAM and a path to future CPU upgrades. At any rate this is just another example of how the CPU market mogs the GPU market: you have actual choices.
                >Low end GPU were a scam since the dawn of GPUs.
                Not really no, the 1060 was as good as the 980. It's only from Turing onward that Nvidia decided to systematically kill this market segment.
                >You are still missing the point I was making, you can easily lower the graphics settings and improve your FPS in GPU bound scenario.
                But you can't get a GPU that doesn't suck ass without getting robbed which is my point.
                >my CPU can't run this game
                Time for an upgrade, hey cool, the best chips cost a fraction of what the best GPUs cost.
                >my GPU can't run this game
                Lower settings all you want but at the end of the day you still need to actually have a GPU to begin with, and your current choices range from underperforming garbage to overpriced bullshit.

              • 2 weeks ago
                Anonymous

                >Microcenter has a 7700X bundle for $342. You pay a bit more, but get DDR5 ram and a path to future CPU upgrades.
                Not everyone lives next to a Microcenter, good for you.
                >Not really no, the 1060 was as good as the 980. It's only from Turing onward that Nvidia decided to systematically kill this market segment.
                During the Pascal era the low end was the 1030, not the 1060. You had the 1030, 1050, 1050 Ti, and then the 1060. Even the 1060 had a shit 3GB version with lower performance. And even those cards were quite expensive during the first cryptoboom.
                >But you can't get a GPU that doesn't suck ass without getting robbed which is my point.
                I just sucked AyyMD's wiener and bought a 7900 XT.
                >Time for an upgrade, hey cool, the best chips cost a fraction of what the best GPUs cost.
                Waiting for the 9800X3D; I don't upgrade without getting at least 2x the performance. It won't be cheap if Intel keeps fricking around.
                >Lower settings all you want but at the end of the day you still need to actually have a GPU to begin with, and your current choices range from underperforming garbage to overpriced bullshit.
                Yet it's still preferable to playing games where you can't do shit to fix performance because there are no CPUs fast enough.
                We are bruteforcing decades of backward compatibility baggage.

    • 2 weeks ago
      Anonymous

      >what's the point of getting a card like an RTX 4080+
      I don't understand, you literally said it's CPU bottlenecked. I always see posts like this online
      >oh this game sucks why can't I run it on my 4080??????
      >what's your CPU?
      >uh idk an i5 something but its really good and was expensive 5 years ago
      gpus aren't the magic box that makes games work

      • 2 weeks ago
        Anonymous

        the difference is that CPU base clocks don't change, and having an extra 8 threads from buying a new one doesn't help when games bottleneck on the first thread
        even OP's video shows that the best CPUs in the world bottleneck the same way midrange ones from a few years ago do

  18. 2 weeks ago
    Anonymous

    I'm so fricking glad Nintendo is releasing THE most reasonable console ever in history right as Xbox and Sony are slamming car doors with their dicks, because they're about to eat literally every fricking single third party developer who will ALWAYS prefer to develop games with more reasonable fidelity for the Switch 2, rather than deal with the mess that is "next gen gaming" (which is gimped by the Series S anyways lmao). May the AAA industry fall and may the second video game crash cause buildings to catch fire.

    FRICK DirectX12. FRICK Unreasonable graphical demands. FRICK insane AAA development cycles with nothing but contractors who get laid off while games are directed by executives and stakeholders. FRICK AAA gaming. FRICK always online FOMO shit. FRICK shit art design. FRICK marvel consoomer culture. AND FRICK SONY AND MICROSOFT. (and FRICK Nintendo too, they can get some shit as well)

  19. 2 weeks ago
    Anonymous

    People will never get it.
    Performance gains are absolutely pointless because developers will use said gains to cover further bloat and laziness, and then you are back at square one.
