I hate Tim Chink Sellout so much.
Ah yes more uninformed twats sharing their opinion of video game development with their zero years of experience
And more devs releasing Unreal 5 games with zero optimization.
sheeesh
That sounds like a developer problem.
Name literally one game, homosexual.
Matrix Tech Demo
Immortals of Aveum
And that's an engine problem why? UE5 looks gorgeous.
Just because your shitty pc can't run it doesn't mean it's so poorly optimized. Plus they need time to become experienced with it you moron.
>Just because your shitty pc can't run it doesn't mean it's so poorly optimized.
Not the guy you're replying to but if these games aren't poorly optimized, no game is unoptimized.
What do you call a game that doesn't really look better and doesn't have better AI, simulation, interactivity or complexity than games that came out years prior, yet runs ridiculously badly?
If a game's performance isn't proportional to what it's offering the user it's just unoptimized.
>there are actual unreal engine shills in the thread
So you're saying the devs need more time to optimize their unoptimized game
>Game running at 45 fps because UE5
>Character runs into next room
>300 ms stutter as shaders compile for new room
>Back to 45 fps because UE5
>Character casts new spell
>200 ms stutter as shaders compile for new casting effect
>back to 45 fps because UE5
>Textures look blurry because your card only has 16 GB of VRAM instead of 24 GB.
>CPU running at 95 C because UE5 does everything on the CPU.
>Devs require DLSS or FSR because UE5
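For context, that stutter pattern is first-use shader/PSO compilation, and UE has shipped a mitigation for it for a while now: the shader pipeline cache, which records pipeline states on one run and pre-compiles them on later runs instead of at first draw. A rough sketch of the relevant project config, going from memory, so treat the exact names as assumptions and check the docs:

    ; DefaultEngine.ini - sketch only, not a drop-in fix
    [SystemSettings]
    r.ShaderPipelineCache.Enabled=1   ; replay a recorded PSO cache so pipelines compile up front
    r.CreateShadersOnLoad=1           ; create shaders when assets load rather than on first use

The catch is that someone still has to record and ship that cache (or warm it on a loading screen), which is exactly the step that keeps getting skipped.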
UE4 has an inbuilt upscaler that's about equal to FSR or even Intel's XeSS
but you wouldn't know that, being a moron. 2500K never obsolete, right?
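For anyone who wants to check that claim: the built-in upscaler being referred to is UE4's temporal AA upsampling (TSR is the UE5 equivalent), and it's driven by screen percentage rather than a vendor plugin. Rough console-variable sketch, from memory, so the exact names and values are assumptions:

    ; UE4: temporal upsampling (TAAU)
    r.TemporalAA.Upsampling=1
    r.ScreenPercentage=67          ; render at ~67% of output res and upscale
    ; UE5: Temporal Super Resolution (TSR)
    r.AntiAliasingMethod=4         ; 4 should select TSR

Whether it genuinely matches FSR 2 or XeSS is scene-dependent, but it is there out of the box.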
Ah yes, the pseud who's never made a game anyone has played or liked, thinking he matters
here's your game engine bro
How does it? It brought tons of actually useful innovation.
Blame the nu-game devs we've got thanks to decades of horrible work conditions and clueless management. Literally no reason to go into game dev nowadays if you're remotely competent in programming.
Blame the Black folk who live through buzzwords and cannot make a game without putting all the new fancy UE5 features in it.
Blame the Black folk who can't read documentation and change the default settings of their project.
Blame the Black folk who can't work without delegating optimization to someone else; in this case to AMD/Nvidia/Intel.
Most games don't need nanite, lumen or anything of the sort, especially linear games with static scenes. Lumen/RT is built for dynamic scenes, the antithesis of static ones; just use baked lighting for frick's sake.
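To be fair to that point, turning the heavy defaults off is a handful of project settings, not an engineering project. A rough DefaultEngine.ini sketch of what "just use baked lighting" means in a UE5 project; the variable names are from memory, so treat them as assumptions rather than gospel:

    [/Script/Engine.RendererSettings]
    r.AllowStaticLighting=True            ; keep lightmap baking available
    r.DynamicGlobalIlluminationMethod=0   ; no Lumen GI, bake it instead
    r.ReflectionMethod=2                  ; screen-space reflections instead of Lumen reflections
    r.Shadow.Virtual.Enable=0             ; regular shadow maps instead of virtual shadow maps

Then build lighting the way UE4 games did. None of that requires touching engine code.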
>just use baked lighting for frick's sake
Digital Foundry will crucify you.
Digital homosexuals.
>Digital foundREEEEEE
>Most games don't need nanite, lumen or anything of the sort
Then why does this engine have them? You just know devs will prioritize graphics over performance.
>Then why does this engine have them?
This post is either bait or mental moronation.
Why wouldn't they prioritize graphics? That's literally the point of moving forward with a new engine. Stop being poor and upgrade your pc or shut the frick up.
Works on my machine
Red Engine was the best engine ever, followed by Snowdrop and Frostbite.
Eat my ass, you know I'm right.
Also, CryEngine gets an honorable mention because of Ryse: Son of Rome.
All of you can suck my veiny wiener.
Actually a decent opinion, which I didn't expect from the usual dumbasses who hate on UE 'cause "epic bad". Frostbite is God-tier IMO when used correctly. Battlefield 1/5 and Battlefront II are still such jaw-droppingly good-looking games and they run so smoothly too. Shame nu-DICE has no idea how to actually use the engine.
I was literally being a frickhead and just randomly naming engines, but if you honestly approve then fine.
I was *this* close to saying Unity just to flame hard but I thought I'd be a little bit more subtle.
CryEngine is just too hard to work with.
>Red Engine
Kek. It's ubishit tier.
>Frostbite
>CryEngine
Garbage performance.
>Snowdrop
Decent graphics but nothing else to offer.
The best engine title belongs to:
>open world: Avalanche 2.0
>environmental destruction: Geo-Mod 2.0
>linear game: idTech 7
>sandbox RPG: Creation Engine, unironically
>physics: Euphoria, Bugbear engine
It makes no sense to list physics engines next to game engines.
A physics engine is one of many middlewares used within a game engine. Physics engines do not make games work on their own.
Also, Havok Physics is still the go-to physics engine.
>RDR2 ran like shit on 4-5 year old hardware, just like how we're complaining that RTX 2000 cards, 4 years old, run like shit
>Developer problem not optimising their game
>User problem thinking they can play 4k Ultra on a 1060
You know Epic have fixed shader stutter, right? Only level streaming/asset loading (almost unavoidable) or developer incompetence causes FPS stutter.
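Presumably that refers to automatic PSO precaching, which landed around UE 5.1 and compiles likely pipeline states while assets load instead of on first draw. A one-line sketch, from memory, so take the exact cvar name as an assumption:

    r.PSOPrecaching=1    ; UE 5.1+ automatic PSO precaching

It cuts first-use hitches a lot, but traversal/streaming stutter is a separate problem, so "fixed" is doing some heavy lifting there.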
Because it's not about developer laziness, it's about how to move games to the next level visually. "More" is the only way, and these two technologies allow this. Whilst demanding, if properly optimised, those technologies can run on low-end hardware.
Otherwise we will never move beyond the material quality of The Order: 1886, for example.
>RDR2 ran like shit on 4-5 year old hardware
It ran 30 fps on a GTX 750 Ti, a low-end 30W card that was made 5 years before the release of the game.
>User problem thinking they can play 4k Ultra on a 1060
It runs sub 40 fps on a 1060 at 720p low.
>it's about how to move games to the next level visually
Why would you move games to the next level when the hardware clearly can't handle it? The RTX 4000 series is already built on 4nm nodes; we can't get better than this. Moore's law is literally dead.
>It ran 30 fps on a GTX 750 Ti, a low-end 30W card that was made 5 years before the release of the game.
It runs at 30 FPS on 1080p low with FSR Performance, aka 540p*
>It ran 30 fps on a GTX 750 Ti, a low-end 30W card that was made 5 years before the release of the game.
It barely held a stable 25 FPS in the open world at 1080p, let alone 30 FPS. In towns it was crapping out at 15. If you want to move the goalposts, the 1060 runs Immortals similarly at 1080p, albeit as an older card relative to its game's release.
>It runs sub 40 fps on a 1060 at 720p low.
Just like how your 750 Ti ran RDR2. The 1060 is a 7-year-old GPU, ffs. Move on.
>Hardware cannot handle it
Slightly true. It's both incompetence from developers not optimising and the demanding nature of the software.
>Moore's law is literally dead
That's why we have upscalers. What I don't like is when developers are too lazy and just rely on upscalers.
Then don't use a 1060. It's so fricking old. What do you expect?
Can you provide publicly available source code or an SDK for these engines?
If not, you can shut the frick up.
Red Engine was complete shit.
for me, it's 2D jiggling breasts in Unity.
What's wrong with it? Considering the average performance of games made on it, next to un*ty it's fricking perfect.
>*holds back gaming
Explain Northern Journey then
So, what's the next UE5 game?
Thus far the only new games using UE5, Remnant II and Immortals of Aveum, run like absolute ass despite not looking impressive in any way. Let's hope this doesn't become a trend.
STALKER 2 and Avowed aside, mostly AA slop from no name studios with wholly unoriginal gameplay. There's not much to be excited for. AAA devs tend to use in-house engines.
The next "Full" UE5 game is Lords of the Fallen, which, runs miles better than Remnant II or Immortals, but still will make Ganker seethe that their 1060 is obsolete.
>which runs miles better
Where did you play the game
I want an end to LODs and I'm willing to sacrifice every poorgay to do it.
I hate unreal engine so much it's unreal 1998
Devs aren't forced to use it; blame developers for making shitty decisions. So many studios that could competently build their own engine just use Unreal, and you don't save time using Unreal unless you are a very small studio.
The only thing holding back gaming is brown people like you who think an i5 and a 1060 should run anything at 60 fps.
it should you absolute stupid fricking moron
>holds back white males
You homosexuals have been saying this since the RenderWare engine and you've been wrong every time.