BLOODY BENCHOD BASTERD DELETED THIS THRED RIGHT NOW
EPIC GEME GOOD
EPIC GEME STORE GOOD
UNREL ENGENE GOOD
TIM-SAMA SUPERPOOWER STEME BAD DESTROYER
When are people going to stop calling them "next-gen"?
When people actually own them en masse
Nah, current gen is always "next gen" until people start speculating about the actual next gen, at which point it becomes current gen.
When PS6 comes out
When will they start getting games? This doesn't even feel like a new generation.
Most new games come out on both old gen and current gen. PS5 has no games. Xbox has no games.
PS5 feels like a more fleshed out version of PS4 Pro at this point, same thing with Xbox Series (they even have the same UI), they don't feel like new consoles.
This is one of them
"Next-gen" is just an excuse not to optimize the game. You don't have to worry about a PS4, so now the baseline is the PS5 and you can push out these unoptimized turds that barely look any better (sometimes worse) than games released years ago and run like absolute shit
Gotham Knights, Redfall, Forspoken, FF XVI, Remnant 2, Immortals of Aveum, the list goes on
Maybe snoy first party devs will actually optimize their games, but third parties are just spamming next gen slop after next gen slop
They never will feel like new consoles again. We will never go from Super Nintendo to Nintendo 64 again. Now it's just a whole lot of squinting and Digital Foundry reviews to convince me that yes, that shadow does look better than it did before
Modern consoles are just PC hardware with custom OS. PC won.
When I can walk into my local store in America and actually see them, not a placeholder sign telling me to order online. The Switch is in stock by the bunch right now; the PS4 and Xbox One were in stock, sitting on the shelf for anyone walking by to purchase, when it was this long after their release dates.
there hasn't been a stock shortage on Series S/X or PS5 in months
you CAN walk into a store and buy one off a shelf
Because the generation still hasn't gotten fully off the ground even nearly three years in.
Wrong, Fortnite runs at 4K or at 1440p/120Hz
>First party UE5 game runs well
Crazy.
Holy what the frick, I said in another thread that it's 720p as a joke, I didn't think it was actually 720p upscaled LOL
Makes me wonder now how PCs are running this at the same settings.
No it doesn't, you dumbfrick. To hit 60fps, the PS5 needs to res-scale Fortnite heavily too. IIRC it's basically a little more than 1080p upscaled to 4K with TSR. It's barely better than this.
>It's barely better than this.
nah, it's much better than this. immortals doesn't even come close to a constant 60 fps in gameplay
Yeah fair enough, but it is just Fortnite, and the consoles are still struggling.
On console this game looks like an early gen PS4 game.
Impressive. How does Unreal do it?
>720p upscale to 4K (a resolution 9 times higher)
wtf I thought that was just a meme??
This actually happens now? They just use FSR ultra performance and call it a day?
Yes. We haven't moved on from 720p since the Xbox 360 era. Devs are re-embracing it.
Well, there was a video of Control upscaled with DLSS from a ridiculously low resolution, something like 120p, and it somewhat worked
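For reference, the "9 times" figure quoted above is straight pixel arithmetic; a quick sanity check (plain Python, using only the two resolutions named in the thread):

native_4k = 3840 * 2160     # 8,294,400 pixels on the display
internal_720p = 1280 * 720  # 921,600 pixels actually rendered
print(native_4k / internal_720p)  # 9.0: the upscaler invents 8 of every 9 pixels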
PS5 and XSX are not next-gen anymore and haven't been since 2020.
This has been debunked, actually.
A console is only current gen when the previous one has been completely dropped by game devs.
In 2021 and 2022 virtually all major games came out for PS4. In 2023 (and next year) there are fewer, but still plenty - Street Fighter VI, Atomic Heart, Diablo IV, Resident Evil 4, Dead Island 2, CoD MWIII, etc.
2023 and 2024 are the last years of relevance for the PS4 and the first big years for the PS5. PS5 should only be current gen in 2025.
Console transitions are longer and longer
>A console is only current gen when the previous one has been completely dropped by game devs.
By that logic all of these iPhones are current-gen.
maybe they are, maybe they aren't
phones aren't consoles, you don't buy a new console every year
Jesus that is better than android lol, my phone already got dropped support
In terms of gaming, currently the iPhone 11 with the A13 Bionic is considered the minimum for newer games like the new Assassin's Creed Jade. And for Android phones it's ones with a Snapdragon 865 or better.
>1080p is outdated
>what about upscaling to 4k from 720p
Like I said before, the free GPU horsepower could be spent on higher resolution or on RTX, but not both, and golems fell for the 2 memes simultaneously
>PS2 - 480i
>PS3 - 720p
>PS4 - 1080p
>PS5 - 720p (upscaled to 4k)
Progress
1080p is the endgame
A shame 1440p gets overlooked. I found it a nice upgrade from 1080 but without raping frames like 4k
No, 1440p doesn't fit the dimensions of regular people's monitors, nor does it divide between 4K and 1080p nicely. It's pretty much an abomination
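The divisibility complaint checks out per axis; a quick look (plain Python, standard resolution heights only):

print(2160 / 1080)  # 2.0: 1080p maps onto a 4K panel as clean 2x2 pixel blocks
print(2160 / 1440)  # 1.5: 1440p content lands between 4K pixels
print(1440 / 1080)  # ~1.33: 1080p content lands between 1440p pixels too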
>Here is your $1,500 video card
You do realize you're supposed to use DLSS right? It's right there in the system requirements.
This is a revolutionary cutting edge next-gen game. It wasn't meant to run on your gtx 1050 ti poorgay
>You do realize you're supposed to use DLSS right?
Seething pc fat
>he drank the dlss kool-aid
DLSS Performance at 1080p is 960x540. At some point it makes no sense, when RTX 3080s need to run at PS Vita handheld resolution to get 60fps.
it looks better than native 1080p THOUGH
Did you know that all modern games scale their lighting, textures, and more based on resolution? So DLSS = lower settings, even if you're playing on ultra.
Sounds like an optimization issue, not a DLSS issue
If there's an optimization issue, then DLSS is there to paper over it, and that's the issue in itself. You can't get real ultra settings as long as games rely on DLSS.
DLSS performance at 1440p would look a lot better than this on consoles. FSR2 looks horrible at lower resolutions, DLSS does a much better job.
>you're supposed to use DLSS right?
No, its a crutch modern developers have to use to get above 60 with the UE5 slop graphics.
>You do realize you're supposed to use DLSS right?
So you are saying you need upscaling and fake frames on your $3k gaming rig?
>Buying a high end graphics card to use upscaling hacks
You are the problem.
DLSS is legitimately a fricking handicap on development teams to give them an excuse to not optimize their fricking games. Now the onus is on the consumer to just "have the latest hardware bro" so you can get 40fps out of max settings on the most expensive card on the market. Frick you and frick lazy ass developers. If a game requires DLSS to achieve a decent frame rate (see 60 or above), I won't be playing it.
>You do realize you're supposed to use DLSS right?
I think this chart's numbers were taken with DLSS on. The game is too damn taxing and definitely not worth your time or your money
>You do realize you're supposed to use DLSS right?
No game should need upscaling tech to run at resolutions that were meant to be the native norm 10+ years ago.
dlss is cancer for videogame industry
it's a lot better than FSR2 lol
same shit you know what he means
devs got this huge time saving tool that allows them to skip optimizing their games and just slap dlss/fsr and say that it runs well
7900 XTX is only $900
AMD always wins baby
> only
FSR2 looks pretty bad in this game though; there are so many spells, and FSR2 breaks in motion/breaks easily with effects.
I don't know why the devs didn't use the slightly better TSR here on consoles.
I got a 3070 and completely gave up trying to get the latest tech when I saw it couldn't even maintain 144fps at 1440p in Apex Legends. I thought that would have been a decent graphics card to jump to from console to truly feel like the master race, but no, it struggles with a ton of games. And I feel even more annoyed that DLSS is carrying everything. Massive price spikes but not proportional performance unless you use the AI crutch. So was my graphics card actually garbage all along, with all these devs just using AI to mask the weakness?
Back to the point - I'm not upgrading my rig unless I need to for work or some shit. If things break, I'm just going to stop playing vidya. I ain't spending any money on graphics cards again. That's how bitter I feel about this lol
>4070 is a 1440p ca-ACK
>RX 6800 XT outperforming a fricking 3090
nvidiBlack folk owned
This is why nvidia spends tens of millions forcing developers to use their gameworks and ray tracing implementation in their games.
3090 beat by a 6800 lmao
>436p on the series S
jesus christ I feel bad for people who actually try to play games on that console
It's a good console if you don't care for muh graphics goyslop games
>it's a good console if you only play 2d platformers
at that point just get a used xbone/ps4 and save your money
Why???
The Series S has all the next-gen features, such as Quick Resume, and all the older games from previous generations run much better.
Most modern games run decently on it; sure, there's stuff like Baldur's Gate 3, but that's an extremely small proportion.
Realistically it's not any worse than PS5/XSX
436p on a 1080p TV (what you're supposed to use with a Series S) looks dreadful, but proportionally 720p on a 4K TV looks pretty disgusting too.
It's retro-like.
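To put numbers on the "proportionally" point: per axis, the internal image covers this fraction of each panel's native height (plain Python, using only the figures from the post above):

print(436 / 1080)  # ~0.404: the Series S case on a 1080p TV
print(720 / 2160)  # ~0.333: the PS5/XSX case on a 4K TV
# Relative to its own screen, the 720p-on-4K case is actually the bigger stretch.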
>Unreal Engine 5
So what will games look like in 5 years?
480p PS5/XSX, 360p Series S?
The game itself is a piece of shit even on high-end PCs: a 4090/7900 XTX gets barely 40 fps at 4K. That's a joke for a GPU over $1000.
> previous gen flagships can't get 60fps at 1440p
My god.
>your brand new 4k card is now your 1440p card
>your brand new 1440p card is now your 1080p card
hopefully you didn't waste any money on an expensive monitor Ganker
>your brand new 1080p card is your new 720p card
Hehe
the absolute state of western vidya
Is this with or without raytracing? As soon as I turn off RT in Star Wars my FPS skyrockets.
Considering that the RGB blobs of light don't seem to illuminate the area in RGB, I would say without
There's no raytracing in the game. It would probably be 360p on consoles if they tried it.
Basically yes, it uses UE5's own version of RT (to keep things simple), and there's no way to turn it off, even on PC.
It's all raytraced technically.
I was playing Remnant 2, looks like utter dogshit fyi and runs like crap.
>UE5 runs
UE5 runs fine, devs are being fricking moronic and putting unoptimized meshes with like 8 million polygons each everywhere and expecting it to just work.
The game is essentially running in FSR2 Ultra Performance mode. What a joke.
>2023
>the latest games on the latest consoles run at 720p 44 fps
There's nothing wrong with the consoles
A PS5 had much better hardware for 2020 than the PS4 had for 2013.
Developers nowadays are just lazy and/or incompetent.
Ugliest game I've seen, shit looks like a marvel movie
Wow, look at those cool white particle effects! Upscaling artifacts? Wrong, it looks just as good as native. These are just magic particles that randomly appear when you're moving the camera, very important part of the game's story.
Can someone make a webm of how blurry it gets when you change weapon in game
>EA puts out a bad game
>People blame the engine
>"Realistic" games look soulless and manufactured
>"Realistic" games are poorly optimized, can barely run on cutting edge PCs
>"Realistic" games bankrupt developers, encourage more sequels and mediocrity
>"Realistic" games take up 100GB+ of space in 2023
>"Realistic" games cost more to make up the difference, $70 + microtransactions
Why do we accept this?
Are lazy/incompetent devs the issue or is it Unreal Engine 5? UE5 is pumping out this slop that looks like your average PS4/XBone game but barely runs on a PS5/Series X? It makes no sense, at that point just use a different engine.
It's both
UE5 without Lumen and Nanite should be as fast as UE4, but the whole point of UE5 is all the work they did to implement this perfect geometry and lighting system.
But it's a meme, way too ambitious for today's hardware.
Even if we had the hardware for it (and we don't; not even a 4090 can do 4K/60 in this game), I genuinely don't think it's worth it. The lighting/geometry provided by Lumen and Nanite cause a massive performance hit with very little benefit. Even if the hardware could run it, you'd be better off using its power on something else.
I'm gonna be honest here, I'm getting a bit nervous that we're going to see a ton of slop that barely runs on consoles in the next few years - despite looking no better than PS4 games.
And Series S is kill if this is the road we go down too.
>UE5 without Lumen and Nanite should be as fast as UE4
It isn't; UE5 without the new features is slower than the last version of UE4, and that doesn't factor in stuff you can't opt out of, like the new Chaos physics, which runs worse than the old PhysX did.
Not him but all that means is UE5 was pushed out too early just like what happened with UE3.
UE4 when it launched was also a hot mess; it wasn't until somewhere around 4.5-4.8 that the community settled on UE4 being "production ready". We are just repeating the same mistake again.
Performance is worse, and on top of that most of the new tools are in shit shape, with terrible documentation and riddled with bugs; trying to work with the engine as it is now makes me want to jump off a bridge.
UE5 and FSR gave them enough reasons to be lazy and not optimize their shit anymore.
You know there will be enough morons defending this shit. Welcome to the next gen.
Lumen and Nanite look great in Fortnite, well worth the fps cost, but the other UE5 games I've seen look so shit I don't see the benefit
I thought UE5 was supposed to have this super cool revolutionary rendering system that effortlessly juggles a trillion polygons simultaneously and requires no optimization, yet somehow games made with it just rape all hardware
>yet somehow games made with it just rape all hardware
and none of them look anywhere near as good as the tech demos
those that do I imagine will be running sub-480p on PS5 and Series X
Series S will run at 120p kek
The crazy thing is, games that could run on PS4, with better resolution and performance, looked better than this. A literal downgrade with next-gen is a new low for gaming.
>The crazy thing is, games that could run on PS4, with better resolution and performance, looked better than this. A literal downgrade with next-gen is a new low for gaming.
That's the best way I've seen it put lmao, not only are console-sisters no longer getting mind blowing upgrades with each generation, they're actually getting downgrades.
> looked better than this.
No, not really; the visuals never really came close, even in Red Dead 2
You don't think Red Dead 2 looks better than this game? I strongly disagree.
No, especially the lighting, geometry, and the amount of things actually going on in the map. Maxed out on a 4090 the game looks pretty impressive; too bad it barely runs on most systems due to UE5 being too taxing. In a sense games can't really look much better, because the performance bump isn't big enough for the increased resolutions people want, 1440p and 4K. GPUs really need to get better, and 1080p should have been the standard for a bit longer.
Just because there's more things going on doesn't mean it looks better. The combat in particular looks like a complete fricking mess.
Have you seen console footage for Immortals of Aveum?
> Have you seen console footage for Immortals of Aveum?
Yeah, 720p is bad, but even so, it looks more impressive
doesn't look any better to me
Still think RDR 2 looks better, even on that screenshot that heavily favours IoA.
How specifically are we defining "looks better" here? Cause I could name a dozen SNES games that, aesthetically, "look better" than Aveum. But even from those tiny screenshots I can see RDR2's textures and lighting not measuring up to Aveum
By aesthetics of course. If you're pushing the hardware to such a point that you're getting 720p on the premium consoles, you damn well better be doing something special with it, or else you're just a laughing stock.
>Phoneposter trying to judge image quality
>MMXXIII AD
>still unbeatable
This is their way of forcing you to buy mid-gen upgrades
How so? Even a 4090 struggles with this shit and there's no way a mid-gen upgrade will be better than that.
Besides
>buy better hardware
>devs optimize their games even less
>still playing total slop
FSR3 can't come fast enough.
Not going to solve shit on consoles
the response to FSR3 is so bizarre to me
when DLSS3 came out a lot of people discredited it because there's a latency penalty and at low FPS it looks and feels bad. Now I've heard people saying they can't wait to use FSR3 on their Steam Deck or consoles, systems that end up running games closer to 30 fps than 60 fps
FSR3 is open source, that is all. It's not limited to one hardware manufacturer.
open source isn't a silver bullet for anything, it doesn't matter if a software solution is open source if it's not an effective solution for a problem
FSR3 is not magic; people hyping up FSR3 are misunderstanding how the technology works
For example, FSR3 uses async compute, which means that if a game also uses async compute, FSR3 will perform worse than DLSS 3 in the same scenario, since DLSS 3 uses optical flow accelerators to generate the fake frames
The other downside to using async compute is that old-ass GPUs like Pascal have terrible async compute compared to newer GPUs, which is why FSR3 won't work on cards like the 1060 or older.
There is also Anti-Lag+, only on RDNA3. I guess something in the RDNA3 hardware allows the kind of latency reduction Nvidia pulls off. I think FSR3 is going to be kinda unusable on non-RDNA3 cards unless it's a turn-based game or something; saying it works on everything is just marketing.
I always turn off anti lag. It kills my experience when it's on. Maybe it's just me though.
>the other downside to using async compute is that old ass GPUS like pascal have terrible async compute compared to newer GPUs
That's on Nvidia, older AMD GPUs have much better async compute compared to their Nvidia counterparts.
Personally I have no interest in any fake frame bullshit that fricks with latency, but FSR makes more sense since it works on the GPUs that SHOULD need to use upscaling, I'm not buying a brand new $600+ GPU to use fricking upscaling, that should be reserved for people on Vega 64s and GTX 1080s who will actually need upscaling to get decent performance in new games.
I can't even imagine using something like frame generation with a 30 fps base frame rate
hell, even if it's higher you still shouldn't use it unless you're desperate for visual smoothness and it's a relaxed single player game
A lot of DLSS hate is just sour grapes tbh.
Framegen is just overhyped and only good for some games, it's no wonder people were shitting on it when Nvidia is selling it as a replacement for actual performance/VRAM improvements.
>all the meme 4k 120fps marketing for a literal decade
>just for that
I wasted my life on a shit hobby.
Game optimisation just keeps getting worse, but for once it's actually affecting consoles as well, and that's hilarious
It's funny because before this review, the German DF guy talked about how proud the devs are of what they did. They rewrote Unreal's material system to eliminate shader compilation, and they didn't use (Temporal) Super Resolution because it was more costly than FSR.
The lower the res the better. 720p is still way too much. Better to have better graphics or framerate than more pixels. Anyone saying different, e.g. Digital Foundry, doesn't know what they're talking about and is very stupid.
this is a moronic take that's not even worth explaining why it's moronic
Like I said: stupid. Resolution increase kills quality. It forces artists to do less. They are left with many times less performance to fill with art. Games cease to be impressive, generations look the same. It is not an arguable point, it is a fact.
>Like I said: stupid. Resolution increase kills quality. It forces artists to do less. They are left with many times less performance to fill with art. Games cease to be impressive, generations look the same. It is not an arguable point, it is a fact.
They only look the same if you are for example looking at 4k footage on a 22" 1080p screen instead of going to a proper large 4k display...
Dropped UE5 when it was struggling to run a simple test map with like 3 characters and a bunch of simple-shaped meshes. There wasn't even some complex script running in the background. It's 90% the lighting; when I turned it off, it ran fine.
> dropped UE5 when it was struggling to run a simple test map with like 3 characters and a bunch of simple shaped meshes
The performance loss from UE4 to UE5 is wild. I barely get 20fps on the starter map, and anything remotely complex is like 17-ish fps. My laptop is crap, but on UE4 it ran a lot better.
Unity runs fine on my laptop though.
Beat EPIC personnel with a stick.
>720p upscaled to 2160p
The future of gaming, thanks israelitevidia
>not blaming Epic for spreading its cancerous engine
moron
I blame Nvidia for popularizing upscaling shit with DLSS.
There is nothing wrong with making low res even more playable. Nvidia should make it an option you can force in all games. Same with AMD's FSR. It would allow people to choose what they prefer:
Graphics
Resolution
or
Wasting performance on pixels. For pedophiles who have been victimised by parasites like digital foundry.
Low res as standard means fewer downgrades. It is that simple. Playable bullshots. There is no way that is bad. People whinging about it are invalid fools.
you dont actually believe this you're just pretending to for attention
Blame PC users whose first realisation with tweaking is that lowering the screen res comes first. Nvidia used to block the showing of supersampling because it was a waste of performance. Maybe idiot proofing wasn't so bad after all.
PS4 was all about checkerboard rendering and dynamic resolution. So DLSS was just an Nvidia-exclusive feature. We lost true resolution in that generation
Imagine hiring Michael Kirkbride to write for such a turd of a game.
DLSS is good IF THE BASE GAME IS AT LEAST 4K 60FPS
Look at retro poorgays get excited about 4k images resized to 720p for comparison with actual 720p images. You can't compare a 3840 x 2160 to a 1280 x 720 image by shrinking the 4k image to 1280x720 and putting them side by side, smoothbrains.
I have tried using a 1080p desktop wallpaper on 4K screens, but it looked horrible. Of course a 4K image shrunk to 1080p will look fine on old, small screens.
people fundamentally misunderstand resolution
resolution is just a means of getting ppi, that's all
you can have a resolution that looks excellent at a given size and a resolution with 1/4 of the pixel count that looks excellent at a much smaller size
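The PPI point in numbers; a minimal sketch (the panel sizes below are illustrative picks, not from the thread):

import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch = diagonal pixel count / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(1280, 720, 24))     # ~61 ppi: small 720p TV
print(ppi(3840, 2160, 65))    # ~68 ppi: big living-room 4K TV, barely denser
print(ppi(1920, 1080, 15.6))  # ~141 ppi: a 1080p laptop panel beats both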
yeah your desktop wallpaper looks like shit on your (presumably) large desktop 4K monitor
Now try it on a laptop.
And do keep in mind that scaling is an issue as well. Even if you have a display with the perfect size, a resolution may not scale well to another - for example, raw 720p will always look like shit on a 1080p screen because it doesn't scale properly
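The "doesn't scale properly" part is the non-integer ratio; same quick test (plain Python):

print(1080 / 720)  # 1.5: every other 720p row has to smear across two screen rows
print(2160 / 720)  # 3.0: on a 4K panel, raw 720p at least maps to clean 3x3 blocks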
A lot of people have the wrong idea about certain resolutions
>le 720p looks like shit!1!!
Yeah yeah, but have you tried a small 24-inch 720p TV at an appropriate distance? It's excellent
>le 1080p looks like shit!1!!
Yeah yeah, but have you tried using a laptop?
just some examples
I've been waiting for FF7 Rebirth to get a PS5. At this stage does it seem like I might as well wait for the PS5 Pro?
>nanite
>loads every single asset into memory at high mesh resolution but it does its magic to render it in real time
I'm not an expert, but this can't be good, right?
No, not if you use the SSD as a cache, for example
No, it's a huge performance loss for a small aesthetics gain.
>small aesthetic gain
[embedded YouTube video link, t=100s]
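For what it's worth, the usual defense of Nanite-style rendering is that the full-res data only has to be resident when a screen-space error test demands it, which is why the SSD-as-cache reply above matters. A minimal sketch of that textbook test, not Epic's actual code; every name and number here is made up for illustration:

def choose_lod(lod_errors_world, distance, screen_scale, budget_px=1.0):
    # lod_errors_world: geometric error per LOD in world units, coarsest first.
    # A LOD is acceptable once its error projects to under ~1 pixel on screen,
    # so distant meshes never need their fine data loaded at all.
    for lod, err in enumerate(lod_errors_world):
        if err * screen_scale / distance <= budget_px:
            return lod  # coarsest LOD that still looks correct
    return len(lod_errors_world) - 1  # nearest objects fall back to the finest LOD

# A mesh with 4 LODs viewed from far away picks a mid LOD, not the 8M-poly one:
print(choose_lod([8.0, 2.0, 0.5, 0.1], distance=500.0, screen_scale=1000.0))  # -> 2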
>Shadows don't work because the trees don't face the sun
lmao. How do you frick this up? The billboards should face whatever rasterization angle is active.
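What the usual fix looks like: a minimal sketch of the billboard-facing math (Python/NumPy for brevity; this is the generic technique, not the game's actual code, and for the shadow pass you'd pass the sun's position instead of the camera's):

import numpy as np

def billboard_basis(quad_pos, face_toward, world_up=(0.0, 1.0, 0.0)):
    # Build an orthonormal basis so the quad's normal points at `face_toward`.
    # Assumes the target isn't directly above/below the quad.
    forward = np.asarray(face_toward, float) - np.asarray(quad_pos, float)
    forward /= np.linalg.norm(forward)
    right = np.cross(world_up, forward)
    right /= np.linalg.norm(right)
    up = np.cross(forward, right)
    return right, up, forward  # orient the tree quad along these axes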
So does DLSS on PC.
DLSS runs at 66%, 58%, 50%, or 33% of your output resolution depending on whether you use Quality, Balanced, Performance, or Ultra Performance mode. And if you're someone with a 2160p display, but a GPU so weak that you need to run it at Ultra Performance mode, then you have made a moronic decision and misspent your money
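Those percentages cash out like this; a quick sketch (the scale factors are the commonly cited ones from the post above; individual games can override them):

def dlss_internal(width, height, scale):
    return round(width * scale), round(height * scale)

for mode, s in [("Quality", 2/3), ("Balanced", 0.58),
                ("Performance", 0.50), ("Ultra Performance", 1/3)]:
    print(mode, dlss_internal(3840, 2160, s))
# Quality (2560, 1440), Balanced (2227, 1253), Performance (1920, 1080),
# Ultra Performance (1280, 720): "4K Ultra Performance" is literally 720p,
# and at 1080p output, Performance mode is the 960x540 quoted upthread.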
It depends on the number of tenga cores.
tenga my anus, moron
It's crazy how graphics went from being cutting edge to legit rushed, unoptimized bloat, all thanks to console budgets becoming bloated.
they forgot to check the optimize box
It happened because there were a few years of overpowered graphics hardware, so merchants tried to sell the goim 4K and 120fps, and those were almost reachable when real-time raytracing became a thing.
>It happened because there were a few years of overpowered graphics hardware
Nonsense. There was never "overpowered graphics hardware", and it has everything to do with the lowest common denominator.
Do games devs just not care anymore? Seems like there is lack of passion in the industry.
This console gen is officially worth skipping. I'm just gonna keep playing on my PS4 and Switch and get in on the next gen. Surely by then we'll get an actual improvement.
>on next-gen consoles
You mean current-gen? Those things are getting old, dude.
At first I thought I would upgrade to the Steam Deck 2, but honestly these handhelds, even after 10 years, will not run games like this properly at 1080p 60fps. They aren't getting RTX 4070 performance into a handheld even after 10 years.
I'm probably just gonna replace my Deck with the Legion Go and only upgrade when there is a better tablet-hybrid handheld.
>It's Tim's fault EA released a game that performs like shit because it has 80 gorrilion polygons on screen at any given time
>because it has 80 gorrilion polygons on screen at any given time
Anon, that is literally the entire point of Nanite.
>its tim's fault that his company developed a feature that makes games run like shit and then marketed it to developers as a major feature of their latest product
yes