Is your pc ready for unreal engine 5 games?
sure, my RX 6600 is cinematic enough it seems
DLSS really is cancer.
moron.
Coping Nvidiot who spent $2k to play at 540p.
Unreal Engine 5 uses TAAU on Consoles/AMD. Nvidia just has the best option with DLSS.
They're all shit, I don't care about what consoles do and I don't play any of this slop. You do though, apparently, which is why you're so mad about it.
all future games will have upscaling and you WILL accept it and you WILL enjoy it
>Spends 2 grand to play at 1080p
Mock and laugh at these fools
I have a 4090, so yes.
>7900 XTX already murdering the 4080 and catching up to the 4090
this is 1% lows, the 4080 is better on average
Lows are the most important metric. Sorry that you paid more for an inferior product.
>the 4080 is better on average
>very high
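For reference on the lows-vs-average argument above: both numbers come from the same frame-time capture, they just summarize it differently. A rough sketch of how benchmark tools derive them (this uses one common definition of "1% low", the FPS equivalent of the 99th-percentile frame time; tools differ in the exact method):

```python
def fps_metrics(frame_times_ms):
    """Average FPS and 1% low FPS from a list of frame times in ms."""
    times = sorted(frame_times_ms)
    avg_ms = sum(times) / len(times)
    # 99th-percentile frame time: the boundary of the slowest 1% of frames.
    p99_ms = times[min(len(times) - 1, int(len(times) * 0.99))]
    return 1000.0 / avg_ms, 1000.0 / p99_ms

# A mostly-smooth capture with one 40 ms hitch per 100 frames:
# the average barely moves, but the 1% low craters.
smooth_with_hitch = [10.0] * 99 + [40.0]
avg, low = fps_metrics(smooth_with_hitch)
print(round(avg, 1), low)  # 97.1 25.0
```

This is why a card can "win" on average FPS and still feel worse to play: the hitches live in the lows, not the average.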
>Managed to snag a 7900 XT for $700
I was questioning it at first, but now I'm pretty confident that was a good deal.
Fake Frames
>immortal of gaygeum
I think the unreal 5 stuff looks bland so I'm not really interested in any of it.
The lighting black magic they keep pushing also kinda already went past what's visually appealing and it's starting to look silly.
>tf 3070
don't know if i'll bother upgrading since i mostly just play indie games and most of the triple A games look like total shit.
This game doesn't look as good for the way it runs. I can choose to run Cyberpunk with path tracing on a 4090 for effectively a little less FPS, or play this.
>This game doesn't look as good for the way it runs.
This can be said for pretty much every single game since 2016~2018. Ray tracing, for example, which is the big technology ~~*they*~~ are pushing right now, is far more demanding on hardware and provides visuals which look identical to all the age-old tricks we were using to simulate its look. It's all a ploy to sell more power-hungry expensive hardware which will be rendered obsolete in 6 months by shoddily made software.
Take a game like Crysis (the 2007 original), add a bit of GI and some PBR materials, and it will look pretty much as good as most modern games while still running far better. Innovation and well-crafted software died long ago. Now we just have lazily made shit that runs like garbage and requires 300W+ GPUs and AI upscaling crutches to reach playable framerates, all at the cost of no or very little visual improvement.
"Photorealism" in games was a mistake and it's all ~~*their*~~ fault.
At some point poor optimization is going to hurt sales. Just looking at steam hardware survey, most players couldn't even run this game at acceptable performance.
Most people play on consoles and are already used to sub-30 fps upscaling shit.
Most people who buy new games play on consoles. And this runs just fine on a PS5 or Series S/X.
It runs like shit actually.
It also looks absolutely terrible.
i would say it already does, but then i dont think most people give a frick about playing most new games at max settings
Great
Another goyslop AAA troony fantasy game with vaseline TAA smeared over the screen with lighting so bad you'd think the brightness was set to max.
All the good games have already been made
it's not TAA this time, the game was designed around upscaling from 540p
This thread yet again.
REMINDER: you're supposed to use upscaling for optimal results.
>I'm not using the product the way it was intended to be used so I'm gonna cry on Ganker all day like the loser I am instead of getting a job
>2023
>the only way to get a playable framerate on a $600 gpu is to run the game at 540p and use an upscaler
>this is supposed to be a good thing
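The 540p figure quoted around the thread follows from the upscaler's per-axis render scale. A small sketch, assuming the typical published scale factors for DLSS/FSR 2 quality presets (exact ratios can vary per title):

```python
# Per-axis render scale for common upscaler quality modes.
# These are the typical published presets; individual games may differ.
SCALE = {
    "quality": 1 / 1.5,            # ~0.667x per axis
    "balanced": 1 / 1.7,
    "performance": 1 / 2.0,        # 0.5x per axis
    "ultra_performance": 1 / 3.0,
}

def internal_resolution(out_w, out_h, mode):
    """Internal (pre-upscale) render resolution for a given output."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1080p output in performance mode renders internally at 960x540,
# which is where "upscaling from 540p" comes from.
print(internal_resolution(1920, 1080, "performance"))  # (960, 540)
```

So a "1080p" benchmark with performance-mode upscaling enabled is really shading a quarter of the pixels and reconstructing the rest.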
>buy 4090
>day one benchmarks show 50 game average of 4k 140fps without upscaling
>be happy I need upscaling from 1080p to get 80fps
Yeah m8, I'm the idiot
>not even 90fps at 1080p on a $2000 graphics card
Grim.
My computer is 10 years old. I'm currently on one that's more than 20 years old.
>tfw 4080
it's over
If the most powerful GPU you can buy on the planet can't even get 100fps then it's not the consumer's fault. It's the devs being morons
>DLSS and goy tracing being gobbled up like it's the 2nd coming of christ is not the consoomer's fault
The fastest cards on the market couldn't even achieve 60fps in Crysis when it came out, let alone 100fps. Turd worlders have flooded PC gaming since then though, and scream from the rooftops about new AAA games being "unoptimized" if they don't run at 4K/120fps on their GTX 970.
Crysis was a tech demo and unique in its kind and genuinely looked great. It also had a fully fledged physics engine, something which modern games completely lack yet they still look like ass and run badly.
yes, except when crysis released the visual quality of games went through the roof
meanwhile nowadays games get uglier and uglier yet use more resources
Black person your goyslop game looks worse than Crysis does while also running like shit
>dude this game is an graphical achievement on par with crysis!
How much are you getting paid to say this?
>The fastest cards on the market couldn't even achieve 60fps in Crysis when it came out, let alone 100fps
except Crysis stood apart from all the other games on the market at the time.
Immortals of Copium looks just like every other game released right now, plus all the ghosting and duplicate frames from the upscaling, which smears the hud all over the screen because the developers weren't competent enough to make upscaling act only on the 3D rendered viewport instead of the 2D hud.
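The HUD-smearing complaint comes down to compositing order: the temporal upscaler should run on the 3D viewport alone, with the 2D HUD drawn afterwards at output resolution. A toy model of the two orders (all names here are illustrative, not any real engine's API):

```python
# Toy model of compositing order around a temporal upscaler
# (DLSS/FSR/TAAU). Names are illustrative, not a real engine API.

def render_frame(internal_res, output_res, hud_after_upscale=True):
    # The 3D scene is rendered at the lower internal resolution,
    # then the upscaler reconstructs it to output resolution.
    frame = {"scene_res": output_res}
    if hud_after_upscale:
        # Correct order: HUD composited after the upscale pass,
        # so text and icons stay pixel-crisp at native resolution.
        frame["hud_res"] = output_res
    else:
        # Broken order: HUD drawn before the upscale, so it is fed
        # through the temporal reconstruction and ghosts/smears.
        frame["hud_res"] = internal_res
    return frame

good = render_frame((960, 540), (1920, 1080), hud_after_upscale=True)
bad = render_frame((960, 540), (1920, 1080), hud_after_upscale=False)
print(good["hud_res"], bad["hud_res"])  # (1920, 1080) (960, 540)
```

Same scene, same upscaler; only the point at which the HUD enters the frame decides whether it smears.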
Wait what the frick? This entire post is a complete and utter lie. An 8800 was nowhere near the best card on the market when Crysis came out.
What card was better than an 8800 GTX in 2007 when Crysis came out? There was the 8800 Ultra, I guess. But the 9000 series wasn't out until the next year. And I'm pretty sure that ATI didn't have a card better than the 8800 Ultra in November 2007.
This. Also, do they forget that in the 90s your CPU and 3d accelerator were obsolete in 1-2 years? And you needed to double your ram almost every year. You paid 2700 dollars (about 5k now) for a computer in late 1997, and before its 3-year warranty was over you needed to double its ram just to play Sims 1, which launched in early 2000.
This thread is full of shills revising history to try and convince people that dumping money into this shit is perfectly normal holy frick lmao
a pc in 1999 cost me a month's salary
now a pc is 2 months' salary
not everyone lives in the same place
Just post the regular FPS graph you fricking Black person, it's a shit graph anyway. The "very high" preset probably includes a load of frame-eating options that make a 1% difference to the visuals, like usual.
>1080p
Those numbers are still pathetic, anon. The game doesn't even look good.
>still needs 3080 to reach 60fps
>all for some shitty middling graphics
frick you, frick amd, frick nvidia, frick developers of such garbage, frick unreal engine 5, frick epic games, frick israelites, frick Black folk, frick jannies, frick woman.
I can run the Unreal Engine 5 editor on my GTX 1070 and i7 without issues. What's going on here?
Goyslop Engine was a mistake
>Have to get the absolute highest two tiers of cards to even run 1080p at/and above 60FPS
What year is this?
>needing more than a 4070 to achieve 60fps in 1080p
what the baziga is this le shit
Fort Solis, another UE5 game with Nanite and Lumen
absolute cinematic experience
>30FPS on my 3080
Wow just go gargle a bag of dicks. Games don't even look noticeably better.
Just buy the 50xx series next year which will be ready for UE5!
Who asked? That vatnik site's benchmarks are also completely fake, and have been proven as such multiple times.
seems to align with my RTX 4070 results
>ukrainian/pole/other butthurt belt schizo seethes about russians in thread about gpus on Ganker
you cant even make ai porn with amd though
>mfw I'm on R5 2600x + 3060ti and a new shitty aaa slop tries to get my money
>Nanite will be powered by the PS5's SSD!
Reality
>Nanite lets devs not bother with LOD optimization and requires more powerful GPUs
Kek
Lumen is crap too, though i guess for consoles/AMD it's probably the best option. Path tracing & DLSS3.5 looks and performs better.
My rx6600 is good enough to play actual good games, I don't care about Debra Wilson slop.
Where does my 1660 Ti rank
>IoA
Anon, that game would run terribly on whatever NASA computer you threw at it.
Yes, you don't need more than 30 fps for gaming.
Fake Prices
Fake Resolution
Fake Frames
Fake Rays
The way it's meant to be played 🙂
>The way you're* meant to be played 🙂
Looks like shit, and of course has Black folk.
Boo-hoo Mr doom and gloom
Now show it with fsr/dlss on. Nothing burger.
>new games that nobody gives a shit about are unplayable on all hardware
Retro gamers win again.
You should've posted 4K numbers for maximum (You) potential.
I have a 4090, so I guess? Performance still looks like total crap, luckily this game also looks like total crap so in this instance I don't care at all.
>My 3070 that I paid 900 dollars for is already a completely outdated dead piece of trash
I love pc gaming.
That's on you for being a moron and paying those insane bloated prices instead of waiting.
3070s are going for as little as $370 new nowadays and even comparable GPUs like the 6700 XT go for sub-$340.
i couldn't help it myself since i was using a 750ti
nta, so i was forced to upgrade before prices fully went back to normal, paid 530€ for a 3060ti
I used a 460 768mb until late 2022. You could have waited.
sure but i saw it like this
>12×9 = 108 months, 600/108 ≈ 5.55 monthly
You thought an 8gb vram card could last you 9 years? Come on man. 8gb vram cards for consumer market debuted 9 years ago with the R9 290X 8gb.
>3070 8gb
>$900
Remnant 2 uses UE5 and requires DLSS to run well, lmao. Still runs above 60 on most of the XX80 series cards, though.
do people with expensive as frick gpus even play games? it seems all you fricks do is benchmark tech demo "games" and discuss these figures that most people never notice
what good games coming out in ue5?
Yeah, I'll just put everything on medium settings and play everything at 60 fps. Assuming anything worth playing is gonna use this engine. LotF looks kinda cool I guess.
It'd be something if these bloated bullshit games looked decent at least, but something like death stranding looks way better with 10x better performance
Now post the stats with intels newest drivers
surely this must be some moronic 4k benchm-
>1080p
what the shit
>5700XT
>Not even 120 fps
What the frick is wrong with modern games?
it isn't
there hasn't been a single game that made me think "i should update to play it" in the last 15 years
I played remnant 2 on a 5700XT with FSR enabled, and I wouldn't have known it was a UE5 game but for the frame rate.
I stopped playing games using unreal engine a while ago.
>posting min FPS graphs for a garbage EA game
great job moron
rx7600 here. cinematic 30 fps. the true console experience. i think i am ready
>4080/7900XT to hit 60fps@1440p
This is what happens when you put Telltale devs in charge of optimization, or rather of making a game that isn't a fricking dialogue simulator.
>search title
>see this as third image
Fricking lmao not playing your shit game, EA game should have been a redflag anyways
t.4090 patrician
If you're gonna discriminate on game quality, what games did you buy a 4090 for that you couldn't easily run well on a lesser card?
I got it to have a nice blend between AI, 3D modelling with physics simulations, and vidya without needing to spend 10k usd for just faster rendering.
This. The 4090 isn't a bad purchase if you use it mainly for business; if you bought it for gaming you're a fricking moron.
Debating whether I should wait till next year to upgrade but there's buzz that business interest in AI is gonna frick prices even worse. Valid fear or just hype?
>looks like shit
>runs like shit
>plays like shit
Is this the future of video games?
I have a 7600 mobile
>tfw bought a 6750 XT last year
Damn all of Unreal Engine 5 games we've seen so far suffer from the same shitty unoptimized garbage. I'm scared for Wukong.
You could've bought a PS5/XSX and not deal with this crap.
>Immortals of Aveum
I think it says a lot that I don't even know what game this is
Your GPU is ready, but your CPU isn't.
You WILL experience microstutter
You WILL enjoy it.
>minimum FPS
literally useless metric considering every PC game has compilation stutter now which skews it too much from average FPS
>7900XT
Now I have to wait for games, any day now. There aren't any good games requiring all this hardware.