my 2060 is still viable. if i had bought a 1650 super or something, or some other card that was slightly more powerful but didnt have dlss i wouldnt be able to play
>or some other card that was slightly more powerful but didnt have dlss
Doesn't exist on the Nvidia side.
On the AMD side that would be an RX 6600 or RX 5700, both of which have aged far better than the 2060 did.
I forget the name of the principle, but essentially it holds that the effort/power/cost to render something stays constant or climbs steadily, because studios and devs use better hardware as an excuse to keep pushing for increased fidelity at the cost of performance rather than staying in one spot and focusing on optimization. It will never change either, because shiny visuals are an immediate, in-your-face marketing asset.
If a game can't run on my 1080 Ti, I just won't play it. The 1080 Ti was the last real GPU without all this extra raytracing/upscaling crap tacked on to try and sell new hardware.
Frick RTX, frick DLSS, frick FSR, frick literally all these meme technologies that do nothing but make you buy better and better hardware for worse and worse looking games.
This. They're less than $200 on eBay now. An absolute bargain for a GPU that can run literally anything that doesn't push memetech.
Greatest GPU ever made.
Another one to the pile of "this will benefit the consumer and definitely not go right into the pockets of executives" I guess.
1080ti is the most based card to exist I loved that little bastard.
It won't be /required/, just bump those requirements up a tier if you don't want to use it.
Upscaling has gotten good enough that in most situations it makes perfect sense to render a lower-res but otherwise prettier frame and then upres, vs using the same total level of resources to render a sharper frame with worse lighting etc.
Input latency from frame generation is a big issue. I don't really care about a lower internal resolution or worse lighting, but going from a 4070 to a 4070 Ti/4080, or a 5800X3D to a 7800X3D, is a significant cost just to circumvent bad optimization and run games without it.
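The render-scale tradeoff being argued here can be made concrete. A minimal sketch of the internal resolutions the common DLSS modes upscale from, assuming Nvidia's usual published per-axis scale factors (the exact factors can vary by title and DLSS version):

```python
# Assumed per-axis render-scale factors for DLSS quality modes.
DLSS_SCALES = {
    "Quality": 2 / 3,          # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Return the internal render resolution the upscaler works from."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

# 1080p output in Performance mode renders internally at 960x540:
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
# 4K output in Performance mode upscales from a full 1080p frame,
# which is why it holds up so much better than 1080p Performance:
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

This is also the math behind the later complaint that "3070 + DLSS Performance at 1080p" means a 540p internal render.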
Unironically told Black folk this would happen, if quirky 2D slop indie games are asking and pleading for a GTX 980 equivalent and a i5 then AAAs are going to REVEL in this shit. They no longer even have to bare minimum try.
>Hire the worst h1b hello sars coders imaginable for the lowest fricking rates >Literally never have to optimize, or listen to "fanbase" complaints as long as they keep buying it >Just upscale it bro, just fill in the missing frames bro becomes standard >ADD zoomtwats who stare at a tiny screen all day don't and won't notice the smeary piece of slow shit when fortnite 2 and AsscreeSpihman comes out because their vidyogabes are all colorful jingly keys to them anyway
Frick no. Quantum Break was mediocre, Control was shit, and they replaced Alan Wake with a black female character. It's completely over. Modern Remedy sucks.
You must be braindead downie, control was total fricking garbage. That anon is correct, modern remedy or remshitdy as I like to refer to them are sellout shitters that make shit games for plebs like yourself.
Tell me how that last level was good in any way you shit eating homosexual?
8 months ago
Anonymous
>middling last 5 minutes of the game
>"REE TOTAL FRICKING GARBAGE111!!"
You sound like some tribal Fromdrone or novidya homosexual that never even touched the game. Control was far from perfect, with the ending being a primary fault and not getting to explore dimensions with the Projector another, but it was a cool game and a fun experience for the most part. You gays trying to wage war on it on mongolian basket weaving forums are just static background noise no one listens to.
Whats fun about killing the same lame enemies over and over with a shit gun? You work for remedy lmao its the only way youd write such trash defending one of the worst games released besides muh ray tracing...
>can only nitpick the last few minutes of a massive game to prove a point
>spamming the overpowered throw through the entire game while wandering through identical grey corridors
>good
I trudged through that fricking turd to get to more Alan Wake story only for the game to completely demystify it by going "yeah it was just a haunted typewriter all along". The entire game is like that, modern SCP tier writing trapped in a game with the color palette and ugly, uninspired design of a hospital ward. Even the combat is a slog of boring guns and abilities that get rendered totally pointless when everything is a slow tanky bullet sponge unless you relentlessly use the one attack that does anything. Get some fricking dignity and stop praising such contemptible shitware so Remedy can get back to making games for people with actual standards.
>Big download
remedy makes a game that's also half-movie, to be expected
>gay actor
which one? doesn't affect the gameplay
qb's late-game combat is a fricking blast but it takes a while to get there, that's its biggest sin
that sucks, lots of people liked him from animorphs
i don't know how either of those points make the game itself mediocre though
If you knew how much of an idiot he is...
You wouldn't like seeing his face either.
You just can't stand him, and it ruins immersion.
I prefer my games to not have real people, and even voice actors are better when they're nobodies.
>antivax in 2023
Go inhale some polio if you're so confident in your own immune system
Oh wait you can't because vaccines deleted it in the first world
My gf died while taking it frick off
Montichiari Italy she was 27 and healthy
>Go inhale some polio if you're so confident in your own immune system
>Oh wait you can't because vaccines deleted it in the first world
schizo
>Go inhale some polio if you're so confident in your own immune system
>Oh wait you can't because vaccines deleted it in the first world
>schizo
Meant for
Polio was fake and gay, it was DDT poisoning. "Polio" disappeared around the time DDT was discontinued and banned. It's easier to go OOPS a virus out of NOWHERE happened than to pay out settlements and insurance claims for chemical company frickery.
>Frick no. Quantum Break was mediocre, Control was shit, and they replaced Alan Wake with a black female character. It's completely over. Modern Remedy sucks.
Quantum Break > Control > Alan Wake
After how hard they pushed DLSS in Control I'd be amazed if AMD cards can even run it properly. Even Control on a fricking RTX 4090 at 4k with DLSS off only gets 70 fps.
No but it'll run on my PS5. Replayed AW1 and American Nightmare earlier this year, played Control and just finished the AWE expansion today. Incredibly hyped for this game. Pointless trying to discuss it on Ganker because >added a black woman >epic exclusive
But for me it will likely be GOTY.
Previews have all been incredibly positive and the small amount of footage I've allowed myself to watch has looked great. I have faith in Sam Lake and the rest of Remedy.
I'm very hyped too anon. I played Alan Wake for the first time since release on ps plus, and enjoyed it thoroughly, way more fun than I remember it being.
Encouraged me to play through Control+dlcs and quantum break (meh) and am eagerly anticipating next weekend
I'm really interested in the game, but I hope it has three things:
>good enemy variety
>good weapon variety
>interesting combat locations
These have been pain points for a lot of Remedy's games even if I do like them. Control was really disappointing with its enemies choices considering it feels inspired by SCP.
>Replayed AW1 and American Nightmare earlier this year, played Control and just finished the AWE expansion today.
I haven't even played Alan Wake 1, or any Remedy game, and I also think it'll be GOTY. That's how special it looks, to draw in somebody like me who previously couldn't have cared less... man, that IGN gameplay review was a godsend
it's not bad pc optimization. your card is almost 2x as powerful as the xbox series x gpu. it runs at 30fps, you run at 60fps. that's not "bad pc optimization"
the same is true for most of the games accused of having bad pc optimization. like starfield: it's a very demanding game on both pc and console, and runs at 30 fps on series x.
this is not a new phenomenon either, gta 4 is an old example of a game that runs like shit on pc but even worse on console (the pc version was broken at release but got patched well)
I can somewhat understand Starfield because of how much data it has to process and the unreliable engine but Remedy used to be incredibly good with their own tech in Max Payne 1 and 2.
fair enough, but are they still using an inhouse engine?
Yes, it's an updated version of the engine they used for Control. Remedy always build their tech in house. It'll be a sad day if/when they decide to jump to UE.
As far as I know they're still using their "Northlight" engine which was also used for Control and Quantum Break. Prior to that was the AWE for Alan Wake and MaxFX for both Max Payne games.
this happens every gen. snoy and microshit promise 60 fps. a couple of first gen games on the new console run at 60, cross gen games run at 60, remasters run at 60. about 2-3 years in, virtually all games run at 30
it's been this way since the PS2 gen. mgs2 looked amazing and ran at a smooth 60fps. mgs3 was capped at 30 but most of the time it was more like 20 fps
>I dunno what you're talking about, there is a 60fps mode on consoles.
Yeah by cranking down resolution and settings, the same fricking thing you can easily do on PC
>its not "bad pc optimizaiton" >3070 recommended for 1 0 8 0 P >DLSS performance ( 5 4 0 P) >its not "bad pc optimizaiton"
you're a cuck in real life too ?
its even worse on console. the game is either unoptimized in general, or it is just using very demanding graphical effects. very very few games run fine on console and have "bad pc optimization"
It's going to flop everywhere. It's been so long since the last game and it was Xbox 360 exclusive at release for 2 years before it came to PC and then like a decade after that for a shitty remaster on modern consoles. No one is starving for games at this point of the year either.
>pay $600 for a 4070 to play at 60 fps at 1440p with fake frames aka DLSS
>at 4k with gay tracing you need a $1000 GPU and dlss to achieve 60 fps
so tired of this garbage industry
Almost as woke as Baldur's Gay 3 so it will do really well. homosexuals will fellate themselves. trannies will spontaneously regrow their penises so they can frick the frog thing. It will be a veritable utopia!
to get 60fps with a 3070 you need to be playing with DLSS performance at 1080p
So meaning literally at like 540p. A 3070 needs to run this game below 720p to get 60 frames. Damn I love gaming so much bros.. thank god THIS is where we're at.
Sami and Remedy really went to shit. It was good while it lasted.
This is really disappointing. I wasn't expecting it to run on a toaster, but 540p on a 3070 to get 60fps is just tragic.
as much as I want to play this day 1, I think I'll wait it out a bit and see if they patch it to run better, this seems insane.
On-screen rain drops only make sense on actual transparent glass or plastic surfaces your character is looking through. Same with lens flares, which don't happen in human eyes, only in cameras.
RDR2 looks incredible because it has god tier art direction and lighting and attention to detail. Up close it doesn't really hold up in a way a next gen title does.
I'm not arguing against you, I think RDR2 remains the most beautiful game of all time, but if you get too close to anything the low res textures become pretty apparent.
This. I knew this was going to happen the first time I saw DLSS (Think it was FFXV for Windows and it was DLSS 1 that only worked at 4K resolution or higher)
"It's just for 4K" they said.
"It'll make cards last longer" they said.
It just makes devs get away with being lazy.
Remember, anyone who dislikes dlss or points out any flaws MUST be an AMD fanboy because why else would anyone speak bad about our lord and master, Nvidia!?
FSR sucks too. Upscaling shouldn't be needed to run games on any GPU from the last 3 gens. It's a nice idea for running modern games on really old GPUs, but to require it at all levels? Frick that.
>mandatory DLSS/FSR
This has to be a joke. The game doesn't even have good graphics. It looks way worse than say, RDR2, which came out FIVE YEARS AGO and ran on a base PS4.
Actually Digital Foundry made quite a lot of fun of Control for performance. They even developed their own "corridor of doom" stress test for a specific corridor in the game where the framerate gets inexplicably raped.
Exactly and you also know that it's HFR just from looking at it.
Meanwhile I wanted to google a screenshot for Immortals of Aveum as "this unremarkable shit runs like garbage" example and the game is already so forgotten that when I finished typing in "Immortals of" the suggestions were entirely for Immortals of Meluha from 2010 and I honestly don't even know what that is.
>morons ITT blaming upscaling
If DLSS and FSR didn't exist you'd simply have to play at the same lower resolutions but without upscaling.
You are crazy if you think developers would bother optimizing, especially on PC. The reason they're not doing it is simple: they don't have to take PS4/Xbone in mind anymore and just let PS5/nu-Xbox brute force the game, consequences be damned.
Developers take the path of least resistance. Optimising the game probably takes 3-4 months of going through the code to rewrite it more efficiently and tuning the models' LoDs to scrape a few frames here and there.
Now they just run it once with DLSS and if it's not catastrophic they release it.
Devs don't give a shit about PC, especially nowadays. It's all about console. No AAA dev is thinking "oh shit we have to design our game around DLSS!".
Games run the way they do now because devs don't have to make them run on a base PS4 or Xbone anymore. So they just get these turds running on a PS5/XSX at the bare minimum acceptable (for a console lol) quality and call it a day.
What's with all these horribly optimized games coming out recently?
Most of these games could basically double their player base if they optimized a little more; it doesn't even make sense financially.
You are seeing some games run at 720p internally on PS5/XSX. A lot of games in the 900p to 1080p range.
Before they'd target 720p-1080p for Xbone and PS4, and so the games would run at much higher res and fps on PS5/XSX. Now that the old consoles are not developed for anymore, those resolution targets move on to the PS5/XSX. No further optimizations needed. Especially since most games are developed with consoles in mind, devs just don't give a shit.
People don't want to talk about this but the hardware israelites do this on purpose to make mustardracegays buy new GPUs every year. That and new devs are so useless they don't know how to optimize games.
I love that more powerful hardware is easily accessible for the vast majority of people these days, but man do I miss the wizard tier shit that devs had to do to make older hardware really work. N64, the PS1, PS2, etc. Devs had to basically make a pact with demons to squeeze every last bit of performance out of a console's hardware. Now that RAM is measured in GB instead of KB/MB, and 8+ core CPUs are standard, it seems like devs just got fricking lazy.
Even former "tech wizards" are shitting the bed these days.
Just look at Croteam and their upcoming Talos Principle 2.
The fricking title screen image that just moves around a bit is a video file that's half a gig in size, the same as the first Serious Sam game, and the game runs like ass in comparison to what's been presented thanks to the usual UE5 issues.
And to top it all off, it's 100 fricking Gigabytes which is bigger than all of their previous games combined, including SS4 that was already considered obscene at 40 - all this for a goddamn puzzle game.
Blame the publishers, games are just released on the schedule in beta state. Devs clearly know the state of their games but they have to launch it anyway.
Even a month or two of extra polish for 100+ team is a lot of money, even if they hire pajeets.
I think it's a combination of being forced to adhere to crazy release schedules and the move away from in-house engines. A lot of games use pre-made engines now, so they have to deal with the bloat inherent to the engine. My favorite example of the old wizardry is how they managed to put Resident Evil 2 on the N64. Fricking magic
>1080 60fps is more demanding than 1440 30fps
>mfw I'm still on 1080p
Maybe true, but kek anyway. This bottleneck is great, otherwise they would be pushing 8k right now
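The greentext claim above checks out on raw pixel throughput alone; a quick back-of-the-envelope sketch (ignoring per-frame fixed costs like draw-call overhead, which only widen the gap at high framerates):

```python
def pixels_per_second(width, height, fps):
    """Raw shaded-pixel throughput for a resolution/framerate target."""
    return width * height * fps

# 1080p at 60fps vs 1440p at 30fps:
p1080_60 = pixels_per_second(1920, 1080, 60)  # 124,416,000 px/s
p1440_30 = pixels_per_second(2560, 1440, 30)  # 110,592,000 px/s
print(p1080_60 > p1440_30)  # True: 1080p/60 pushes ~12.5% more pixels
```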
Agreed.
I find it hilarious for AAA to push these specs (poor optimization, rather): a 3060, aka an almost $400 video card, is barely fit for minimum requirements?
Also, wtf is the le justification for this bullshit? Is Alan Wake streaming a whole city or millions of npcs at a time? Gameplay-wise this is no more complex than the fricking original re4
How can the PC mustard race cope with the current state of the industry?
>My rig uses slightly less upscaling than your console
Isn't really that much of an own
Personally i couldnt give a lesser frick about the "current state of the industry" because im not goyim. PC is the best platform because it actually lets you access all the gaming greats while consoles barely give a shit about the libraries that made them what they are
For many years it was standard to boot from SSD and have games on HDD for the larger storage. When SSD prices fell some people had a boot drive SSD and a game drive SSD. These days NVMe is so cheap it's becoming more standard for boot drive and game drive. The limiting factor now is that motherboards typically only have 1-2 NVMe M.2 slots.
Sure, but 4TB and higher capacity SSDs are still overpriced so you can consider multiple smaller ones instead.
My motherboard has only two m.2 slots so I still have to use SATA for HDDs and SSDs for hoarding.
HDDs are still fine for all the smaller games with exceptions.
Shame there aren't many SSHDs being made. I liked my hybrid, but the main downside is that the cache sometimes isn't even utilised properly.
>remedy fan
>big alan wake fan
>big survival horror fan
>only person I know who was hyped for the game
>not sure I can even run it at a stable 30fps
>epic exclusive
Do they not care that their crazy graffix and moronic requirements lose them sales? would they rather have the prettiest game or the cash in my pocket?
it's less that they got a lot of money and more that they got to make the game whilst retaining ownership of the IP
people can hate on EGS, but it's undeniably a good publishing arrangement for Remedy, especially after they lost the rights to Max Payne and Quantum Break with previous publishers
I know this is only timed exclusivity the same way as Control.
I don't hate EGS any more than I hate Steam. I just think vast majority of PC gaymers use steam exclusively and even those who use EGS will think twice about buying this game after recent UE5 game launches.
it's not timed exclusivity, Epic are the publisher for Alan Wake 2, whereas Remedy self published Control and had a timed exclusivity agreement with Epic
AW2 will never come to Steam
I forgot to mention, I know they are using their own engine but I was thinking more about high hardware requirements similar to UE5.
>just spend $500 for one game you're on the fence about, for a gpu someone else has had sex with. And then also replace most of your pc because of compatibility issues
I don't get it why they never put QB on a decent sale though? They routinely give away Control with all the dlc for like 10 bucks, which is tbqh a pretty good deal, but a mid game like QB is still like 15 bucks on Xmas sales, like a decade later.
>Requirements account for the DLSS/FSR mode you've got to select to play
The gaming industry is so fricking far gone it's not even funny anymore, they're literally expecting me to drop 2k€ for a fricking GPU just to play this year's games.
How did they frick it up so bad? I can play RDR2 at 4k ultra settings with a 4070 and get mostly 60fps, and it looks way better. This is not justifiable.
>60 fps with DLSS on
>on a 4070
What madness is this? Why would I want to turn on that crap? Why can't the very high end pc I bought last year natively run a game at 100+ fps anymore?
>upgrade or perish
Black person I have a fricking high end pc, there is barely anything left to upgrade.
>you guys wanted no more cross generation games so here you fricking go
>you guys
A so you are that special snowflake who is sooooo not like everyone else.
It's easy: If a game is released for PS4 (even at shit quality) it will run on any reasonable gaming PC at 60fps+ with no upscaling crap.
Only buy games that release for last-gen consoles and you'll never have this problem.
that game is 11 years old and still has a modern look to it
really pathetic how little graphics have improved
in the past every year was a revolution, now there's barely any difference in 3 or 5 years - and even something a decade old can hold up fine
I tried DLSS in Death Stranding for shits and giggles (because that game's actually optimized well) and I'll never use it again. Made everything more than 10ft away blurry as vaseline and I only went from getting 120fps to 135fps. Shit technology that only looks good in super compressed YouTube videos.
Not really, it's designed for high ppi displays.
If you have a ppi high enough for the viewing distance you're at, you can use it.
You can absolutely use DLSS Balanced on a typical 1080p laptop display. Even Performance on a small one (14 in).
you are wrong. for it to work well it needs a decently high amount of pixels to upscale from. on a 4K display DLSS Performance is upscaling from 1080p, so the internal render already looks pretty good. But when you only have a 1080p display it's having to upscale from ~800p or even 720p, which looks like shit. It has to work a lot harder to produce a decent looking image but it's always going to result in artefacts & ghosting. it's why so many people think DLSS is shit, because they're using it wrong.
It's all about ppi. Everything is proportional.
Yes it extrapolates less information at lower resolutions, but it also has to fill a lower resolution display. A 1080p panel at 15.6 in looks just like a 32 in 4K panel.
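The ppi claim is easy to check; a quick sketch (the two panels mentioned land within a few ppi of each other, so "looks just like" is only approximately right):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density of a display from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

laptop_1080p = ppi(1920, 1080, 15.6)  # ~141 ppi
desktop_4k = ppi(3840, 2160, 32.0)    # ~138 ppi
print(round(laptop_1080p), round(desktop_4k))  # 141 138
```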
you're right about ppi but that isn't really what we're talking about here. Just because you have a high ppi display doesn't mean a 700MB YIFY movie is going to look good on it.
>alan wake game where you barely play as him
>EGS exclusive
>requires a high end PC to run it at 60FPS even though it probably looks like blurry TAA shit
DLSS is nice but it can't perform miracles. What is the point of all this fancy tech if you get a blurry mess as the end result? All this fine detail and hard work will get turned into mush.
What's the point of spending all this money to make a fancy looking game if:
>On consoles it will run at 540p 30fps so no one will be able to appreciate the graphics anyway
>Most PC gamers won't even be able to play it
>The ones who can will choose not to do so because you took Timmy's EGS bribe
I do not understand.
Are (you) moronic? In what world is DLSS mandatory? It's a completely optional feature. Games would run the same way they are running now if DLSS didn't exist, the only difference is they'd look worse.
Protip: AAA games are designed around consoles, not PCs.
Those are meant to enhance graphics, not literally brute force dogshit you out of a proper native resolution as the screen seizures from blurry to clear
no, they're techniques to produce a better image without negatively impacting performance too much. pretty much every single aspect of raster graphics fits that bill, they're tricks and hacks. Upscaling is just another tool in that arsenal, why wouldn't devs use it?
You're a fricking moron, I hope you get raped to death.
Fricking ESL shills like you don't deserve to continue drawing breath on this earth.
aww poor baby cant listen to reason
>You're a fricking moron, I hope you get raped to death. >Fricking ESL shills like you don't deserve to continue drawing breath on this earth.
>Nobody is stopping you : )
Disingenuous fricking israelite.
If they STOP optimizing so you CAN'T run it native regardless of hardware because they can and WILL crutch on this. You literally have no choice.
>Disingenuous fricking israelite. >If they STOP optimizing so you CAN'T run it native regardless of hardware because they can and WILL crutch on this. You literally have no choice.
schizophrenia
>duuuude just optimise :^)
you have literally no idea what that even means
Neither do fricking developers after 2016 apparently.
AAA video games in 2015 look better than ANY AAA title today and they ran on hardware 10x less powerful with no upscaling.
post a screenshot of one example.
see, here's where these arguments fall apart: this game uses fully baked lighting while most games today use fully dynamic lighting. You can't compare the two. The Last of Us Part 2 looks miles better than this, in part because it also makes heavy use of baked lighting.
>The Last of Us Part 2 looks miles better than this
AHAHHAHAHAHAHAHA
Dude, some of the environments in TLOU2 look barely better than a PS3 game.
I think that looks really good?
Then it goes without saying that you're an idiot whose opinions can be ignored.
What benefit does fully dynamic lighting give? Day/Night cycles? What game actually creates gameplay around that?
Dynamic lights should be reserved for little detail things.
>What benefit does fully dynamic lighting give?
Literally nothing.
There is not a single game with fully dynamic lighting on the market that does a single thing different from games that are a decade old.
You're keyed in on why modern game dev pisses me off.
>No bro the sun needs to be dynamic bro
>What do you MEAN a directional global source that fakes sunrise and sunset positions would be indistinguishable and take 1/10th the resources?!
>Oh yeah no let's make all these swinging moving flickering light bulbs generic cast decals in this tight enclosed atmospheric room meant to evoke dread
>>Oh yeah no let's make all these swinging moving flickering light bulbs generic cast decals in this tight enclosed atmospheric room meant to evoke dread
See, the thing is, you don't even need a fully dynamic light system for that.
Fricking The Division had moving, swinging, and tiny light sources that all cast shadows in 2016, and that game also makes use primarily of baked lighting.
True but you don't understand, people might actually have to do real work instead of up porting assets from the last title : ( It would be inhuman crunch time. They wouldn't see their cats for upwards of hours a day.
Which Mirror's Edge is this
The original. I bet it looks truly special on an OLED screen.
Reinstalled. I loved the OG and it's about time for a replay. It's a goddamn crime more games didn't adopt the gameplay and artstyle
Games barely reach Mirror's Edge's quality of visuals, even today. This game ran on a console with 256MB of memory. What the frick went wrong?
Realistic graphics.
Mirror's Edge had realistic graphics.
they'd make this same scene today but it'd run at 49 fps with upscaling, use 32 gigs of ram, and take up 129 gigs of space
If those are the specs for PC, I'm worried about how bad the console versions are going to look/run.
>What the frick went wrong?
The chase for realtime vs. baked like ME.
ME is visually a very basic game - the reason it looks so good is very effective use of contrasting colours, usually white + something very bright. The actual texture detail is nothing special.
They understood how to make a game look good. There's not a single developer today who can pull off what DICE did then, certainly not with the same constraints of the PS3 hardware, or even the PS4.
unreal chinks colluding with nvidia chinks to steal gweilo money
since its a Black personpilled game no chink would touch it
It's baked lighting, the assets and geometry look like they belong in Half Life 2.
Nvidia sponsored ME and it had Physx in it.
Literally better graphics than any AAA released today (thank frick TAA didn't exist back then) AND it ran on a console that was underpowered and absolute shit even when it released in 2005.
you cannot be serious
I am serious.
>(thank frick TAA didn't exist back then)
Ryse used it in 2013 on the Xbone, MGSV uses blurry FXAA without the strength of actually removing jaggies like TAA does.
TAA is dogshit and has permanently ruined game graphics.
TAA doesn't remove jaggies, it just blurs them.
The only way you remove jaggies is by increasing the resolution.
MSAA used to do that for specific edges, but it's no longer an option thanks to forward rendering being dead.
8 months ago
Anonymous
Art direction and having a happy, healthy, competent staff will do that, vs. treating it like a revolving-door worker mill where it's all pajeets and danger hairs working for peanuts and being fired before you have to give them bonuses or benefits.
That, and all the wonder-coder types who used to make game engines are dying off and no one is replacing them. And any giga nerd who well and truly learned how to code isn't going to touch the pozzed-to-frick, volatile, decaying market that is the post-2007 gaming industry. They're all working in government or in academia or medical industry shit by now.
We are going to experience horrendous technical-competence brain drain on unprecedented levels when the last of the Carmack types and MT Framework coders who still care to participate in the industry die off. Because no one can make anything from scratch anymore - just Betty Crocker "code slop in a box" and pre-made modules and nodes.
8 months ago
Anonymous
>medical industry shit by now.
Is this lucrative for IT shit? I have a Biomedical Science Degree and an Information Systems Degree and I just work a shit IT job.
8 months ago
Anonymous
Batman: Arkham Knight and Assassin's Creed Unity look much better than every modern game.
8 months ago
Anonymous
its funny people still bring up unity
the biggest piece of shit game ever that required the best gpu from 2y ago to run at 15fps
yeah, I have zero idea wtf Ganker is on in this thread. Sometimes it feels like Ganker reacts based on their initial emotional reaction rather than what's actually reality.
its a rule that if a game launches on Series S it can't require that much on PC either. The devs will always have the Series S settings to implement on PC too.
Like Cities: Skylines 2 with its shit perf, which only launched on PC, which enabled it to have truly bad optimization
>super old
>3 years for the oldest gpu, same as current gen consoles
>low end
>$500 gpus having to run at 540p to reach 60fps
No wonder devs have no incentive to optimise, you'll literally defend anything
No, unfortunately consoles didn't normalize upscaling, Nvidia and their fanboys did.
Consoles have been doing upscaling shit since as early as 2008 with Halo 3/ODST.
This shit we see nowadays is recent, and came about with the rise of DLSS.
The thing I hate about DLSS is that it makes characters look like melting wax figures. Granted, the environments do look great; there I can't really tell the difference
Yeah old games can have decent looking color balance in a scene
they still look fricking shit when you get to a scene that involves animation, or god forbid facial animation with lipsync
Shit, this thing is really only 4-5 years old. I mean, it's got a 2060 Super in it, but the PSU is too weak to supply power to a 30XX or 40XX series card, though it has a decent CPU in it, a Ryzen 7 3700X. My next PC is at least going to have a 30XX series card; I really see no reason to get a 40XX series card, seems a bit overkill. I'll try and get a Ryzen 9. One day I'll have a PC with the top of the line components in it, at least for a few years until new tech comes out.
5-6 years ago a 2060 Super was considered in the top three GPUs, now it's almost worth nothing in modern gaming eyes.
A 2080 was basically top of the line, so I don't know what you're getting at. If you believe someone should upgrade their PC every 2-3 years then you're delusional, I don't have that kind of cash to do that every 2-3 years, also it seems wasteful. I'd at least try to salvage what I can out of the old one. My current PC is at least worth $1200 or so, that's a lot of money for me. I can't just go and buy a $2500 PC right out of pocket like that.
>30XX series cards didn't exist 5-6 years ago
Neither did 2060S, your timeline is just fricked.
I'm sorry to hear you can't just upgrade, but pretending that a mid-high end GPU from 4 years ago is anything but low end is lying to yourself.
If not for moronic GPU prices it would be great time to build a new PC, but it is what it is.
>5-6 years ago a 2060 Super was considered in the top three GPUs, now it's almost worth nothing in modern gaming eyes.
Buying anything xx80 and xx90 was considered "future proofing" and over the top back then for almost everything released at the time, now the latest generation of those cards is recommended for all this unoptimized shit.
I spent $1200 just last fricking month and I'm already unable to run ultra.
Devs don't give a shit about PC, especially nowadays. It's all about console. No AAA dev is thinking "oh shit we have to design our game around DLSS!".
Games run the way they do now because devs don't have to make them run on a base PS4 or Xbone anymore. So they just get these turds running on a PS5/XSX at the bare minimum acceptable (for a console lol) quality and call it a day.
>"oh shit we have to design our game around DLSS!".
Maybe not the devs, but I can well see corporate thinking "AI saves us a bunch of money".
Even when you outsource to India, an hour of dev time easily costs the company $50. With just ten devs working on optimization, that's $4,000 a day. With proper specialists in America, you easily pay 10 times as much. DLSS can easily save you a six or seven figure amount of money. And it reduces your time to market.
Sure, at AAA scales that only comes down to a fraction of a percent, but that's the stuff that makes your stock price go up. Simple matter of fact is that once your market share is big enough, it's just better business to ignore quality and instead go for margins.
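The arithmetic in that post works out if you assume 8-hour days; a quick sketch (all rates and team sizes hypothetical, none of these figures come from a real studio):

```python
# Back-of-envelope dev-cost math, assuming 8-hour working days.
HOURS_PER_DAY = 8

def daily_cost(devs: int, hourly_rate: float) -> float:
    """Daily cost of a team spending full days on optimization work."""
    return devs * hourly_rate * HOURS_PER_DAY

print(daily_cost(10, 50))   # outsourced team at $50/hr: 4000.0
print(daily_cost(10, 500))  # "10 times as much" per hour: 40000.0
```

Run a schedule of months instead of days through the same function and the claimed six-to-seven-figure savings follow directly.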
It's UNREAL how SHIT a fricking engine it is. Like, we dog on Unity, but Unreal after, what, the mid 2000s? is a sign that a game will be shit, with god-awful LOD, performance, and pop-in, and a no-buy.
Unnecessary and overkill. A 3080 or 3090 is more than strong enough. You really only need a 40XX if you're doing animation work and design on your PC. The reason Alan Wake is requiring it for ULTRA performance is because it's basically a cinematic movie game with 10-15 hours of cutscenes, I imagine.
You would only need a 40XX series GPU for very specific video games. a 3080 or 3090 would most likely run everything on ultra and ray tracing as well.
I can already push my 4090 to its limits in a number of games and I plan on selling it for 1/2 the price I paid for it and getting a 5090 when the 5000 series launches in early 2025
Holy fricking shit, didn't Nvidia come out with the 30XX series and then like 4-5 months later release the 40XX series, and NOW we have a 50XX series coming out anywhere from 8-15 months from now? Good god, slow down.
I MIGHT get a 40XX series, if I can afford to get one when I get my next PC, if not I'll settle for a 30XX series, though I see absolutely no fricking reason to have a 5000 series card when literally a small percentage of people don't even have 40XX series cards.
Nvidia is moving to yearly architecture releases. They will likely follow the Intel release model of small improvements each gen but hey, bigger number more better.
>Nvidia is moving to yearly architecture releases.
Is that really necessary? Is technology advancing so rapidly in the gaming world? It's like Apple releasing a new iPhone every year but the only real improvement is the camera.
8 months ago
Anonymous
>Is that really necessary?
For their bottom line absolutely - the OEM market fricking LOVES that shit and that is a market nvidia will not ignore.
>Nvidia is moving to yearly architecture releases
no, they're not, it's the opposite in fact
they have traditionally released a new GPU generation every 2 years, but now they're moving to a 2 1/2 - 3 year gap between releases
>Nvidia is moving to yearly architecture releases
That's literally the opposite of what's happening. They're lengthening the gap between new generations, not shortening it.
I'm not necessarily upset, it's time to upgrade me and it served me well.
But I'm not upgrading for this shit.
I might let it get me through the ER DLC and any potential AC DLC that comes out. Once those games are done and anything in between I can run, I'll likely upgrade.
Doom Eternal deserves more recognition for how well optimized it is. It's the only modern game I can think of where running it at native with RT on is actually viable outside of the absolute top end GPUs. It looks 10 times better than half these smeared shit games as well.
What's up with how games used to save certain graphic effects for certain parts of a game or a cutscenes and now instead just enable everything all the time?
Workflows are way, way more complex now. When you have the performance, you don't want to make your development needlessly complex. It also gives a better feeling of coherency.
>5000 series cards
>Look it up
>It's real
2ish years between GPU/CPU generations has been the norm for the last decade or so. The odd thing with the 40 series was just how long it took to release every card - almost a year, I believe. The 50 series should have been out at the end of 2024. The AI boom screwed us over.
Only the 3090 Ti came out as recently as two years ago. It was a special case, since around the 30 series launch we had a crypto boom and chip shortage. GPUs could literally print money back then, which is why the 3090 Ti doubled in price for like 5% extra performance.
And don't think that the series alone determines the performance. The 4060 Ti and 3060 Ti are practically the same performance. Especially with the 40 series, Nvidia really screwed us over; many of those cards are actually a lower tier with a fake name and a raised price. And they actually tried to bump everything up another tier with the "4080 12GB", but had to retract it due to public backlash.
When you get your PC, head over to /pcbg/ on Ganker, they'll help you navigate the bullshit.
How old was that video? Because that's really not true anymore. Ti started at the very top, then became more and more common because people associated it with high performance.
You used to pay a premium to get the best; the flagship could cost 50% more for just 10% more power. So the best value card was midrange. But that's over now; with the 40 series, the 4090 actually had the best price/performance at launch. It's linear scaling now, no value or bargains to be had.
Also, GPU names are pure marketing. You absolutely cannot trust them; the sales/marketing teams will use any good reputation a card earned to screw you over with the successor.
I have never, ever checked a req list in my 28 years of gaming as I have been gaming on console since fricking snes days kek
enjoy upgrading your pcs every 2 years homosexuals
>90gb
>vaseline blur on everything
>2060 RTX minimum for 1080p WITH upscaling (aka 1267x712 native) on lowest settings
The frick are they using the drive space for? Did they make a full 4K TV show for this, for the in-game TVs they'll pepper about that look like old tube TVs, which would render that 4K resolution moot?
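For reference, the "1267x712 native" figure in that greentext lines up with DLSS's commonly documented per-axis scale factors: Quality mode renders at roughly 2/3 of the output resolution, Performance at 1/2. A quick sketch (factors approximate; individual titles can override them):

```python
# Approximate internal render resolutions for common DLSS modes.
# Scale factors are the commonly documented per-axis ratios; actual
# values can vary per game, so treat these as ballpark numbers.
SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra_performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(1920, 1080, "quality"))      # (1280, 720), close to the quoted 1267x712
print(internal_res(1920, 1080, "performance"))  # (960, 540), i.e. the "540p" people cite
```

So "1080p with DLSS Performance" in a requirements chart means the GPU is really shading a 960x540 image.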
I'd say from an art point of view Deus Ex Mankind Divided also looks a lot better than modern games. The character models are bad but the general art direction is great.
This. Exactly the reason why Yakuza 8 will probably be the last AAA game I ever buy, it's still being made for the PS4. After that, I just don't care anymore. I don't want to need upscaling just to run games.
That literally is more impressive than any game released in the past 5 years. Just having non-jaggy hair and liquid physics is more impressive than any AAA game.
>good hair >solid clothing detail >everything is clean about the appearance >good depth of field blur >stylized semi-realistic appearance instead of chasing photo-real
Looks better than most modern AAA.
Yakuza 6 is probably the best looking game on the PS4. Too bad it ran like shit so they downgraded their engine for all future titles. It runs so beautifully on even mid-range PCs.
RGG is at least AA and Sega is definitely a AAA publisher.
Rationally, unless the game heavily depends on spoilers or community involvement (e.g. an MMO), there is no reason to buy a game newer than three years old. On release, games are just unfinished and overpriced.
The same is true for hardware; the best you can probably do is get the previous gen flagship a bit after the new stuff releases. 3090s went as low as $500 in the middle of the year (literally every other PC component has great prices now, only GPUs suck).
I'm a neet, and i ain't dropping 300€ on a fricking GPU i'm going to have to replace in a year, because your incompetent diversity hires can't code, Remedy-shill.
>upscaling as a requirement
Jesus christ, now I know why they waited so long to reveal this.
Remember when people thought DLSS would be used to keep old cards viable for longer? That was stupid.
my 2060 is still viable. if i had bought a 1650 super or something, or some other card that was slightly more powerful but didnt have dlss i wouldnt be able to play
>or some other card that was slightly more powerful but didnt have dlss
Doesn't exist on the Nvidia side.
On the AMD side that would be an RX 6600 or RX 5700, both of which have aged far better than the 2060 did.
>both of which have aged far better than the 2060 did.
no because fsr looks A LOT worse than dlss
I'm talking about without using upscaling, anon.
It's the difference between needing to use upscaling and not needing to use upscaling.
ure gonna need to use upscaling on either of those cards to play new games
I forget the name of the principle, but essentially it goes that the amount of effort/power/cost to render something remains constant or goes up steadily, because studios and devs use better hardware as an excuse to keep pushing for increased fidelity at the cost of performance, rather than staying in one spot and focusing on optimization. It will never change either, because shiny visuals are an immediate, in-your-face marketing asset.
If a game can't run on my 1080 Ti, I just won't play it. The 1080 Ti was the last real GPU without all this extra raytracing/upscaling crap tacked on to try and sell new hardware.
Frick RTX, frick DLSS, frick FSR, frick literally all these meme technologies that do nothing but make you buy better and better hardware for worse and worse looking games.
when my 1080ti dies
i will buy another 1080ti
This. They're less than $200 on eBay now. An absolute bargain for a GPU that can run literally anything that doesn't push memetech.
Greatest GPU ever made.
Another one to the pile of "this will benefit the consumer and definitely not go right into the pockets of executives" I guess.
1080ti is the most based card to exist I loved that little bastard.
you can still play at medium settings with no ray-tracing
this is the norm for every major release nowadays, why are you surprised
uhhh sweatie it's the new normal
deal with it
more and more devs are going to use that as a crutch
>every dev is already using it as a crutch
>and has been for the past 4 years
it's too fricked now
the age of native is over, chud
>the age of native
cool, so the US can finally shove the featherBlack folk out of our country and into the ocean? Fully complete manifest destiny?
It won't be /required/, just bump those requirements up a bit if you don't want to use it.
Upscaling has gotten good enough that in most situations it makes perfect sense to render a lower-res but otherwise prettier frame and then upres it, versus using the same total resources to render a sharper frame with worse lighting etc.
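The tradeoff being described is just a pixel budget: rendering internally at ~2/3 scale per axis shades less than half the pixels, and that saved work can be spent on lighting instead. A rough sketch with illustrative resolutions:

```python
# Pixel budget: native 4K vs a ~2/3-scale internal resolution that an
# upscaler then brings back to 4K. Shaded-pixel counts only; this
# ignores the (smaller) fixed cost of the upscaling pass itself.
def pixels(w: int, h: int) -> int:
    return w * h

native = pixels(3840, 2160)    # 8,294,400 pixels shaded per frame
upscaled = pixels(2560, 1440)  # 3,686,400 pixels shaded per frame
saved = 1 - upscaled / native
print(f"{saved:.0%} of per-frame shading work freed up")  # 56%
```

That freed-up majority of the frame budget is what makes "lower-res but prettier, then upres" come out ahead in most comparisons.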
Input latency from upscaling is a big issue. I don't really care about worse internal resolution/lighting, but going from a 4070 to a 4070 ti/4080 or a 5800X3D to a 7800X3D is a significant cost to circumvent bad optimization and run games without it.
Unironically told Black folk this would happen. If quirky 2D slop indie games are asking and pleading for a GTX 980 equivalent and an i5, then AAAs are going to REVEL in this shit. They no longer have to even bare-minimum try.
>Hire the worst h1b hello sars coders imaginable for the lowest fricking rates
>Literally never have to optimize, or listen to "fanbase" complaints as long as they keep buying it
>Just upscale it bro, just fill in the missing frames bro becomes standard
>ADD zoomtwats who stare at a tiny screen all day don't and won't notice the smeary piece of slow shit when fortnite 2 and AsscreeSpihman comes out because their vidyogabes are all colorful jingly keys to them anyway
welcome to Jensens future
The real question is if it's WORTH playing Alan Wake II
Remedy are one of the only devs left who make games with soul so yes
>sold out to mircosoft
>sold out to epic
>soul
lol
Frick no. Quantum Break was mediocre, Control was shit, and they replaced Alan Wake with a black female character. It's completely over. Modern Remedy sucks.
Control was fricking good you weirdo.
And although I am pissed about the black chick, I hear the game is actually split 50/50 between Alan and her.
Lady has a mental breakdown and the kitchen appliances are to blame; also nobody likes her at work.
But boss but because
No it really wasn't. It was boring as hell. Neither the combat nor story were engaging at all.
You must be braindead downie, control was total fricking garbage. That anon is correct, modern remedy or remshitdy as I like to refer to them are sellout shitters that make shit games for plebs like yourself.
>control was total fricking garbage
Tell me how that last level was good in any way you shit eating homosexual?
>middling last 5 minutes of the game
>"REE TOTAL FRICKING GARBAGE111!!"
You sound like some tribal Fromdrone or novidya homosexual that never even touched the game. Control was far from perfect; it has some faults, among which the ending was a primary one, and not getting to explore dimensions with the Projector another. But it was a cool game and a fun experience for the most part. You gays trying to wage war on it on Mongolian basket weaving forums is just static background noise no one listens to.
What's fun about killing the same lame enemies over and over with a shit gun? You work for Remedy lmao, it's the only way you'd write such trash defending one of the worst games released, besides muh ray tracing...
>can only nitpick the last few minutes of a massive game to prove a point
Imagine this anons sad existence
Lol nah, piss off you fricking autistic, sperging queer.
Imagine unironically defending a soulless corporation like Valve as hard as that anon, when they don't even make games.
What the frick are you talking about
That anon was pissing and shitting himself about Remedy being "sellouts" aka, taking Timmy's exclusivity.
You literally curl off shitlogs into your hand and then smash them into your mouth, anon. You're a controlgay, aka a shit-eating shitlord.
Seethe you fricking queer keep getting mad over something that isn't important lmao
Controlgays make me LAUGH at how pathetic they are lol enjoying that shiny floor homosexual? Woah....shiny....
>I hear the game is actually split ,50/50 between Alan and her
that's still too much not-Alan Wake screentime
I dropped Control cause the narrative and main character was boring me to tears. The gameplay was neat though.
>spamming the overpowered throw through the entire game while wandering through identical grey corridors
>good
I trudged through that fricking turd to get to more Alan Wake story, only for the game to completely demystify it by going "yeah it was just a haunted typewriter all along". The entire game is like that: modern SCP-tier writing trapped in a game with the color palette and ugly, uninspired design of a hospital ward. Even the combat is a slog of boring guns and abilities that get rendered totally pointless when everything is a slow, tanky bullet sponge unless you relentlessly use the single only attack that does anything. Get some fricking dignity and stop praising such contemptible shitware so Remedy can get back to making games for people with actual standards.
You summed up my experience with Control nicely
>Control was fricking good you weirdo
>replaced
so all the rest of this post is a lie too, right?
>Quantum Break was mediocre
it was one of their best games, i liked it better than control for sure
Big download
Gay actor
>Big download
>remedy makes a game that's also half-movie
to be expected
>gay actor
which one? doesn't affect the gameplay
qb's late-game combat is a fricking blast but it takes a while to get there, that's its biggest sin
The protag is gay irl and an activist
that sucks, lots of people liked him from animorphs
i don't know how either of those points make the game itself mediocre though
If you knew how much of an idiot he is...
You wouldn't like seeing his face either.
You just can't stand him, and it ruins immersion
I prefer my games to not have real people, and even voice actors are better when they're nobodies.
Well you're gonna see him again.
I won't I don't torture myself with AAA these days
Went full retro and I'm happy
>Gay actor
So? Neil Patrick Harris is gay and he's a pretty cool guy.
You sure about that?
Because I've never met a cool Lefty
Literally vax shill
>antivax in 2023
Go inhale some polio if you're so confident in your own immune system
Oh wait you can't because vaccines deleted it in the first world
My gf died while taking it frick off
Montichiari Italy she was 27 and healthy
>Go inhale some polio if you're so confident in your own immune system
>Oh wait you can't because vaccines deleted it in the first world
schizo
>Go inhale some polio if you're so confident in your own immune system
>Oh wait you can't because vaccines deleted it in the first world
>schizo
Meant for
Polio was fake and gay, it was DDT poisoning. "Polio" disappeared around the time DDT was discontinued and banned. It's easier to go OOPS a virus out of NOWHERE happened than to pay out settlements and insurance claims for chemical company frickery.
>Frick no. Quantum Break was mediocre, Control was shit, and they replaced Alan Wake with a black female character. It's completely over. Modern Remedy sucks.
Quantum Break > Control > Alan Wake
fresh out of slopfen made in sloppistan
>4070 with PERFORMANCE DLSS for 60 FPS
frick off Black folk I aint buying shit
The 4070 is an obsolete card now.
Path tracing on or something?
B…but anons said that 40xx-series is future proof!!
games cost $70 now and somehow optimization has gotten worse
It didn't. It's just optimized for ripping off morons with too much money.
Nobody would upgrade their hardware if shit was optimized properly. The whole industry is in cahoots.
So the deck has no chance
Isn't this game literally corridor horror? What the actual frick is going on?
no it'll be out in big open woods, just like the first game
Requiring a 3070 to run the game at 1080p60 is a sad state of affairs. Have they ever looked at steam hardware survey?
looking at the image it seems to imply that it's not even 1080p, it's 1080p DLSS performance, lol.
PC players btfo
I doubt it will run much better on console. Locked to 30 fps.
Good. More PC games should make it so you can't be poor to play their titles. Graphics and games as a whole are held back by consoles and low end PCs.
You guys don’t have a 4080? You know it’s 2023, right??
I thought this was ok until I saw DLSS was turned on in every requirement
After how hard they pushed DLSS in Control I'd be amazed if AMD cards can even run it properly. Even Control on a fricking RTX 4090 at 4k with DLSS off only gets 70 fps.
No but it'll run on my PS5. Replayed AW1 and American Nightmare earlier this year, played Control and just finished the AWE expansion today. Incredibly hyped for this game. Pointless trying to discuss it on Ganker because
>added a black woman
>epic exclusive
But for me it will likely be GOTY.
I’m gonna play it too, but GotY? I’ll be pumped if it gets over a 75 metacritic
Previews have all been incredibly positive and the small amount of footage I've allowed myself to watch has looked great. I have faith in Sam Lake and the rest of Remedy.
you sound trans
Of course the LA mafia will review bomb it, but people with actual fricking taste will recognize it as KINO.
>fake casey keeps dying and coming back to threaten alan at gunpoint
Kinda funny
WHACK HIM
I love how even in the silhouette shot you can tell it's Max Payne.
>GOTY
I'm very hyped too anon. I played Alan Wake for the first time since release on ps plus, and enjoyed it thoroughly, way more fun than I remember it being.
Encouraged me to play through Control+dlcs and quantum break (meh) and am eagerly anticipating next weekend
I'm really interested in the game, but I hope it has three things
>good enemy variety
>good weapon variety
>interesting combat locations
These have been pain points for a lot of Remedy's games even if I do like them. Control was really disappointing with its enemies choices considering it feels inspired by SCP.
>Replayed AW1 and American Nightmare earlier this year, played Control and just finished the AWE expansion today.
I haven't even played Alan Wake 1, or even any Remedy game, and I also think that it'll be GOTY. That's how special it looks, to draw in somebody like me who previously couldn't have cared less... man, that IGN gameplay review was a godsend
>not calling it "Woke II"
Missed opportunity
>tfw 3070 and Ryzen 7 3700x @1440p
My time is nearing its end.
>You need a 3070 to play at 540p 60fps
the absolute fricking state of this industry
Looks like my 3070 is just BARELY hanging on for 1440p.
I fricking hate how shit this gpu got so fast, or I hate pc optimization more i guess.
its not bad pc optimization. ur card is almost 2x as powerful as the xbox series x gpu. it runs at 30fps. you run at 60fps. its not "bad pc optimization"
IS IT REALLY 30 FPS ON SERIES X?
FRICKING L O L
the same is true for most of these games that are accused of having bad pc optimization. like starfield: it is a very demanding game on both pc and console, and runs at 30 fps on series x.
this is not a new phenomenon. gta 4 is an old example of a game that runs like shit on pc, but even worse on console (although the pc version was broken on release, it got patched well)
I can somewhat understand Starfield because of how much data it has to process and the unreliable engine but Remedy used to be incredibly good with their own tech in Max Payne 1 and 2.
fair enough, but are they still using an inhouse engine?
Yes, it's an updated version of the engine they used for Control. Remedy always build their tech in house. It'll be a sad day if/when they decide to jump to UE.
As far as I know they're still using their "Northlight" engine which was also used for Control and Quantum Break. Prior to that was the AWE for Alan Wake and MaxFX for both Max Payne games.
Series X is like a 2070 equivalent, no shit it's 30FPS, expect 30 to become the norm for consoles now that we're out of the cross-gen period
this happens every gen. snoy and microshit promises 60 fps. a couple of first gen games on the new console run at 60, cross gen games run at 60, remasters run at 60. about 2-3 years in virtually all games run at 30
its been this way since the PS2 gen. mgs2 looked amazing and ran at a smooth 60fps. mgs3 was capped at 30 but most of the time it was more like 20 fps
I dunno what you're talking about, there is a 60fps mode on consoles. Remains to be seen how bad image quality is when using it though.
They said performance mode and don't expect a solid 60fps on Series S
>I dunno what you're talking about, there is a 60fps mode on consoles.
Yeah by cranking down resolution and settings, the same fricking thing you can easily do on PC
It has both 30 fps and 60 fps modes
>its not "bad pc optimization"
>3070 recommended for 1 0 8 0 P
>DLSS performance ( 5 4 0 P)
>its not "bad pc optimization"
you're a cuck in real life too ?
It's not it's 0 optimization
b-but the graphics are amazing...
>b-but the graphics are amazing...
cool last gen game
The material quality of the FBI jacket looks pretty good
This looks barely any better than Control and I played that with like 45-60 FPS on a 970 with a garbage piledriver CPU.
Optimization is truly a lost art.
The games industry has suffered a complete brain drain since the scamdemic.
its even worse on console. the game is either unoptimized in general, or it is just using very demanding graphical effects. very very few games run fine on console and have "bad pc optimization"
>3070 is almost 2x as powerful as the xbox series x gpu
lol
>These reqs
>EGS only
This is gonna flop like a motherfricker on pc
It's going to flop everywhere. It's been so long since the last game and it was Xbox 360 exclusive at release for 2 years before it came to PC and then like a decade after that for a shitty remaster on modern consoles. No one is starving for games at this point of the year either.
>RTX 2060 (is the minimum)
I'm going to cry
yes for rtx max even
t. 4080 owner
>pay $600 for a 4070 to play at 60 fps at 1440p with fake frames aka DLSS
>at 4k with gay tracing you need a $1000 GPU and dlss to achieve 60 fps
so tired of this garbage industry
>cant even pass the minimum
Almost as woke as Baldur's Gay 3 so it will do really well. homosexuals will fellate themselves. trannies will spontaneously regrow their penises so they can frick the frog thing. It will be a veritable utopia!
Is your taste bad enough to run alan wake 2 on your computer?
Yes but I can't
Good enough to run it on high but I'm not supporting this dogshit.
I’ll probably just watch it or play it much later with patches since I’m saving up for new parts
to get 60fps with a 3070 you need to be playing with DLSS performance at 1080p
So meaning literally at like 560p. A 3070 needs to run this game at below 720p to get 60 frames. Damn, I love gaming so much, bros... thank god THIS is where we're at.
Why did he do it?
>It's not a Jarvi
>It's an ocean
>hes behind me, isnt he?
Sami and Remedy really went to shit. It was good while it lasted.
I will only concede defeat after the 27th
This is really disappointing. I wasn't expecting it to run on a toaster but 560p on a 3070 to get 60fps is just tragic.
as much as I want to play this day 1, I think I'll wait it out a bit and see if they patch it to run better, this seems insane.
Won't happen. Control uses DLSS as a serious crutch to this day.
nice, something to put my 4090 through its paces 🙂
The game looks pretty incredible so its not surprising that its demanding, but it is surprising that it is THIS demanding
>rains droplets on a third person camera
Why? That's just annoying
>rain drops on a first person camera
Still doesn't make sense.
oh so there's a physical camera behind Alan
Got it
On screen rain drops never make sense in any situation. Only makes sense on actual transparent glass or plastic surfaces your character is looking through. Same with lens flares that don't happen in human eyes, only in cameras.
Eh, gives the game more of a cinematic feel. I honestly kinda like it, but I can see it being perceived as "being in the way" or something.
i dont want to pretend im watching a movie in goynema. im playing a game
Control was still pretty good looking even on low settings, i hope that's the case here too. Not that i'm not expecting terrible performance anyways.
Looks like a slightly better RE2 Remake
This doesn't look like anything that couldn't have been done on the PS4.
Really? Can you show screenshot of a PS4 game that has similar level of visual fidelity?
Well, I accept your concession.
goddamn that is super fricking blurry
That's because it's a PS4 screencap upscaled from 900p to 4k.
For comparison, here's the PS4 Pro version, which I think runs at 1440p or thereabouts.
Why is your image so blurry? Well, I know the answer but goddamn.
RDR2 looks better than this, runs better than this and is open world. There's no excuse, especially with RE4make coming out the same year.
RDR2 looks incredible because it has god tier art direction and lighting and attention to detail. Up close it doesn't really hold up in a way a next gen title does.
I'm not arguing against you, I think RDR2 remains the most beautiful game of all time, but if you get too close to anything the low res textures become pretty apparent.
looks about the same as re:make2 and that runs on everything that has a spinning fan
Looks just like Arkham Knight without a massive open world
>a 3070 to run the game at 60fps in 1080p/Medium with Performance FSR turned on
What the frick is going on with the vidya industry?
uh, nothing...
Upscaling was a mistake.
This. I knew this was going to happen the first time I saw DLSS (Think it was FFXV for Windows and it was DLSS 1 that only worked at 4K resolution or higher)
"It's just for 4K" they said.
"It'll make cards last longer" they said.
It just makes devs get away with being lazy.
Remember, anyone who dislikes dlss or points out any flaws MUST be an AMD fanboy because why else would anyone speak bad about our lord and master, Nvidia!?
FSR sucks too. Upscaling shouldn't be needed to run games on any GPU from the last 3 gens. It's a nice idea for running modern games on really old GPUs, but to require it at all levels? Frick that.
I will never buy a game that requires DLSS to run on current gen cards. Another company to add to the blacklist.
>1080p medium settings DLSS Performance (540p) without ray tracing to run at 60 fps on 3070
>Ryzen 3700X
Excuse me?
That sounds worse than Starfield.
kek they really went full on RTX didn't they after control, i mean i pass the requirements but come on.
>mandatory DLSS/FSR
This has to be a joke. The game doesn't even have good graphics. It looks way worse than say, RDR2, which came out FIVE YEARS AGO and ran on a base PS4.
Imagine the GTA6 PC requirements whenever that drops.
>play on console
>it just works
Yeah, at 20fps 720p.
You just know that Digital Foundry will shill it as "true next gen" experience.
For some reason, everyone treats Remedy like a holy cow.
Truly the most overshilled studio.
Actually Digital Foundry made quite a lot of fun of Control for performance. They even developed their own "corridor of doom" stress test for a specific corridor in the game where the framerate gets inexplicably raped.
>RAM: 16GB
I might actually still be in the running
>Medium 1080p
>RTX 3070/ RX 6700 XT
>DLSS/FSR2 PERFORMANCE
Holy shit, this is bad.
Name one good looking game released this year with decent optimization. Go.
>Resident Evil 4 Rem-
Don't you fricking dare say that.
ake
>what is 2+2
>and don't say 4
Hi-Fi Rush
HiFi Rush has pretty basic graphics though. I will admit it runs incredibly well.
>HiFi Rush has pretty basic graphics though
You weren't talking about "complex" graphics.
HiFi Rush and Clash: Artifacts of Chaos, simply because they went for style over all that Nanite/Lumen/Photogrammetry/Face-scanning/Debra Wilson shit.
It worked
Exactly and you also know that it's HFR just from looking at it.
Meanwhile I wanted to google a screenshot for Immortals of Aveum as "this unremarkable shit runs like garbage" example and the game is already so forgotten that when I finished typing in "Immortals of" the suggestions were entirely for Immortals of Meluha from 2010 and I honestly don't even know what that is.
Lies of P
Dead Island 2
Witchfire
Armored Core 6
>Lies of P
True, true
Shocked nobody said Atomic Heart
Lies of P
AC6
RE4R
Very based, finally 1060 gays will stop lying about how their GPU is playing everything "just fine".
should be good enough
I was so fricking excited for this game, this sucks.
CRT gays are eating good, 1024x768 is still higher internal resolution than 1080p DLSS perf. KEK.
Alan Wake was so boring
Buy Energizer batteries.
>60fps across the board
If you're still playing at 60fps in 2023 I just feel bad for you
Ganker was right about upscaling and its consequences for mankind
I'm sorry for laughing
What a bizarre and confusing way to show the specs
why is "low" resolution 1080p albeit
It's not 1080p, notice the upscaling recommendation, upscaling at quality is actually 720p
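For reference, those internal resolutions follow from the per-axis upscaler scale factors (roughly 0.667 for Quality, 0.58 for Balanced, 0.5 for Performance — the commonly published DLSS/FSR2 values); a minimal sketch:

```python
# Approximate per-axis scale factors for common DLSS/FSR2 presets
# (Quality ~0.667, Balanced ~0.58, Performance 0.5).
DLSS_SCALES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_resolution(width, height, preset):
    """Return the internal render resolution for a given upscaler preset."""
    s = DLSS_SCALES[preset]
    return round(width * s), round(height * s)

print(internal_resolution(1920, 1080, "quality"))      # (1280, 720)
print(internal_resolution(1920, 1080, "performance"))  # (960, 540)
```

So a 1080p output with Quality upscaling really is rendering at 720p, and Performance drops it to 540p.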
>morons ITT blaming upscaling
If DLSS and FSR didn't exist you'd simply have to play at the same lower resolutions but without upscaling.
You are crazy if you think developers would bother optimizing, especially on PC. The reason they're not doing it is simple: they don't have to take PS4/Xbone in mind anymore and just let PS5/nu-Xbox brute force the game, consequences be damned.
Developers take the path of least resistance. Optimising the game probably takes 3-4 months of going through the code to rewrite it more efficiently and optimising the models' LoDs to scrape a few frames here and there.
Now they just run it once with DLSS and if it’s not catastrophic they just release it.
DLSS/FSR required = NO BUY
how would it be required?
It literally is when a 3070, a high end GPU that costs $500 (same as a PS5/SeX) can't hit 60fps 1080p on low settings without it.
its figuratively required for the performance boost it gives you
but given its a slow game you can easily do with 30 fps
No, have a nice day.
1080p30 on a $500 GPU without even using RT is dogshit.
turn setting down then
Fricking
Kill
Yourself
buy a new gpu
have a nice day.
Damn you're a real slop hog. Aren't ya chief, mmm yummy slop.
what do you even want?
See
That gif doesn't really work for EGS exclusives
Devs don't give a shit about PC, especially nowadays. It's all about console. No AAA dev is thinking "oh shit we have to design our game around DLSS!".
Games run the way they do now because devs don't have to make them run on a base PS4 or Xbone anymore. So they just get these turds running on a PS5/XSX at the bare minimum acceptable (for a console lol) quality and call it a day.
What's with all these horribly optimized games coming out recently?
Most of these games can basically double their player base if they optimized a little more, it doesn't even make sense financially.
See
You are seeing some games run at 720p internally on PS5/XSX. A lot of games in the 900p to 1080p range.
Before they'd target 720p-1080p for Xbone and PS4, and so the games would run at much higher res and fps on PS5/XSX. Now that the old consoles are not developed for anymore, those resolution targets move on to the PS5/XSX. No further optimizations needed. Especially since most games are developed with consoles in mind, devs just don't give a shit.
People don't want to talk about this but the hardware israelites do this on purpose to make mustardracegays buy new GPUs every year. That and new devs are so useless they don't know how to optimize games.
>unoptimized ESG slop as permanent EGS exclusive
not even gonna waste time pirating this trash
>90 GB
Dropped
Exactly, this means the game is short as frick.
....
:^)
>60FPS DLSS PERFORMANCE
HAHAHAH FRICKING have a nice day RN
I love that more powerful hardware is easily accessible for the vast majority of people these days, but man do I miss the wizard tier shit that devs had to do to make older hardware really work. N64, the PS1, PS2, etc. Devs had to basically make a pact with demons to squeeze every last bit of performance out of a console's hardware. Now that RAM is measured in GB instead of KB/MB, and 8+ core CPUs are standard, it seems like devs just got fricking lazy.
Even former "tech wizards" are shitting the bed these days.
Just look at Croteam and their upcoming Talos Principle 2.
The fricking title screen image that just moves around a bit is a video file that's half a gig in size, the same as the first Serious Sam game, and the game runs like ass in comparison to what's been presented thanks to the usual UE5 issues.
And to top it all off, it's 100 fricking Gigabytes which is bigger than all of their previous games combined, including SS4 that was already considered obscene at 40 - all this for a goddamn puzzle game.
I think it's a combination of being forced to adhere to crazy release schedules, plus they used to build their own in-house engines. A lot of games use pre-made engines now, so they have to deal with the bloat inherent to the engine. My favorite example is how they managed to put Resident Evil 2 on the N64. Fricking magic
Blame the publishers, games are just released on the schedule in beta state. Devs clearly know the state of their games but they have to launch it anyway.
Even a month or two of extra polish for 100+ team is a lot of money, even if they hire pajeets.
>1080 60fps is more demanding than 1440 30fps
>mfw I'm still on 1080p
Maybe true, but kek anyway. This bottleneck is great, otherwise they would be pushing 8k right now
ignoring bottlenecks, 1440p 30fps should be roughly as demanding as 1080p 60fps, because you're rendering ~1.78x the pixels per frame at half the framerate
Agreed.
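The claim is easy to sanity-check by comparing raw pixel throughput (a rough proxy that ignores per-frame fixed costs like CPU and draw-call overhead):

```python
def pixel_throughput(width, height, fps):
    """Pixels rendered per second at a given resolution and framerate."""
    return width * height * fps

qhd_30 = pixel_throughput(2560, 1440, 30)  # 110,592,000 px/s
fhd_60 = pixel_throughput(1920, 1080, 60)  # 124,416,000 px/s

# 1440p has ~1.78x the pixels per frame, so at half the framerate
# the two workloads land within ~12% of each other.
print(fhd_60 / qhd_30)  # 1.125
```

So 1080p60 actually pushes slightly more pixels per second than 1440p30, which is why the flipped requirement tiers look odd.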
I find it hilarious for AAA to push these specs (poor optimization of them), when a 3060, an almost $400 video card, is barely fit for minimum requirements?
Also, wtf is the le justification for this bullshit? Is Alan Wake streaming a whole city or millions of NPCs at a time? Gameplay-wise this is not more complex than the fricking original RE4
Evidently not but my PS5 is.
How can the PC mustard race cope with the current state of the industry?
>My rig uses slightly less upscaling than your console
Isn't really that much of an own
Personally i couldnt give a lesser frick about the "current state of the industry" because im not goyim. PC is the best platform because it actually lets you access all the gaming greats while consoles barely give a shit about the libraries that made them what they are
Unrelated, but a shitton of PC gamers still play their games on HDDs. Why is that?
Low IQ, no other explanation.
For many years it was standard to boot from SSD and have games on HDD for the larger storage. When SSD prices fell some people had a boot drive SSD and a game drive SSD. These days NVMe is so cheap it's becoming more standard for boot drive and game drive. The limiting factor now is that motherboards typically only have 1-2 NVMe M.2 slots.
It's hard to notice the difference between SATA SSD and NVME SSD in gaymes.
Access times are the most important not maximum bandwidth.
The gains with NVMe are small but the cost per GB is lower so you might as well go with the cheaper faster option.
Sure, but 4TB and higher capacity SSDs are still overpriced so you can consider multiple smaller ones instead.
My motherboard has only two m.2 slots so I still have to use SATA for HDDs and SSDs for hoarding.
You can get 2TB NVMes now for $120 and still have a massive HDD for hoarding other content like video.
I have other HDDs outside of my PC as well.
Small SSD is for linooks dual boot.
Optane makes HDDs feel like SSDs
HDDs are still fine for all the smaller games with exceptions.
Shame there aren't many SSHDs being made. I liked my hybrid, but the main downside is that it sometimes isn't even utilised properly.
>remedy fan
>big alan wake fan
>big survival horror fan
>only person I know who was hyped for the game
>not sure I can even run it at a stable 30fps
>epic exclusive
Do they not care that their crazy graffix and moronic requirements lose them sales? would they rather have the prettiest game or the cash in my pocket?
I hope they got A LOT of money to sell it exclusively on EGS because they won't sell any copies on PC.
it's less that they got a lot of money and more that they got to make the game whilst retaining ownership of the IP
people can hate on EGS, but it's undeniably a good publishing arrangement for Remedy, especially after they lost the rights to Max Payne and Quantum Break with previous publishers
I know this is only timed exclusivity the same way as Control.
I don't hate EGS any more than I hate Steam. I just think the vast majority of PC gaymers use Steam exclusively, and even those who use EGS will think twice about buying this game after recent UE5 game launches.
Epic paid for Control exclusivity
Epic paid all development costs for Alan Wake 2
apples to oranges, it's never coming to steam
I wish good luck to Timmy then.
Valve fanboys are real.
>I know this is only timed exclusivity the same way as Control.
I give them two years before they change their tune.
Epic isn't doing so great right now.
On the flip side, Alan Wake Remastered never gave Remedy a dime outside of covering dev costs. I expect AW2 to play out the same way.
It'll eventually come to Steam whenever Remedy buys back rights to the game, but you're talking about an extremely long wait (as in >5 years).
it's not timed exclusivity, Epic are the publisher for Alan Wake 2, whereas Remedy self published Control and had a timed exclusivity agreement with Epic
AW2 will never come to Steam
Never is a very long time. They can still make AW2 Definitive edition or some other shit and launch it on steam.
I forgot to mention, I know they are using their own engine but I was thinking more about high hardware requirements similar to UE5.
>lose them sales
They already got paid. Same thing happened with Control exclusivity.
Dumb question but will a 3080 10GB run this ok at 4K DLSS no rt. I don’t know how well my gpu compares with a 4080
maybe at low or medium settings
it says there you need a 4070 for high settings 4K dlss performance
Im not sure what the poorgays are worried about
if you spend $500 on a used 3080 you could run the game at ultra without pt at 1440p
???
>just spend $500 for one game youre on the fence about, for a gpu someone else has had sex with. And then also replace most of your pc as well because of compatibility issues
maybe an xbox series S is a better fit for u
it's pc fats too so you know someone has actually hotglued that 3080
>semen cooking in the fins when it hits max temps
conceptualize the odor
You would have to be completely fricking moronic to spend anything more than $250-$300 on a used GPU.
maybe an xbox series S is a better fit for you
PC gaming is dead.
it doesn't need to be, it isn't out on pc yet, since it was delayed...
real shame
Quantum Break was 5/10 and Control was 7/10. No expectations for this one and the sheboon doesn't help
I don't get why they never put QB on a decent sale though. They routinely give away Control with all the dlc for like 10 bucks, which is tbqh a pretty good deal, but a mid game like QB is still like 15 bucks on Xmas sales, like a decade later.
>no high 1080p shown to confuse people with ultra gpu requirements
disingenuous sellout fricks
>tfw 980 Ti/i7 4790K
I hate pretty females
>is your computer good enough to run alan wake 2?
i5-4670, GTX 970, 16GB and MX500 sata SSD master race reporting for duty.
Oh no...
Don't worry, there's not much gameplay in modern games so as long as my PC can run 4K YouTube videos I'm fine.
>Requirements account for the DLSS/FSR mode you've got to select to play
The gaming industry is so fricking far gone it's not even funny anymore, they're literally expecting me to drop 2k€ for a fricking GPU just to play this year's games.
Bold of you to assume it won't need DLSS balanced to achieve 60fps
They're literally saying you've got to enable performance mode on ultra to play at 4k 60fps on a 4080.
An old man sold me his son's 2080S for a dozen bucks. He didn't know what it was.
LESGO AN ALAN WAKE 2 THREAD
anyone have any guesses whether this shit will work on steam deck?
Just take a look at those requirements in the OP, friend, that should tell you everything you need to know.
Valve will sometimes do magic to optimize games for steam deck...hopium. This is literally the only game out of this entire year that I care about...
Dude...
>anyone have any guesses whether this shit will work on steam deck?
You have a lot of optimism
How did they frick it up so bad? I can play RDR2 at 4k ultra settings with a 4070 and get mostly 60fps. It also looks way better. This is not justifiable.
Yes of course, do I look like a poorgay who can't even afford 12GB of ram or even a budget 4070? LMAO
Surprisingly modest CPU requirement with how CPU heavy a lot of recent AAA releases have been
>60 fps with DLSS on
>on a 4070
What madness is this? Why would I want to turn on that crap? Why can't a very high end pc I bought last year natively run a game at 100+ fps anymore?
because all games are made by shitskins now
welcome to a new age of gaming
because it's the new generation, you guys wanted no more cross generation games so here you fricking go
upgrade or perish
>Upgrade
>Perish anyway
>upgrade or perish
Black person I have a fricking high end pc, there is barely anything left to upgrade.
>you guys wanted no more cross generation games so here you fricking go
>you guys
Ah, so you are that special snowflake who is sooooo not like everyone else.
>there is barely anything left to upgrade
b***h you have a 4070 what the frick are you talking about nothing left to upgrade LMAO
I never said anywhere that I have a 4070, you just assumed because you didn't actually read what I typed.
Dumb homosexual.
It's easy: If a game is released for PS4 (even at shit quality) it will run on any reasonable gaming PC at 60fps+ with no upscaling crap.
Only buy games that release for last-gen consoles and you'll never have this problem.
3060 bros... I don't feel so good...
Why?
Repeat after me
>Human eyes can't see more than 24FPS.
>XX60
Deserved.
>epic exclusive
>asinine system reqs
Enjoy selling frick all for copies kek
Yes and I will pirate it lol
Assuming there is no drm
>buy 7900xtx
>use it to play decade old games at 4k 120
Feels good man.
thats a lot of incorrectly lit pixels coming at you very fast
das it mane
AHHHHH, why can't modern games have clean air. I'm tired of the smog everywhere.
but muh volumetric fog
NO!
Give me my beautiful games back.
Not every game is taking place in Silent Hill.
I haven't played far cry 3 but it's probably the better game here
that game is 11 years old and still has a modern look to it
really pathetic how little graphics have improved
in the past every year was a revolution, now there's barely any difference in 3 or 5 - and even something a decade old can hold up fine
I wonder if switch 2 having all games be upscaled via dlss makes the morons switch their tune about it
What does that have to do with PC?
No.
>1080p with DLSS set to performance
I tried DLSS in Death Stranding for shits and giggles (because that game's actually optimized well) and I'll never use it again. Made everything more than 10ft away blurry as vaseline and I only went from getting 120fps to 135fps. Shit technology that only looks good in super compressed YouTube videos.
It's designed for 4K displays. 1440p is good with it set to quality only. At 1080p you shouldn't need it and definitely shouldn't use it.
Then it should stay at 4K, instead we have devs requiring it to hit 60fps on 4000 series cards at 1080p medium settings.
DLSS1 could only be enabled at 4K and above. Nvidia should've kept that limitation so we wouldn't be in this situation today.
that I agree with. They've clearly given little to no attention to the low-medium graphics options if you need a 3070 to run it at medium 720p.
Not really, it's designed for high ppi displays.
If you have a ppi high enough for the viewing distance you're at, you can use it.
You can absolutely use DLSS Balanced on a typical 1080p laptop display. Even Performance on a small one (14 in).
you are wrong. for it to work well it needs a decently high amount of pixels to upscale from. on a 4K display DLSS Performance is upscaling from 1080p, so the internal render already looks pretty good. But when you only have a 1080p display it's having to upscale from ~800p or even 720p, which looks like shit. It has to work a lot harder to produce a decent looking image but it's always going to result in artefacts & ghosting. it's why so many people think DLSS is shit, because they're using it wrong.
It's all about ppi. Everything is proportional.
Yes it extrapolates less information at lower resolutions, but it also has to fill a lower resolution display. A 1080p panel at 15.6 in looks just like a 32 in 4K panel.
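The 15.6 in / 32 in comparison checks out on paper; a minimal ppi sketch (diagonal resolution over diagonal size):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density (pixels per inch) from a display's resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(1920, 1080, 15.6))  # ~141 ppi: 15.6 in laptop 1080p panel
print(ppi(3840, 2160, 32))    # ~138 ppi: 32 in 4K desktop panel
```

Both panels land around 140 ppi, so per-pixel artifacts occupy roughly the same physical size on each.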
you're right about ppi but that isn't really what we're talking about here. Just because you have a high ppi display doesn't mean a 700MB YIFY movie is going to look good on it.
So does my 2060. Still not going to buy it. Frick devs that rely on the upscaling crutch.
you need a 4090 to see this
God, this game is going to be so good isn't it...the level of detail in all the environments man
did you copy and paste this from a youtube comment
>alan wake game where you barely play as him
>EGS exclusive
>requires a high end PC to run it at 60FPS even though it probably looks like blurry TAA shit
go on lad this pirated copy is on him
I dont get complaining about a late 2023 AAA game requiring a 300€ gpu to run at medium settings 1440p
Like, those with shit PCs already know they can't play AAA games. It doesn't surprise them.
Except you aren't running 1440p medium settings at 60FPS with 300€ GPU in this gayme.
It's 540p 60.
thank god for dlss to make it look good
DLSS is nice but it can't perform miracles. What is the point of all this fancy tech if you get blurry mess as end result. All this fine detail and hard work will get turned into mush.
not really, it works really well at 1440p
960p -> 1440p is decent, but balanced and performance 1440p DLSS look like crap. 540p -> 1080p will be just vomit all over the screen.
What's the point of spending all this money to make a fancy looking game if:
>On consoles it will run at 540p 30fps so no one will be able to appreciate the graphics anyway
>Most PC gamers won't even be able to play it
>The ones who can will choose not to do so because you took Timmy's EGS bribe
I do not understand.
so that it looks good in nvidia's marketing. that's why they throw money at these devs to use their RTX tech.
Why not just use bullshots?
It disgusts me that games are made with DLSS and FSR as mandatory. It gives devs an excuse to make 0 effort to optimize their fricking game.
They're not mandatory. If DLSS didn't exist they'd run the exact same, just without any upscaling to help the image quality.
Anon, are you moronic?
Do you not know how DLSS works?
Are (you) moronic? In what world is DLSS mandatory? It's a completely optional feature. Games would run the same way they are running now if DLSS didn't exist, the only difference is they'd look worse.
Protip: AAA games are designed around consoles, not PCs.
>inb4 my 6gb 1060 runs it just fine
Can't wait, because I know those posts will show up.
>Summer Glau never did a properly topless scene when she was in her prime
why even live
It's better this way, trust me anon
🙁
god i love her dress so much
Not this time, bro, not this time.
y-you're lying. I love my 1060!
Why does Control filter so many plebs? It's an extremely well made game.
>reee upscaling is standard now
yeah just like shaders. and cubemaps. and ambient occlusion. and screen space reflections. and LOD. and
Those are meant to enhance graphics, not literally brute force dogshit you out of a proper native resolution as the screen seizures from blurry to clear
no, they're techniques to produce a better image without negatively impacting performance too much. pretty much every single aspect of raster graphics fits that bill, they're tricks and hacks. Upscaling is just another tool in that arsenal, why wouldn't devs use it?
Except those make my game look better.
Upscaling makes my game look worse and is only necessary because performance is shit.
>Upscaling makes my game look worse
Now compare 4k DLSS performance to 4k native you stupid c**t.
apples and oranges. nobody is stopping you from running your games native.
You're a fricking moron, I hope you get raped to death.
Fricking ESL shills like you don't deserve to continue drawing breath on this earth.
aww poor baby cant listen to reason
>You're a fricking moron, I hope you get raped to death.
>Fricking ESL shills like you don't deserve to continue drawing breath on this earth.
>Disingenuous fricking israelite.
>They STOP optimizing so you CAN'T run it native regardless of hardware, because they can and WILL crutch on this. You literally have no choice.
schizophrenia
>Nobody is stopping you : )
Disingenuous fricking israelite.
They STOP optimizing so you CAN'T run it native regardless of hardware, because they can and WILL crutch on this. You literally have no choice.
>duuuude just optimise :^)
you have literally no idea what that even means
Neither do fricking developers after 2016 apparently.
AAA video games in 2015 look better than ANY AAA title today and they ran on hardware 10x less powerful with no upscaling.
post a screenshot of one example.
see here's where these arguments fall apart because this game uses fully baked lighting while most games today use fully dynamic lighting. You cant compare the two. The Last of Us Part 2 looks miles better than this, in part because it also makes heavy use of baked lighting.
>The Last of Us Part 2 looks miles better than this
AHAHHAHAHAHAHAHA
Dude, some of the environments in TLOU2 look barely better than a PS3 game.
I think that looks really good?
Then it goes without saying that you're an idiot whose opinions can be ignored.
>What benefit does fully dynamic lighting give?
Literally nothing.
There is not a single game with fully dynamic lighting on the market that does a single thing different from games that are a decade old.
What benefit does fully dynamic lighting give? Day/Night cycles? What game actually creates gameplay around that?
Dynamic lights should be reserved for little detail things.
You're keyed in on why modern game dev pisses me off.
>No bro the sun needs to be dynamic bro
>What do you MEAN a directional global source that fakes sunrise and sunset positions would be indistinguishable and take 1/10th the resources?!
>Oh yeah no let's make all these swinging moving flickering light bulbs generic cast decals in this tight enclosed atmospheric room meant to evoke dread
>>Oh yeah no let's make all these swinging moving flickering light bulbs generic cast decals in this tight enclosed atmospheric room meant to evoke dread
See, the thing is, you don't even need a fully dynamic light system for that.
Fricking The Division had moving, swinging, and tiny light sources that all cast shadows in 2016, and that game also makes use primarily of baked lighting.
True but you don't understand, people might actually have to do real work instead of up porting assets from the last title : ( It would be inhuman crunch time. They wouldn't see their cats for upwards of hours a day.
Which Mirror's Edge is this
The original. I bet it looks truly special on an OLED screen.
Reinstalled it; I loved the OG and it's about time for a replay. It's a goddamn crime more games didn't adopt the gameplay and art style
Games barely reach Mirror's Edge's quality of visuals, even today. This game ran on a console with 256MB of memory. What the frick went wrong?
Realistic graphics.
Mirror's Edge had realistic graphics.
theyd make this same scene today but run at 49 fps with upscaling and use 32 gigs of ram and take up 129 gigs of space
If those are the specs for PC, I'm worried about how bad the console versions are going to look/run.
>What the frick went wrong?
The chase for realtime vs. baked like ME.
ME is visually a very basic game - the reason it looks so good is very effective use of contrasting colours, usually white + something very bright. The actual texture detail is nothing special.
They understood how to make a game look good. There's not a single developer today who can pull off what DICE did then, certainly not with the same constraints of the PS3 hardware, or even the PS4.
unreal chinks colluding with nvidia chinks to steal gweilo money
since its a Black personpilled game no chink would touch it
It's baked lighting, the assets and geometry look like they belong in Half Life 2.
Nvidia sponsored ME and it had Physx in it.
Literally better graphics than any AAA released today (thank frick TAA didn't exist back then) AND it ran on a console that was underpowered and absolute shit even when it released in 2005.
you cannot be serious
I am serious.
>(thnak frick TAA didn't exist back then)
Ryse used it in 2013 on the Xbone, MGSV uses blurry FXAA without the strength of actually removing jaggies like TAA does.
TAA is dogshit and has permanently ruined game graphics.
TAA doesn't remove jaggies, it just blurs them.
The only way you remove jaggies is by increasing the resolution.
MSAA used to do that for specific edges, but it's no longer an option thanks to forward rendering being dead.
Art direction and having a happy, healthy, competent staff will do that, vs. treating it like a revolving door worker mill where it's all pajeets and danger hairs working for peanuts and being fired before you have to give them bonuses or benefits.
That and all the wonder coder types who used to make game engines are dying off and no one is replacing them. And any giga nerd who well and truly learned how to code isn't going to touch the pozzed to frick volatile decaying market that is the post 2007 gaming industry. They're all working in government or in academia or medical industry shit by now.
We are going to experience horrendous technical competence brain drain on unprecedented levels when the last of the Carmack types and MT Framework coders who still care to participate in the industry die off. Because no one can make anything from scratch anymore, just Betty Crocker "code slop in a box" and pre-made modules and nodes.
>medical industry shit by now.
Is this lucrative for IT shit? I have a Biomedical Science Degree and an Information Systems Degree and I just work a shit IT job.
Batman Arkham Knight and Assassins Creed Unity look much better than every modern game.
its funny people still bring up unity
the biggest piece of shit game ever that required the best gpu from 2y ago to run at 15fps
It runs at 60fps on the Series S
>Add 30 layers of post processing
>Damn, why is the game running at 24fps?
>Fix it by running game at 25% resolution
>960x540 required to run 60 fps on medium on a 3070
L M A O
Are these the most unreasonable requirements so far this year? I swear devs have just been trying to outdo each other with how moronic they can get
theyre not unreasonable
it includes super old and low end gpus
yeah, I have zero idea wtf Ganker is on in this thread. Sometimes it feels like Ganker reacts based on their initial emotional reaction rather than what's actually reality.
its a rule that if a game launches on series S it cant require that much on pc either. The devs will always have the settings for series S to implement on pc too.
Like cities skylines 2 with its shit perf only launched on pc which enables it to have truly bad optimization
>super old
>3 years for the oldest gpu, same as current gen consoles
>low end
>$500 gpus having to run at 540p to reach 60fps
No wonder devs have no incentive to optimise, you'll literally defend anything
mental illness
Gollum takes the cake given just how shit that one looks.
>tfw GTX 980
It's ogre
>mfw GTX 970
>GPU
Remedy.... I want to love you but I cant....
no
not at all
Did people actually like Alan Wake? Why?
Because it's good.
Once again consoles have held back gaming by normalizing upscaling.
No, unfortunately consoles didn't normalize upscaling, Nvidia and their fanboys did.
Consoles have been doing upscaling shit since as early as 2007 with Halo 3 (and later ODST).
This shit we see nowadays is recent, and came about with the rise of DLSS.
No, it's only with consoles using FSR for every title this gen that PC games have started optimizing around upscale.
>No, unfortunately consoles didn't normalize upscaling
look up playstation checkerboard rendering
dynamic resolution, checkerboard rendering and TAAU (which DLSS is) started heavily on Consoles, especially PS4 Pro
DLSS i believe was always a response to Sony claiming their $399 PS4 Pro could run games in 4K.
Nvidia was pushing for DLSS since the beginning.
>no physical to keep the game cost at 50€
>EGS exclusive due to contract
the road to hell is paved with good intentions etc etc
>nu games require upscaling
>they look worse than what’s already out for years
grim
My 3090 is already obsolete. I'll buy AW2 when I upgrade in a couple of years.
The thing I hate about DLSS is it makes characters look like they're melting wax figures. Granted, the environments do look great, I can't really tell the difference there.
>my PC barely meets low
Yeah old games can have decent looking color balance in a scene
they still look fricking shit when you get to a scene that involves animation, or god forbid facial animation with lipsync
That also applies to modern games so I fail to see your point.
>Alan woke too
Won't be getting it, but I'll be booting up Max Payne again with Kung Fu mod
>rx 7900XT
no fsr
>1080p
yep its gaming time
>Already time to upgrade my PC again
Shit, this thing is really only 4-5 years old. I mean, it's got a 2060 Super in it, but the PSU is too weak to supply power to a 30XX series or 40XX series card, though it has a decent CPU in it, a Ryzen 7 3700X. My next PC is at least going to have a 30XX series card; really see no reason to get a 40XX series card, seems a bit overkill. I'll try and get a Ryzen 9. One day I'll have a PC with top of the line components in it, at least for a few years until new tech comes out.
5-6 years ago a 2060 Super was considered in the top three GPUs, now it's almost worth nothing in modern gaming eyes.
>only 4-5 years old
>5-6 years ago a 2060 Super was considered in the top three GPU
Delusional.
30XX series cards didn't exist 5-6 years ago
A 2080 was basically top of the line, so I don't know what you're getting at. If you believe someone should upgrade their PC every 2-3 years then you're delusional, I don't have that kind of cash to do that every 2-3 years, also it seems wasteful. I'd at least try to salvage what I can out of the old one. My current PC is at least worth $1200 or so, that's a lot of money for me. I can't just go and buy a $2500 PC right out of pocket like that.
>30XX series cards didn't exist 5-6 years ago
Neither did 2060S, your timeline is just fricked.
I'm sorry to hear you can't just upgrade but pretending that mid-high end GPU from 4 years ago is anything but low end is lying to yourself.
If not for moronic GPU prices it would be great time to build a new PC, but it is what it is.
>5-6 years ago a 2060 Super was considered in the top three GPU's, now it's almost worth nothing in modern gaming eyes.
Buying anything xx80 and xx90 was considered "future proofing" and over the top back then for almost everything released at the time, now the latest generation of those cards is recommended for all this unoptimized shit.
>only 4-5 years old
lol
Hey, he should still be in the clear considering games look identical to what they looked like 4-5 years ago.
I spent $1200 on my last fricking month and I'm already unable to run ultra.
>"oh shit we have to design our game around DLSS!".
Maybe not the devs, but I can well see corporate thinking "AI saves us a bunch of money".
Even when you outsource to India, an hour of dev time easily costs the company $50. With just ten devs working on optimization, that's $4000 a day. With proper specialists in America, you easily pay 10 times as much. DLSS can easily save you a six or seven figure amount of money. And it reduces your time to market.
Sure, at AAA scales that only comes down to a fraction of a percent, but that's the stuff that makes your stock price go up. Simple matter of fact is that once your market share is big enough, it's just better business to ignore quality and instead go for margins.
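That daily figure checks out as back-of-envelope math (a sketch using the rates assumed in the post above; the $50/hr and 8-hour day are the poster's assumptions, not real studio numbers):

```python
# Rough daily cost of an outsourced optimization team, per the post's assumptions.
devs = 10            # devs working on optimization
rate_per_hour = 50   # assumed fully-loaded cost per dev-hour (outsourced)
hours_per_day = 8    # assumed working day

daily_cost = devs * rate_per_hour * hours_per_day
print(daily_cost)  # 4000

# US specialists at "10 times as much" per the post:
print(daily_cost * 10)  # 40000
```

At $40k a day for a domestic team, even a few weeks of skipped optimization work lands in six figures, which is the saving the post is pointing at.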
Why yes. Yes it is.
unreal engine was a mistake
It's UNREAL how SHIT of a fricking engine it is. Like, we dog on Unity, but Unreal after... what, the mid 2000s? is a sign that a game will be shit, with god awful LOD, performance and pop-in, and a no-buy.
>those requirements
Why?
>1060 6GB
Guess I'm only playing Armored Core this year.
jews hate natives so much they are even trying to replace native resolution
I laffed then I frowned.
>4070 for high
>Epic Chink Store only for the moment
>Insert Mandatory Black person main character
Yeah frick no Timmy you chink slave
Okay those who are upset their 1060 cant run the game heres a solution
go to amazon
search for "4090 rtx"
Click "buy now"
>Buy a 4090
Unnecessary and overkill. A 3080 or 3090 is more than strong enough. You really only need a 40XX if you're doing animation work and design on your PC. The reason Alan Wake is requiring it for ULTRA performance is because it's basically a cinematic movie game with 10-15 hours of cutscenes, I imagine.
You would only need a 40XX series GPU for very specific video games. a 3080 or 3090 would most likely run everything on ultra and ray tracing as well.
if you want nice path tracing experience at 1440p with dlss quality 4090 is the best gpu for that
I can already push my 4090 to its limits in a number of games and I plan on selling it for 1/2 the price I paid for it and getting a 5090 when the 5000 series launches in early 2025
>5000 series cards
>Look it up
>It's real
Holy fricking shit, didn't Nvidia come out with the 30XX series and then like 4-5 months later release the 40XX series, and NOW we have a 50XX series coming out anywhere from 8-15 months from now? Good god, slow down.
I MIGHT get a 40XX series, if I can afford to get one when I get my next PC, if not I'll settle for a 30XX series, though I see absolutely no fricking reason to have a 5000 series card when literally a small percentage of people don't even have 40XX series cards.
2025 is still 14 months away, and it'll likely release in the summer
Nvidia is moving to yearly architecture releases. They will likely follow the Intel release model of small improvements each gen but hey, bigger number more better.
>Nvidia is moving to yearly architecture releases.
Is that really necessary? Is technology advancing so rapidly in the gaming world? It's like Apple releasing a new iPhone every year but the only real improvement is the camera.
>Is that really necessary?
For their bottom line absolutely - the OEM market fricking LOVES that shit and that is a market nvidia will not ignore.
>Nvidia is moving to yearly architecture releases
no, they're not, it's the opposite in fact
they have traditionally released a new GPU generation every 2 years, but now they're moving to a 2 1/2 - 3 year gap between releases
>Nvidia is moving to yearly architecture releases
That's literally the opposite of what's happening. They're lengthening the gap between new generations, not shortening it.
>30XX series and then like 4-5 months later release the 40XX series
anon, there was two years between the debut of 30XX and 40XX
R-really? I swear they announced the 30XX series and it felt like 6 months went by and the 40XX series came out. It really didn't feel like TWO years.
you must have had pretty bad covid brain fog
Ray Tracing is a blight
>but it looks great
Not for the fricking cost difference.
>Okay those who are upset their 1060 cant run the game heres a solution
>go to amazon
>search for "4090 rtx"
>Click "buy now"
1060 is 7 years old, that's pretty much a whole console gen
I'm not necessarily upset, it's time to upgrade me and it served me well.
But I'm not upgrading for this shit.
I might let it get me through the ER DLC and any potential AC DLC that comes out. Once those games are done and anything in between I can run, I'll likely upgrade.
There are games with full RTX and no DLSS that run better than that on low.
>quake ii
>a game from 1997 that looked bad even then
>looked bad even then
Lies.
Doom Eternal deserves more recognition for how well optimized it is. It's the only modern game I can think of where running it at native with RT on is actually viable outside of the absolute top end GPUs. It looks 10 times better than half these smeared shit games as well.
RT Reflections are cheap, BFV had them in 2018 and i swear people forget.
>RT Reflections are cheap
And also the most impactful, because frick SSR.
What's up with how games used to save certain graphic effects for certain parts of a game or for cutscenes, and now just enable everything all the time?
Workflows are way, way more complex now. When you have the performance, you don't want to make your development needlessly complex. It also gives a better feeling of coherency.
2ish years between GPU/CPU generations has been the norm for the last decade or so? The odd thing with the 40 series was just how long it took to release every card, almost a year I believe. The 50 series should have been out at the end of 2024. AI boom screwed us over.
The 3090 Ti only came out two years ago now. It was a special case, since the 30 series launched into a crypto boom and chip shortage. GPUs could literally print money back then, which is why the 3090 Ti doubled in price for like 5% extra performance.
And don't think the series alone determines performance: the 4060 Ti and 3060 Ti are practically the same. Especially with the 40 series, Nvidia really screwed us over; many of those cards are actually a lower tier with a fake name and a raised price. They even tried to bump everything up another tier with the "4080 12GB", but had to retract it due to public backlash.
When you get your PC, head over to /pcbg/ on Ganker, they'll help you navigate the bullshit.
I watched a GPU comparison video. The TI series is a meme and seems like a waste of money, I'm not getting a TI series card.
How old was that video, because that's really not true anymore. Ti started at the very top, then became more and more common because people associated it with high performance.
You used to pay a premium to get the best; the flagship could cost 50% more for just 10% more power. So the best value card was midrange. But that's over now, with the 40 series the 4090 actually had the best price/performance on launch. It's linear scaling now, no value or bargains to be had.
Also, GPU names are pure marketing. You absolutely cannot trust them, the sales/marketing teams will use any good reputation a card earned to screw you over with the successor.
I have never, ever checked a req list in my 28 years of gaming as I have been gaming on console since fricking snes days kek
enjoy upgrading your pcs every 2 years homosexuals
Enjoy 480p on your PS5
>90gb
>vaseline blur on everything
>2060 RTX minimum for 1080p WITH upscaling (aka 1267x712 native) on lowest settings
The frick are they using the drive space for? Did they make a full 4K TV show for this, for the in-game TVs they'll pepper about that look like old tube TVs, thus rendering that 4K resolution moot?
>The frick are they using the drive space for?
Voice acting, sounds, the textures you COULD be using.
This is going to be 480p 20fps on series s lol
480p, home.
Best girl Miyeon!
I'd say from an art point of view Deus Ex Mankind Divided also looks a lot better than modern games. The character models are bad but the general art direction is great.
On the plus side, you can run PS4 and PS3 gen games at 4K 60fps max settings pretty easily nowadays on cheap enough hardware.
This. Exactly the reason why Yakuza 8 will probably be the last AAA game I ever buy, it's still being made for the PS4. After that, I just don't care anymore. I don't want to need upscaling just to run games.
Yakuza 8 is not a AAA game, it's barely A level. Almost indie. Look at this shit. Barely better looking than a PS2 game.
That literally is more impressive than any game released in the past 5 years. Just having non-jaggy hair and liquid physics is more impressive than any AAA game.
>good hair
>solid clothing detail
>everything is clean about the appearance
>good depth of field blur
>stylized semi-realistic appearance instead of chasing photo-real
Looks better than most modern AAA.
Yakuza 6 is probably the best looking game on the PS4. Too bad it ran like shit so they downgraded their engine for all future titles. It runs so beautifully on even mid-range PCs.
RGG is at least AA and Sega is definitely a AAA publisher.
Rationally, unless the game heavily depends on spoilers or community involvement (e.g. an MMO), there is no reason to buy a game newer than three years. On release, games are just unfinished and overpriced.
Same thing is true for hardware; the best you can probably do is get the previous gen flagship a bit after the new stuff releases. 3090s went as low as $500 in the middle of the year (literally every other PC component has great prices now, only GPUs suck).
I don't get how anyone could complain about these specs, you can literally get a 6700XT for 200€ used or 300€ new
It's the bare fricking minimum
hint they dont own even a 300€ gpu
also mental illness
also low iq because they live in 3rd world
I'm a neet, and i ain't dropping 300€ on a fricking GPU i'm going to have to replace in a year, because your incompetent diversity hires can't code, Remedy-shill.
Replace in a year? You mean 5 when the next remedy game comes out and runs medium/1440p on a 300€ GPU?
By not being a Eurotard who's used to getting fricked over on prices of literally everything.
Why would you want to play with FSR at 540p internal resolution, that's awful.
6700XT, to play at 720p with fsr? no thanks
shouldve bought nvidia
>1080p quality = 720p
>1440p balanced = 720p
>1080p performance = sub 720p
>2160p performance = 1080p
jesus....
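Those figures are just the output resolution multiplied by the upscaler's per-mode scale factor (a sketch using the commonly cited DLSS/FSR 2 defaults of roughly 0.667 for Quality, 0.58 for Balanced, and 0.5 for Performance; individual games can override these):

```python
# Internal render resolution for common upscaler quality modes.
# Scale factors are the usual DLSS/FSR 2 defaults; exact values vary by title.
SCALE = {"quality": 0.667, "balanced": 0.58, "performance": 0.5}

def internal_res(width, height, mode):
    f = SCALE[mode]
    return round(width * f), round(height * f)

print(internal_res(1920, 1080, "quality"))      # ~1281x720
print(internal_res(2560, 1440, "balanced"))     # ~1485x835
print(internal_res(3840, 2160, "performance"))  # 1920x1080
```

So "1080p quality" really does rasterize around 720p internally, and "4K performance" is plain 1080p, which is where the numbers in the list above come from.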
rx 570
i will be there
Is.... is that a 200$ GPU in the recommended requirements...? IEEEEE Black personMAN SAVE ME
Ahh yes, the RTX 3070, a $200 GPU.
Fricking EuroBlack person homosexual.