The average console pleb doesn't even know what you're talking about, let alone give a frick about it. And why isn't 4k standard yet? It still costs $2000 to run new games at max settings on PC.
I'd say we're 5-10 years or more away from all this being standard.
fps is subjective, some people really don't notice the difference. but visual fidelity is objective, anyone with a working set of eyes can tell the difference between a screenshot with pretty lighting and one without.
fps is objective you moronic idiot
at best perception of motion smoothness varies among people but that's not subjective
learn what the word means illiterate frick
me liking red and you liking blue is subjective
me having high iq and you having low iq and being unable to see certain patterns is not subjective
>fps is objective you moronic idiot
It's absolutely not. I personally know 2 people (one is 15, the other is 22) who can't tell the difference between 30 fps and 144 fps in motion, and I think there are a lot more people like them.
Most games don't need 60fps.
Back in the day when 3D was invented people played at 25 fps and no one cared.
I wish 40fps were the targeted minimum today though.
Because customer expectations increase along with the years. Higher framerates are just for mid-gen "pro" consoles, the cross-gen period, PCs, and games that can budget for it. Every 5 years it's the same song and dance.
because they're so small. if you want 60+fps you need a pc with a mobo, a giant ass graphics card and a decently sized cooling system, and you have to fit all the rest of the stuff in while keeping the size and cost less than a pc tower
i think graphics cards have tripled in size over the last 20 years
t.boomer
A lot of them do run at 60fps, although that will likely change due to morons screeching that cross-gen releases shouldn't be happening this late in the generation and that we need MORE GRAPHICS, MORE RAY TRACING OH SHIT THIS LIGHTING IS SO REALISTIC I'M GONNA FRICKING COOM.
The only reason we have decently running games on PS5 and SeX is because of devs being tethered to the ancient PS4 and Xbone.
They're trying to use raytracing and 4k on consoles that can't handle it.
Because most movies are filmed at 24fps.
graphics over everything heh
Bespoke
Everyone uses this word wrong.
It's supposed to mean "custom-built one-off" but people use it to refer to stock standard retail products.
bespoke x360 ati R420 based hardware
It's highly bespoken of in certain circles.
It's bespoke by the manufacturer for the product developer though
>And why isn't 4k standard yet?
It's because console manufacturers stopped selling consoles at a loss and earning the money back from game sales
>I personally know 2 people who can not tell the difference between 30 fps and 144 fps in motion
That's like saying speed is subjective because you know an amputee and someone in a wheelchair
You fricking sperg
30 fps is fine as long as it's 30fps 100% of the time
>30fps game has a 0.1% frametime fluctuation to 29 and 31fps
I wouldn't recommend playing the game in this broken state, it is unfortunate.
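The frame-pacing point being argued (and parodied) above can be made concrete: a "locked" 30fps and a fluctuating 30fps can have the same average framerate, but what you feel is the frame-to-frame jump. A minimal sketch with made-up numbers (the helper names are hypothetical, not from any real tool):

```python
def frametimes_ms(fps_sequence):
    """Convert per-frame fps readings to frametimes in milliseconds."""
    return [1000.0 / fps for fps in fps_sequence]

def worst_spike(times):
    """Largest jump between consecutive frametimes, i.e. the stutter you notice."""
    return max(abs(a - b) for a, b in zip(times, times[1:]))

steady = frametimes_ms([30] * 100)            # locked 30fps, every frame 33.3ms
uneven = frametimes_ms([31] * 50 + [29] * 50)  # roughly the same average fps

print(worst_spike(steady))  # 0.0 — no frame-to-frame variation at all
print(worst_spike(uneven))  # ~2.2ms jump where the rate shifts
```

Both feeds average about 30fps, so an fps counter alone can't distinguish them; the spike in the second one is the "0.1% frametime fluctuation" complaint.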
>30 fps is fine
No it isn't, ever. Get better standards you peasant
we made the jump from 1080 to 4k too quickly.
console players have been conditioned to enjoy 30fps, they think anything over that is useless.
Sometimes he jumpscares me while pointing out fine graphical detail and then suddenly cutting to a closeup of him talking.
Because consoles are used for warming your food. Not playing games, silly OP.
Because only shitters need more
why john be rizzin tho
>Most games don't need 60fps.
Wait, isn't PS5 60fps as standard?
Are PS5/Series X games still 30fps?
>i think graphics cards have tripled in size over the last 20 years
They're 10-15 times that size.
>most
It's more like a minority of them that run at 30fps, such as Shitfield.
Most multiplayer games run at 120fps and the majority of single player games run at 60fps.