funf off.
>60fps
lol
because below such a threshold games become unresponsive.
good goyim! be sure to buy an RTX 5990 when it comes out!
Consoletard so moronic he doesn't know PChads can change the settings for more FPS
>more FPS
wasted if you're using an LCD
I only buy AMD hardware
120/144 is the bare minimum nowadays; I'm playing most games at 280 FPS now.
what ancient shit do you play that goes 280 roflmao
Quake (actually at 1155 FPS but 280Hz obviously), War Thunder, Project Diva.
You're literally wasting energy and killing the planet
Good, frick society and frick the WEF.
>What are nuclear power and hydroelectricity?
Did you actually fall for Nintendo's ESG copium?
My computer and monitors at max load use less than 600W.
Your masters that own you have private yachts that have 1 misc lightbulb that uses more 24h and 365 days per year.
well kys to offset it
the planet deserves to die because it birthed Black folk and israelites
No it isn't. It's a meme designed for morons to buy more expensive graphics cards.
NTSC was always 60fps/60Hz. That's the standard. 120fps/120Hz was only designed so movies could be played smoothly at 24fps: 24 doesn't divide evenly into 60 (hence 3:2 pulldown judder), but 120/24 = 5 exactly. Think about that. They doubled the refresh rate so you could watch movies smoothly at a lower frame rate (see the pulldown sketch below).
>But that's le TV and console standards
Yes, which are the reason video games are still around. PCs might have had refresh rates faster than 60Hz, but that was down to CRT technology. At the time most games were not as graphically demanding as they are today.
So yes, you are being scammed.
>But I can't go back to sub 240hz
That's not my problem. It's like you telling me you can't drive the speed limit anymore because you're so used to going over the limit just to get to McDonald's.
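A minimal sketch of the pulldown argument above, assuming plain frame repetition (no motion interpolation); `pulldown_pattern` is a hypothetical helper written for this post, not any real library call:

```python
# Sketch: how 24fps film maps onto a 60Hz vs a 120Hz display.
# Assumes simple frame repetition (2:3 pulldown), no motion interpolation.

def pulldown_pattern(film_fps, display_hz, n_frames=8):
    """How many display refreshes each film frame is held for."""
    repeats, shown = [], 0
    for frame in range(1, n_frames + 1):
        until = frame * display_hz // film_fps  # refresh index where this frame ends
        repeats.append(until - shown)
        shown = until
    return repeats

print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, ...] -> uneven hold times = judder
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, ...] -> every frame held equally
```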
>x is a meme
this is always sour grapes from poorgays.
I have a 240hz 4k projector and can play everything on ultra with mods.
X IS a fricking meme.
So is
HDR/Dolby Vision
Atmos
RTX
Anything above 2.1 or 3.1 sound
Anything above 1080p
Anything above 60hz/60fps
Ultra wide and curved
OLED
>Anything above 2.1 or 3.1 sound
yeah you may have a nice projector, but you clearly don't have a nice sound system.
The greatest films of all time were mixed in mono; the 7.1 tracks on modern releases are artificial upmixes. Think about that.
No, I don't need 7 speakers to hear the intricacies of a israeli actors' big nose in the wind.
Smoother motion is a meme? You're a moron
not denying 120 is better but 60 is acceptable to play a game smoothly
1. it is noticeable
2. it is noticeable
3. frick you homosexual
4. IT IS NOTICEABLE
Hardware has advanced enough that not being able to achieve a consistent 1080p and 60 FPS is embarrassing.
The Switch is a potato. Tendies can cope and seethe.
>60 fps
Do console gays really
It only matters if the game requires precise inputs like racing games and slashers do. But 30fps is perfectly fine like 80% of the time, especially considering animations (and sometimes game logic) are usually coded for either 30/60fps and look like shit on anything higher than that.
1080p and 60 FPS are obviously noticeable improvements compared to lower resolutions and framerates. There is no excuse for anything less.
Only if the game is built around them; if it looks nice and keeps a stable 30fps I'm fine with that. Also, I guess it's a fringe opinion, but I think FHD is excessive. I'm replaying SH3 right now and it looks super detailed (unless something is really close to the camera) despite being 20+ years old and running in 480i. I guess lighting and texture work matter more in the end; that's why even older games look nice in high res while modern ones look boring and cluttered.
Even older games benefit from 1080p and 60 FPS.
Not really; all higher resolution does is expose low-res textures, and animations look weird at 60fps if the game was intended to run at 30fps or less. The only exceptions I can think of are N64 games, because they had low-res filtered textures and frequent framedrops to begin with.
Pixelated textures are kino and better than "realistic muddy".
>the animation looks weird
Brain rot.
Dunno, they just lack the smoothness you'd expect at 60fps and look a bit uncanny because of that, you can kinda see it here: https://youtu.be/uIwWezhf0xE
That's with hacks, so of course it's glitchy. Like when you look at the Spark, there are double the particle effects being spawned.
60 is always preferable but it is absolutely necessary for most games, hence why I'm entirely okay with 30fps too
>especially considering animations (and sometimes game logic) are usually coded for either 30/60fps and look like shit on anything higher than that
I can think of the Yakuza games where running the games above 30 will make the enemies able to block your combos very easily.
*absolutely not necessary for most games
well shit my typo
this thread is full of morons
i grew up playing fortnite at 30fps and didn't complain
60? 75fps is minimum for me, 90 is preferred.
>four (fünf)
consolecope thread
At 50 frames per second each frame is rendered every 1000/50 = 20ms.
So your total latency can be like 4ms for the mouse & keyboard, 20ms for the system and like 16ms for the monitor.
Add to that network lag if you're playing online; if the server is not too far from you, maybe 50ms ping.
So a total of 90ms (it can be worse with a bad mouse, a wifi adapter, a worse monitor, etc).
That's 90ms between you moving the mouse and it moving serverside.
If you're around 250ms reaction time and an enemy gets on screen into your FOV, he will be there at t=0 for the server.
For you he will show up on your monitor at t=90ms. If you react in 250ms to aim, you will be aiming at him at best at around t=340ms.
So now you click to shoot.
If you don't press the mouse button at the ideal moment, polling will "catch" your click maybe only in the next frame.
So you are shooting at best at around t=430ms (as far as the server and what you see on screen are concerned); at worst it can be closer to t=~500ms.
Half a second.
That's how you can lose to a sharp boomer with a 300ms reaction time but a much better setup.
Those 50ms of network lag can only be reduced by moving closer to the server.
The peripheral input lag, system render latency and monitor latency can all be reduced to like 1ms for the mouse and under 2ms each for the system and monitor (540fps on a 540Hz monitor is 1000/540 ≈ 1.85ms).
That's a total of like 55ms, not 90ms.
And what's more important, when you click, the next polling frame will come much faster and the server will get your input faster.
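A minimal sketch of that arithmetic, using the post's example figures only (nothing here is measured; the tuned case assumes 1000/540 ≈ 1.85ms for both the frame and the monitor stage):

```python
# Sketch of the latency chain from the post above. All numbers are the
# post's example figures, not measurements of any real hardware.

def shot_time_ms(input_ms, frame_ms, monitor_ms, ping_ms, reaction_ms):
    """Server time at which your shot registers, for an enemy visible at t=0."""
    chain = input_ms + frame_ms + monitor_ms + ping_ms  # one full trip through the setup
    see_enemy = chain                    # appears on your screen at t = chain
    aimed = see_enemy + reaction_ms      # you finish reacting and aiming
    return aimed + chain                 # click is polled, rendered, sent back

# The 50fps setup described above: best case ~430ms.
print(shot_time_ms(input_ms=4, frame_ms=20, monitor_ms=16, ping_ms=50,
                   reaction_ms=250))

# The tuned setup (1ms mouse, ~1.85ms frame and monitor, same 50ms ping)
# with the slower 300ms "boomer" reaction: ~409ms, so the boomer still wins.
print(shot_time_ms(input_ms=1, frame_ms=1.85, monitor_ms=1.85, ping_ms=50,
                   reaction_ms=300))
```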
Do you want four or five reasons? Or do you want four sets of five?
Zwëi + Zwëi = Fünf
1. looks like shit
2. plays like shit
3. shows your game would rather be a movie
4. shows lack of passion or budget
Consolehomosexuals the previous two gens
>30fps is perfectly fine
Consolehomosexuals the start of this gen
>anything less than 60fps is absolutely unacceptable in this day and age
Consolehomosexuals as they started to get more current gen games and less previous gen ports
>30fps is perfectly fine
Near the end of generation
>human eye can't see more than 24fps anyway
>arbitrary
The 60fps standard wasn't arbitrary; it was dictated by display technology.
It's worse than that: display frequency depended on the alternating current frequency
>US NTSC 60Hz
>AC @60Hz
>Yourope PAL 50Hz
>AC @50Hz
PAL sucked so fricking much.
PAL has 625 total lines of resolution, which is higher than NTSC's 525. So you can say that the picture quality of PAL is better than NTSC's.
Sure, but for cucksole games it meant that PAL versions were just slower; many games ran at 5/6 the speed of the Jap or US version.
That included the sound.
If the game ran at 30FPS it would instead run at 25FPS in Yourope.
As a mainly PC gaymer I didn't care much, but it was staggering to visit friends with a PS1; damn, it was so choppy.
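A quick sketch of what that 5/6 ratio does, assuming (as the posts above say) the game and audio clocks were simply tied to the 50Hz vsync and nothing was recalibrated:

```python
import math

# Sketch: a 60Hz (NTSC) game running on a 50Hz (PAL) display with
# no recalibration. Game speed and audio are assumed tied to vsync.

ntsc_hz, pal_hz = 60, 50
speed = pal_hz / ntsc_hz             # 5/6 ~ 0.833: everything runs ~17% slower
print(30 * speed)                    # a 30fps game drops to 25fps
print(12 * math.log2(speed))         # audio pitched down ~3.16 semitones
```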
>many games just ran at 5/6 speed of Jap or US version
Not really, barely any games did. It was an easy fix to make; the only prominent game this affected was Sonic 1, and it was corrected by the time Sonic 2 was released in 1992.
You've literally been believing urban myth shit for a long time and then spreading it around like a meme disease. You're a virus, bro.
It wasn't common but it's more common than you think.
NES games were like that all the time.
it's not an arbitrary threshold so I'm not even going to entertain your bad argument.
shouldn't this thread be moved to /tech/
specifications have zero impact on a game's quality
not entirely true; play any fighting game at 30FPS and the difference in how it feels to play is staggering.
Atmosphere in games like Oblivion is very much different when you play it on a GeForce FX 5200 with the Oldblivion mod that rewrites the shader model calls (because the GPU was not officially supported), on a 17-inch 1024x768 CRT monitor, with a computer that struggles to hold an unstable 30 fps.
Then a year later you play on a 24-inch 1080p TN LCD with everything maxed out, an 8800GTX and a Core 2 Quad Q6600.
It's like two different games.
1. it makes console Black folk mad
2. it makes console Black folk mad
3. it makes console Black folk mad
4. it makes console Black folk mad
PCs are for office work
Consoles are for child's play.
Fünf is five, you moron
That’s the German word for 5
If you can't hit 60 fps in games you have bigger issues. Like living below the poverty line.
>Lower framerates look like shit.
No amount of motion blur and post-processing smearing bullshit can cover it up the way the real world's / film's natural motion blur does.
>Lower framerates PLAY like shit.
Longer delays between frames mean higher input latency, because nearly every game ever made does not register an input until at least the next rendered frame.
>Lower framerates make in-game processes operate slower and shittier.
Because a lot of game logic is bound to the framerate, things like NPC behavior and physics regularly get fricked up when there are bigger delays between two frames. It leads to buggy, inconsistent bullshit, especially if the framerate is inconsistent instead of a solid 60 (see the sketch after this post).
>It was "excusable" in old generations like the NES where few games could actually handle 60FPS and you saw lots of flickering and hard slowdown.
The consoles of old were severely cheap-budget "home versions" that could barely provide a fraction of the quality of an arcade cabinet at the time, and they needed a lot of corners cut and goofy ass optimization to even run at 20FPS when showing more than four sprites on screen. This stopped being the case around the PS2 era which is why arcades have become a thing of the past. Yet despite far more powerful hardware, game devs instead occupy all those extra resources with bloated "quality" graphics instead of tuning for performance. The diminishing returns on observable-at-a-glance visual fidelity nearly plateaued about ten to fifteen years ago.
60FPS is the bare minimum acceptable rate. There is no excuse for a PS5bone game to not run flawlessly at 60FPS minimum.
Switch gets only the slightest leeway because of it being a typical cheap Nintendo handheld device.
PCs with high-end hardware made within the past decade or mid-range current gen hardware shouldn't even break a sweat pushing any game to 120+FPS.
A game that prioritizes "fidelity" instead of playing smoothly is a game made for shareholders who don't play games.
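A minimal sketch of the framerate-bound-logic problem from the third point above (the values are hypothetical; this is the generic pattern, not any particular engine's code):

```python
# Sketch: why logic tuned "per frame" breaks at other framerates,
# while delta-time logic stays consistent. Values are hypothetical.

SPEED = 300.0  # units per second the designer intended

def simulate(fps, seconds=1.0, framebound=True):
    """Distance travelled in `seconds` of play at a given framerate."""
    dt, pos = 1.0 / fps, 0.0
    for _ in range(int(seconds * fps)):
        if framebound:
            pos += 5.0           # per-frame step, only correct at 60fps
        else:
            pos += SPEED * dt    # per-second step, framerate-independent
    return pos

for fps in (30, 60, 144):
    print(fps, simulate(fps), simulate(fps, framebound=False))
# framebound: 150 / 300 / 720 units -- game speed changes with the framerate
# delta-time: 300 / 300 / 300 units -- identical at any framerate
```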
>60FPS is the bare minimum acceptable rate
No 30fps is entirely fine for most games too.
t. old PC user who now plays on Switch and Steam Deck
the fact that you all set an "acceptable rate" conveniently at one of the arbitrary standards shows that you really don't have any real standards at all. anyone who's had experience with variable framerates will tell you that a decent framerate starts at 90-ish. 60 is acceptable and can be played, sure, but 90 is when it starts to feel good. it also depends on the game: an isometric RPG can be played at 60 with zero issues. 90 is my minimum for over-the-shoulder games.
>you really don't have any real standards at all
my standards are different than yours, such as playing Japanese games in moonrunes.
again 60 is preferable if possible, but 30 is entirely acceptable too.
60? lmfao. i play at 144 and can feel it if it drops below 110. even 144 looks like it could be smoother. 60 fps is just a piece of shit for boomers with degraded eyesight and brain functions.
30 fps is almost unplayable
30 fps looks like dogshit
30 fps makes me mad
30 fps is garbage