do you consider it sacrilege to play games at resolutions that were not possible when they launched?
Man, just do whatever you want. Who fricking cares
Women for one. Some other people too
No. Doing whatever I want has been nothing but disastrous for me. Frick that. Telling someone to do whatever they want is tantamount to saying have a nice day, since you're basically saying you don't care if they do heroin and die. So yes, I will tell people what to do, because I don't want them to die on the streets.
retro video games?
I consider the fricked up lighting in the EE to be a greater sacrilege.
sucks we don't have a better source port situation, best we got is the Portable version which is just DOSBox with hacks
They published the source code on github, just nobody is interested in it.
No but I do feel dirty if the aspect ratio isn't 4:3 when I'm playing PS1 games. Not so much for PS2, but sometimes.
Depends on the game
Nah. Even game companies add modern resolution support to old games. They would look like shit without it.
No. I'm playing Blood at 1920x1080 with a mouse right now
on the contrary, I think it's really great to see those games in higher resolutions.
t. used to play Duke3D in 320x200
If the presentation relies on a certain visual aesthetic approach then I would want to play it as originally intended (including CRT shaders).
If there are obvious QoL improvements due to the limitations of the original platform that for instance made the framerate stutter or some buggy shit got improved then by all means I'd go with the upgrade.
RTS and strategy games in general are improved quite a bit with higher-rez patches.
(OP)
Doing this is okay if it doesn't look like garbage and the game balance isn't significantly altered. In a 2D game, a wider view of the map or whatever may be a simple convenience or a substantial difficulty reduction. In an old 3D game, smoothing jagged edges tends to emphasize how simplistic the 3D models are, which makes things ugly. In general, you'd have to be stupid to think increasing resolution anachronistically is a good idea, but if you like it then whatever, you might as well do it and make your stupid person's life a little bit less painful. You're probably not ever going to stop being so stupid, after all.
Only games with pre-rendered backgrounds and a fricked up art style suffer from higher resolutions.
The sampling frequency (complexity) of the output image should match the sampling frequency of the objects (models, sprites, UI elements, etc.) and the sampling frequency of the textures, if any are used. Increase one without increasing the rest, and you get the amateurish look of early 2000s C tier games.
Simple as.
>early 2000s C tier games
MGS2 is an early 2000s game; most of the games with a fricked up art style came from PC
>upscaling 2D sprites made to look 3 dimensional
why
What the frick are you even trying to communicate?
The primary issue with this topic can be boiled down to "pixels don't scale well", and everyone now has such big monitors that a stretched low-resolution image doesn't cut it for anyone.
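To make the "pixels don't scale well" point concrete, here's a minimal sketch (the helper name is made up, not from any real scaler): nearest-neighbour scaling maps each destination pixel back to a source pixel, so an integer factor like 2x turns every source pixel into a clean uniform block, while a fractional factor like 1.5x gives some pixels double width and others single width, which is exactly the uneven stretched look you see on a big monitor.

```python
def nearest_scale(img, factor):
    """Scale a 2D list of pixel values by `factor` using nearest-neighbour.

    Each destination coordinate is mapped back to the source pixel it
    falls on; no filtering, just like a naive fullscreen stretch.
    """
    h, w = len(img), len(img[0])
    new_h, new_w = int(h * factor), int(w * factor)
    return [[img[int(y / factor)][int(x / factor)] for x in range(new_w)]
            for y in range(new_h)]

row = [0, 1, 2, 3]                     # a 1x4 "image", one value per pixel
print(nearest_scale([row], 2.0)[0])    # [0, 0, 1, 1, 2, 2, 3, 3] -- even 2-wide blocks
print(nearest_scale([row], 1.5)[0])    # [0, 0, 1, 2, 2, 3] -- pixel widths alternate 2, 1, 2, 1
```

This is why integer scaling (2x, 3x, 4x) or a CRT shader tends to look right on modern displays while arbitrary fullscreen stretching doesn't.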
I don't care so long as it doesn't frick up the UI
only if it fricks up the game in some way
Nah, though there're times it can make details clearer than they were supposed to be and spoil the atmosphere a bit by making it more clear that some elements are 2D, for example, or where the level ends and the skybox begins. But generally when I emulate stuff I try to keep it as close to how it would be on the original hardware as possible, but resolution is the exception, I crank that shit all the way up.
this question is really only valid to 2D games
There is little difference between 2D and 3D. People who run old 3D games at modern native screen resolutions, just because the graphical subsystem was made forward compatible and now reports them to the game as the maximum available ones in the usual manner, are idiots. Moreover, even when games break, which means no developer ever tried to run them at higher resolutions even once, they don't see that as a reason to stop, and make patches. That you can do something does not mean you should do it.
Some modern 3D games with complex multi-stage multi-resolution adaptive rendering are probably resolution-agnostic enough, but that's a small fraction of them.
barely any games break because of higher res, and devs can simply not care as much for pc versions as I suspect is usually the case
I am talking about old native PC games. Some of them break. Many of them have nonsensical UI at high resolutions. Most of them look like shit at high resolutions. And the developers were not stupid or lazy. At release there was simply no hardware capable of running those games this way, even for laughs, so they didn't have to prepare the code or the content for it, even theoretically. If you see a modern video card's full list of resolutions in the game settings, that is not a sign the game was meant to be run this way.
Those who run modern ports of games like Quake 2 might not even understand the list of resolutions was static in the original release, and was altered or bypassed by port authors.
No, as long as the person doing it admits he is playing a modded version of the game.