Sort of.
Fine by me, I suppose.
I have a 4090 and several 4k screens. I play everything at 1080p
Yeah, once they fix ghosting problem, upscaling aliased frames problem and problem of frame generation being useless shit maybe I'll be ok with it. Right now it is just a permission for shitty devs to not optimize their games properly
If you play at 4K, sure.
>fake frames
>fake resolutions
>fake reviews
>fake promises
>fake hype
>fake progression
yeah just crash it at this point
Yeah, time to kill the console, which always holds the gaming back. see
No. Games that run in native resolution will always exist. AI-reconstructed images will become unpopular as people question why small indie teams make better looking games that run faster, yet again.
What is with this new age of tourists who don't understand
>see (post)
is meant to point to posts made before you, not after you? That's something like the eighth time I've seen it this week.
lol wut. That anon just adds something to your post by doing that. His post is as valid as yours.
"see X", "refer to X", or "reiterating X's post" are common English parlance, you fricking ESL.
Rasterization IS fake. TAA + Rasterization is as fake as any frame can be. Shadowmaps are fake. Cube maps are fake. The ship has sailed a long time ago when it comes to authenticity.
What I hope for every year and it never happens.
No. Meme acronyms like DLSS, FSR, and RT are not worth bothering with yet. I do hope they improve on it though.
No, since a lot of developers apparently see it as a tool to not optimize their games.
death to techbros and AI slop pushers
It's pretty obvious that NVIDIA hit a dead end when it comes to rasterization which is why they are putting all their faith into DLSS which effectively also forces people to upgrade gen to gen if they want the newest features.
they didn't hit a dead end, they just stopped making video cards. now they make AI cards that just happen to do video as well.
i just want to add that jacket man and his company have nothing but utter contempt for gamers. they don't care about you . they hate you. they wish all their customers were wall street types with millions of dollars to spend and racks of servers to fill.
ai upscaling is a neat tool to use on older games.
using it because you're too lazy or cheap to make assets for a game you are actively developing is cringe
>rubinsky
If you have an nVidia card in your computer, this is your fault.
The developers are getting lazy af.
>native resolution in games no longer matters
>also don't forget to buy the 5090 with DLSS4 to remove 40% of the ghosting of DLSS3.5 from running games upscaled from 1080p in order to barely keep the ray tracing titles playable barely above 60fps on our flagship card
>here is your future
>that cup disappearing
kek, AI can't even do object permanence
Looks great though?
>But if you zoom super far in it looks almost as bad as fake rasterized graphics
I was playing Starfield with DLSS 3.5 and it had that ghosting effect without any zooming
>what is ghosting
You dont need to zoom in for that
this is an xbox s or ps4 pro, isn't it
a.k.a. not my problem, peasant
That's a 4090
WHY WOULD YOU USE A MOVING IMAGE TO SHOW SUBTLE DIFFERENCES
>SUBTLE
it's smearing in real time, that's how it looks when you play
in both the before and after frames, everyone's legs are smearing or casting a black aura around their calves
yes, that's what DLSS is like and why everyone called it shit
?t=802
almost every instance has ghosting around moving objects
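The smearing everyone in this chain is describing is a property of temporal accumulation itself: each output pixel blends the current frame with reprojected history, so when the history is wrong (disocclusion, bad motion vectors), stale values trail behind moving objects. A toy 1-D sketch of the effect; the blend weight here is purely illustrative, not any vendor's actual value:

```python
# Toy 1-D model of temporal-accumulation "ghosting": a bright object moves
# one pixel per frame, and each output pixel blends the current frame with
# its (here: unreprojected) history, so the object leaves a decaying trail.

ALPHA = 0.1  # weight of the current frame; 90% of the old history is kept

def step(history, current, alpha=ALPHA):
    """One frame of exponential temporal accumulation."""
    return [alpha * c + (1 - alpha) * h for h, c in zip(history, current)]

width = 8
history = [0.0] * width
for pos in range(4):  # object sits at pixel 0, 1, 2, 3 over four frames
    current = [1.0 if i == pos else 0.0 for i in range(width)]
    history = step(history, current)

# Pixels the object already left still carry leftover energy: the trail.
print([round(v, 3) for v in history])
```

With correct motion-vector reprojection the history would follow the object and the trail would mostly vanish; ghosting shows up exactly where reprojection fails.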
>when the order of your spline >>> n
HD graphics and its consequences have been disastrous
Isn't that fake frames and fake gaming?
Is it the future yet?
Yes
>volumetric lighting + ssr + cube maps look about 80 percent as good
>no let’s tank our performance to a quarter and use it that way 🙂
I hate that these companies have extended development times and bloated budgets and still take every fricking shortcut
Thanks to the console influence.
did you even read the tweet, you PC peasant
it's far worse on PC than it will ever be on ChadStation
>tweet
It's called a xeet now
>we could spend money on making better cards
>but you homosexuals will buy anything with our logo stuck to it
>here's some new upscaling bullshit to keep the customers busy for a while
>lmao, if they say they're changing to amd just tell them the drivers are bad and screens go black or some shit like that, regardless of the fact their drivers are much more advanced than ours but who gives a frick, you're not poor are you? lmaaaaaaaaaaaao
I just bought a 7800 XT and I wish I had bought a 4080 instead, even though I'm on Linux.
i bought a 4090 and i wish i had bought a 7900xtx
Exchange them, then.
Fake resolutions. Fake games.
Not buying unoptimized Western dogshit ever again. Sorry not sorry.
Lies of P looks gorgeous yet it runs buttery smooth
It was made by a relatively small South Korean studio and it's their first single-player game. They previously worked on an overhaul/console port of Bless Online.
Bright Memory Infinite looks gorgeous yet it runs buttery smooth
It was made by a tiny indie Chinese studio - mostly one guy who made the prequel (demo).
the higher the base resolution, the better it looks even when upscaled, so that's not true
otherwise you'd just be rendering at 24p and then upscaling
And the higher the base resolution the worse the performance, which was the entire point.
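The trade-off these two posts are arguing about is easy to put in numbers. A back-of-the-envelope sketch; the preset scale factors are the commonly cited DLSS/FSR ratios, and shading cost is assumed to scale roughly linearly with pixel count (both assumptions, not vendor specs):

```python
# Back-of-the-envelope look at the upscaling trade-off discussed above.

TARGET = (3840, 2160)  # 4K output

PRESETS = {
    "native":      1.00,
    "quality":     0.67,   # ~1440p internal at a 4K target
    "balanced":    0.58,
    "performance": 0.50,   # 1080p internal at a 4K target
}

def internal_res(target, scale):
    """Internal render resolution for a given per-axis scale factor."""
    w, h = target
    return (round(w * scale), round(h * scale))

def relative_pixel_cost(scale):
    """Fraction of native pixels actually shaded (scale is per axis)."""
    return scale * scale

for name, s in PRESETS.items():
    w, h = internal_res(TARGET, s)
    print(f"{name:12s} {w}x{h}  ~{relative_pixel_cost(s):.0%} of native pixels")
```

Performance mode at a 4K target shades a quarter of the pixels, which is where both the speedup and the reconstruction artifacts come from.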
I thought the glorious PCMR despises console upscaling?
They made fun of consoles for years because of it.
They do, but that's DSR, i.e. downscaling (render above native, then scale down), not DLSS (upscaling)
Too bad many developers are fricking lazy, and the bad engines (i.e. Unreal) won't help at all.
>plus the bad engines (aka unreal)
DLSS' deformed poster child is Cyberpunk 2077 (also in the OP), and the recently released AAA turd that looks and performs even worse is Starfield. Both of these games are on their own engines: REDengine and Creation Engine respectively.
Meanwhile Unreal works like
, in the hands of competent developers.
>downscaling
It will choke the consoles to death instantly.
If how graphics work had stayed the same as in 2003-2005, by now we'd have hardware capable of 8x MSAA with spatial and depth-aware filters in real time.
But graphics don't stay the same, so
makes a good point, and
>being so moronic you've probably both posted about graphics realism and also posted about "muh optimization" in the same day dozens of times
just, fricking, lmao,
you clowns are moronic and will never understand
AMD fricked up so badly that Nvidia has gone uncontested for practically four generations now. The gap is so wide that AMD can't make it up, which leads to Nvidia setting the standard. So yeah, they're right.
It'd be nice if DLSS evolves into DLDS.
>journalists: you no longer need high resolution monitors!
>high resolution monitor sales plummet
>journalists: sorry nvidia, nothing beats having a 4k display.
>the company shilling AI in graphics says AI in graphics is the future
not even you can be this moronic, OP
their meme stock increase is insane, no wonder they want to keep pushing the AI meme
>Were they right?
No, they're not fricking right, we're witnessing the dying gasps of an industry that is literally incapable of sustaining itself. The Japanese and Chinese studios seem to have this mostly figured out, but Western companies literally cannot fathom the concept of optimization.
>Games are bloating up to the 100+GB mark regularly
>Games cannot run on modern hardware
>The solution to the above problem is to cheat the resolution, creating a product that looks infinitely worse than just playing at a lower resolution because you introduce vomit-inducing blur, ghosting, and visual inconsistency.
>Everything is being stuffed into poorly optimized jack-of-all-trades engines
>Everything is being outsourced overseas with absolutely zero QA.
Do not purchase AAA slop, do not give your time to AAA slop, do not waste disk space on AAA slop, do not engage with people who play AAA slop, stop watching "content creators", stop playing GaaS, stop rolling for .JPGs and just jerk off to the ripped images.
Fricking. Stop. And watch as everything falls the frick apart.
AI boosters literally asked for this. Reap what you sow.
>you need a $2000 GPU just to play games in 1080p medium settings
frick this. I'll stick with my PS4 Pro and just quit gaming. it's not fricking worth it and modern games don't really interest me.
Same for me, though I still have PC so can play lots of stuff old and new. I feel Dragon's Dogma 2 is my last big game because it feels like a struggle to stay on top of hardware requirements.
>I'll stick with my PS4 Pro and just quit gaming
Sounds like you never started gaming in the first place.
THIS ISN'T WHAT AI WAS SUPPOSED TO BE. WE WERE PROMISED ANIME breasts, NOT STRETCHED OUT 720P
at least shadows are looking crisp again. i still can't believe they can't do better lighting than a fricking Chronicles of Riddick game or FEAR or any fricking game that had programmed dynamic lighting
>have RX 6950 XT
>Games look shitty AND run at ~80fps on 60% the resolution of 4K (ultrawide)
MAKE IT STOP
>Native Resolution no longer matters
YESSSSS I LOVE 648P UPREZ INTO 1440P FRAMEGEN PLEASE JIZZ ON ME MORE
I tried these fancy render-scaling filters a year ago and they gave me noticeable input lag
Never used it again and lost interest. I just assume it will feel like floaty crap.
Why do I keep getting a black screen on Ubuntu when I try to update my nvidia drivers?
Oops sorry that was me, try again now 🙂
Yes. Most people just want the game to work instead of fiddling with settings. Setting DLSS to auto makes it just work.
Cringe.
480p textures upscaled to 4k will always be outmodded by people that care
It has been the case for a long time now. TAA and dynamic resolution.
In the near future we will be able to play old gb games like Pokemon Silver but have AI upscaling them to Red Dead Redemption 2 levels. We’ll be able to tell the AI even what features we want added to the game. Eventually it will get so good we won’t even need games anymore, we’ll just tell AI what sort of game we would like to play and it will make it for us. Then it will make us forget about our past
The PS4 Pro was able to render 4K 30fps with checkerboard rendering (50% of pixels rendered). The PS5 renders dynamic-resolution 4K (4K-1080p range), every frame of which is upscaled by FSR Performance (50% scale per axis, which is 25% of the pixels). Which means that while the PS4 Pro does a native ~1530p at 30fps, the PS5 only does a 540p-1080p render at 30fps. Considering that graphical fidelity has not increased much since the PS4 Pro era, and the PS5 is at least twice as strong in raw power, I can only conclude that developers simply stopped optimizing their games because FSR became available
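The arithmetic in that post actually checks out. A quick sketch, treating "equivalent native resolution" as the height of a full 16:9 frame containing the same number of shaded pixels, which is how the ~1530p figure falls out:

```python
import math

def equivalent_height(output_height, pixel_fraction, aspect=16 / 9):
    """Height of a full native frame with the same shaded pixel count."""
    shaded = (output_height * aspect) * output_height * pixel_fraction
    return math.sqrt(shaded / aspect)

# PS4 Pro: 4K checkerboard, 50% of pixels shaded  -> ~1527p ("~1530p")
ps4pro = equivalent_height(2160, 0.5)
# PS5: dynamic 2160p-1080p output, FSR Performance shades 25% of pixels
ps5_hi = equivalent_height(2160, 0.25)  # 1080p at the top of the range
ps5_lo = equivalent_height(1080, 0.25)  # 540p at the bottom

print(round(ps4pro), round(ps5_hi), round(ps5_lo))
```

So the claimed 540p-1080p internal range for the PS5 case follows directly from the stated scale factors.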
I'd say one of the structural problems with gaming today is that tools have gotten better over time, and hardware has improved as well, so now they can crank out GARGANTUAN games with awful memory leaks, way too much bloated memory use, horrible optimization and incomplete features, and the games will actually "work" well enough that they can then patch the game up to functionality without being lynched by the shareholders.
It used to be a thing that game designers were essentially coders first, having done stuff besides making games, where the tolerances for failure are comparatively low. If some company hires you to write them a program that does X and it doesn't work, you're in fricking trouble. So all of the people who worked at id, for example, were veterans from industrial programming and the like, people who actually knew what they were doing. You would get HEROIC efforts by these 4-digit-IQ people to scrape 50 kilobytes off of a program or to reach the extraordinary heights of 12 or 13 fps, and people would call you Nostradamus for doing it.
Because the technology is so much better now, the artisan skill required is vanishing, sort of like how there's nobody working at Disney, or even in the entire fricking film industry, who COULD produce a traditional animated film, because nobody at Disney has held a pencil since Treasure Planet flopped. Nobody actually knows how to code anymore.
AI is just a logical continuation of that process, but there's a hidden benefit, which is that eventually they'll figure out how to train an AI to clean up the code of the cut-rate outsourced coders they hire to do the grunt work on games these days, and suddenly everything will be hyper-optimized.
>eventually they'll figure out how to train an AI to clean up the code of the cut-rate outsourced coders they hire
The problem with that statement is that someone needs to check after the AI that the project compiles and works properly, and fix the bugs created by the code cleaner. And they will just hire the same cut-rate coder for that position, who will put all the shit back in
That's certainly a possibility, but I think what's more likely is that a professional class of people whose job it is to wrangle AI will arise, sort of like coding shamans, who get called in to train up some proprietary model to fix all of the optimization issues that get rolled out of whichever outsourcing shop the company hired to do the foundational coding.
you people are just Luddites; get on board or get dragged kicking and screaming into the future.
There IS no future. We are going to collapse and a new Dark Age will come over the Earth
The Luddites were literally right about every single thing they claimed. Look up the actual story of the Luddites: the government fricking killed people for thinking that maybe the giant textile factories, where the big "days since an 8-year-old was sucked into the automatic loom" sign has read 0 since the day the factory opened, were BAD.
This. Anyone using Luddite as an insult or condemnation is incurably moronic.
They were middle class professionals who worked from home and they were literally murdered by rich people for trying to maintain that lifestyle instead of going quietly into an employee relationship that compares unfavorably with actual slavery.
Someone has been ignoring the entire history. We are on our way to collapse, ask literally every historian.
Rome has to burn down one day, and there will never be another Rome ever again.
After a few centuries there will be great times again, but it's not going to be Rome, or anything like now either; something different, and we will be long gone.
>American projections.
I get the feeling it's going to be reminiscent of that disgusting dogshit that is motion smoothing, so hard pass for the next few decades from me
I hate Nvidia so much
Kind of. Most people don't even notice the difference between 4K and 1080p after a few minutes. If the fake resolution makes them think they're getting something better than they are, then it has done its job.
>company that went all in on ai wants to tell you how great ai shit is and how bad anything else is and wants to sell you ai shit cards
Kind of feels like 1080p is going to be the gold standard, like 60 fps. The diminishing returns are really apparent; it's just more resources for marginal gains. Yes, I know you spent a lot of money on your rig; 120 is a meme, shut the frick up. 4K is barely an upgrade, to the point that upscaling a 1080p image is the most optimal way to get a 1080p source to look pretty good on a 4K screen. I swear the 4K/120 chuds are the same types of people that think gold-plated HDMI cables produce the best results. The only nice thing is I invested in a 4K monitor, and if it ever dies I can just go back to 1080p, because boy was that a gamble that failed.
NVIDIA are a bunch of fricking scammers and they need to either be humbled or go bankrupt
I will simply stop playing. I don't care anymore
DLSS looks like ass, it's just an excuse for the competency crisis
I'm not buying a GPU with raytracing. If I want raytracing I should have special dedicated hardware, I shouldn't have to pay for something I'll never use.
Same goes for "neural" units.
I only play real games where the rendering is 100% controlled by the game developers, not some generic driver-provided filter.
None of the GPUs have raytracing; it's all software.
<No>
As a console chad, I can't stop laughing at the pcToddlers rn.
Their PCMR kingdom is in shambles and the copium huffing is in full effect
Sometimes I forget who I'm talking to and engage with consolewar posters. But then someone like you comes by to remind me just how pointless it is to discuss anything with literal morons.
Stay poor. Stay dumb.
I bet my dust collecting rig runs circles around yours lol
Perhaps. You'll be able to hit native 4K240 in Phantom Liberty one day, but by that time, more complex game worlds with much higher demands will exist.
No. Frick off with your copium tech.
it’s not like you have a choice
>muh 4K
>muh ultra
you’re not a poor peasant are you? surely the glorious PCMR only plays max settings
its over
quick question
does ray reconstruction work without frame gen?
I know it requires DLSS but what about FG?
yes, it does not need framegen
framegen is fine as long as your base fps is above 40 though, so i don't see a problem with it
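That "base fps above 40" rule of thumb is really about latency: interpolation-style frame generation has to hold back a rendered frame until the next real one arrives, so input-to-photon delay tracks the base frame rate, not the doubled output rate. A rough sketch; the one-base-frametime hold-back is the usual description of interpolation, and real pipelines vary:

```python
def frametime_ms(fps):
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def framegen_latency_penalty_ms(base_fps):
    # Interpolation needs the *next* real frame before it can present the
    # generated in-between one, so it adds roughly one base frametime.
    return frametime_ms(base_fps)

for base in (30, 40, 60):
    shown = base * 2  # doubled presented frame rate
    print(f"{base} fps base -> {shown} fps shown, "
          f"+{framegen_latency_penalty_ms(base):.1f} ms extra latency")
```

At a 30 fps base the added ~33 ms is easy to feel; at 60 it drops to ~17 ms, which is why the feature is tolerable only above a decent base frame rate.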