>implying
nice google pic saar
homie
is literally my gpu
>you vs the guy she told you not to worry about
I just got a 4070 Super
>2025
>not owning an NVIDIA® GeForce RTX™ 5090
Explain yourself
I will get a 5090. And my 4090, picrel, will serve in my vacation home pc. Is this a decadent thing to do?
yea
Well, so be it.
I own a 1080 and it runs the games I play; why would I throw money at a minor upgrade in graphical fidelity when I can gamble it away on gacha rolls, crypto and GME instead?
Because I have a 1080
AMD has always treated me better. my expensive GT card way back when burned itself out due to a Crysis bug and Njudea wouldn't honor my warranty. now I'm on Linux and everything Just Werks. no games worth playing beyond what I have now, either.
1. Resolution beyond 1080p will have zero impact on your fun and immersion.
2. It requires a lot of power and cooling.
3. The quality of gaming is deteriorating.
>1. Resolution beyond 1080p will have zero impact on your fun and immersion.
This is completely subjective. I'm at 1080p240hz because I care more about framerate, but I can absolutely see why some people prefer a bigger screen.
I’m waiting for the next flagship card
And my 1080ti is still working fine for the games I want to play.
Helps that 90% of modern games are shit and aren’t worth even looking at
my $300 RTX 4060 runs everything a $1400 RTX 4090 could
cope
I do and I use it to play stuff like this
>gedonia
Good taste anon
thanks
in a way gedonia kinda reminded me of playing runescape even though the two games aren't similar at all
my fps cap is usually set to 141 fps because of gsync (a few frames under the 144Hz refresh so the framerate never leaves the gsync range)
>That basic shit-ass game only getting 141 FPS with 4090
I don't know which is the saddest part here: buying these cards when they're this inflated, the state of video games, or how pointless new cards have become (worse than before, since they still carry an insanely high markup for marginal gains).
I already have a 3080 that runs every game fine and is overkill for stuff that's a bit older
not getting a single gpu once my 3090 goes obsolete or dies. it's literally not worth it as the gaps between quality games get bigger and bigger. it's vomit-inducing, inconsistent 30 fps, checkerboarded 768p console gaming for me in the future.
You could just get a low-end and set everything to low, still better than consoles
I'm not moronic enough to spend that kind of money on a card that needs fake frames lol
Idk just seems israeli so I don’t want to buy it, no I won’t explain further
Because I have a 3070 and it's good enough even with 8gb of vram.
i'm not wealthy, and financial constraints have caused me to become such a meme that i now play high-demand video games through my desktop monitor, using my laptop's 4050, all the while listening to the jet-engine fans struggle to keep the thing from torching itself while running a 2022 bloatcore game on medium settings
Based, reminds me of 10+ years ago when I was hogging my grandma's laptop to play on the i5's integrated graphics plugged to my monitor.
My wife and I were pretty fricking poor when we first got married. We spent a lot of time playing wow private servers on integrated-graphics laptops in a studio apartment, and that was probably the most fun I ever had playing that game; there's something to be said for limitations improving the enjoyment of something. Getting SNES carts from the pawn shop when they were dirt cheap was a lot of fun too.
t. grew up florida white trash
yeah, people are not really built for luxury. our enjoyment takes a nosedive. still worth it in most cases.
No games worth buying 4090 for.
B-but muh Cyberpunk on 4k with path tracing
I said games not a 1(one) game.
Uhhh
R-ratchet and Clank? A-Alan Wake 2? P-p-please care about RT at 4k...
>buying 4090 when 5xxx are about to come out
Bro...
My desktop just completely melted down and I'm holding off on getting a new one for this exact reason. Have a Deck, a hacked switch and a hacked 3ds, so I'm fricking set for a while, hoping the 5090 and the AMD 9xxx series cause some price reductions or microcenter deals to pop up. Probably massive copium, I'm sure; I will pay the GPU tax to Klaus one way or another
im poor
bump
>there are people who have expensive gaming PCs and the only thing they do is browse and shitpost on Ganker
You silly boys!
Fun fact: the 4090 has almost NEVER actually been available at the $1,600 MSRP. Right now the cheapest one is an
>MSI
$1,650 one.
I saw some $1600 models last year just after summer and right before the chinese export ban
to think that if not for that, 4090s would have gotten cheaper
They wouldn't. The used market got nuked in 2020, so no one is willing to lower prices there now that everyone knows how valuable a working gpu is. Only AMD is willing to lower prices; Nvidia is keeping the same prices and lowering supply.
Why is it still over $1,599 so late in the cycle?
I'm a poorgay, am I welcome here?
welcome poorbro
imagine only having 16gbs of vram
Has a game ever used more than 16GB VRAM?
More and more (sony) console ports are hitting 10-15gb vram. Expect it to be the norm by 2026
Other than Ratchet and Clank which was previously mentioned, like what?
it's gonna take like 5 years until games need more VRAM than that, at least until PS6 comes out.
Literally never seen a game use more than 8GB
Ratchet and Clank at 4k went to 14GB; that's the most I've seen. Most games stop at 11-12GB
>4k
That'll do it alright
Reason #521 why anything over 2k is silly
>imagine only having 16gbs of vram
How many people here are actually using the 4090's 24GB VRAM to its full potential?
weg devs
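If you want to see how much VRAM your own games actually touch, here's a minimal sketch using the nvidia-ml-py bindings (assumes an NVIDIA card and `pip install nvidia-ml-py`; plain nvidia-smi reports the same numbers):
```python
# Minimal sketch: poll current VRAM usage via NVML while a game is running.
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU
    mem = nvmlDeviceGetMemoryInfo(handle)
    gib = 1024 ** 3
    print(f"VRAM used: {mem.used / gib:.1f} / {mem.total / gib:.1f} GiB")
finally:
    nvmlShutdown()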
It's OK i only have played p3r/p4g/p5r and elden ring since I got my 4080S
you should go back 10 years and see how companies lied to us
>64 RAM GDDR 3
>4 WAYS SLI
>1300 W PS
me too
to play what? AAA slop and indies with homosexuals? no thanks
i will keep playing old games and emulate
I hope the 5090 will be even more expensive to make most of you seethe even more.
have mercy on my wallet man
I'm probably buying 2 of them so I hope it's not that much more
>already have an RTX 3080 OC ASUS TUF
I am waiting for the RTX 5090
>Heatsink causes my house to sink below ocean level for optimal water cooling
Sorry goyim, my 1070 still works.
But I do
>500W only for the GPU
yeah look, I just don't want my house to burn down
it must be worrying living in a run-down shack where anything more than a usb charger can cause fires
you can power limit it to under 300w and still keep ~90% of the performance
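That's a one-liner with nvidia-smi (needs admin rights). A minimal sketch of doing it from a script, assuming nvidia-smi is on your PATH; -pl sets the board power limit in watts, and the valid range depends on the card's vBIOS:
```python
# Sketch: cap GPU board power via nvidia-smi (requires admin/root).
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

set_power_limit(300)  # e.g. cap a 450W-class card at 300W
```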
why bother, it's not like he can afford it
I goon to your setups btw
>one fan
lol
spotted the homosexual who has never seen a 4090
I don't see much point in spending over $1599 plus tax on a 4090 when the 5090 is right around the corner.
4070TI super is good enough
>tfw evga still dead
i only have a 1080p monitor so i settled with a 4070 instead.
Because it won't fit on my MATX case and it won't work with my 650w PSU. Meanwhile this beauty is the only thing I'll need for the next 6 years (I'm going to use it for 1080p through and through)
I'm still at 1080p
It costs more than an entire Indian village
Frame gen FSR works on my 3070, eliminating any need for a new GPU
I'm fairly certain it won't fit in my case, and even if it does, it might frick up my motherboard as well
>1660
>plays elden ring perfectly fine
why would I upgrade?
Have a 3080, DLSS is bretty good. Don't really feel like I need another card anytime soon for vidya, if newer games add support for it.
That being said, I am greatly considering a 5090 just for stable diffusion smut rendering. Vidya will just be icing on the cake.
Which version of the 4080 Super do I get?
Cheapest one you can find, avoid the TUF
>avoiding the only one with 2 HDMI ports when the Display Ports are gimped
absolutely moronic
It is actually a little bit weaker than the FE, enjoy your gimped 4080
I don't live in siberia so I don't need a compact heater and fire machine
If the 4080 can be $200 cheaper with the Super variant, I don't see why the 4090 can't also get a price drop.
why would they drop the price of something that is always sold out? They only dropped the 4080 Super's price because no one gave a shit about the 4080
And play what on it, Gay Anal Parasites 9?
>Be me
> Buy high-end GTX TITAN XP
> Come here on Ganker making threads
>TFW No one gives a shit
> Everyone talking/making threads about 1060TI,1070 and 1080
>felt like an outcast douchebag
> As years go by
>got the RTX 3080
>NOW Everyone talking/making threads about high RTX 3090 to 4090
you gays are buttholes
I'm broke, homie
*SHADER STUTTERS IN YOUR PATH*
At least whatever studio is doing the PlayStation ports has figured this out
one of the benefits of directstorage is that asset decompression happens on your gpu instead of your cpu, which frees the cpu up for things like shader compilation during loading
I mostly went for a 3060 12GB, since at the time the 4000 series wasn't exactly out, but after hearing rumors about how bad it was gonna be... and seeing how bad it turned out for anything that isn't a 4090... I just spent the money on other parts when I was building my PC.
i own superior 7900xtx
4090 is meme for "lametracing" and fake 4k
with a 7900XTX you can play 4k native on ultra with no shitty upscaling
what is the 7900XTX superior at?
more VRAM, better native 4k and image quality
nvidia gays can't imagine what it's like to play games with sharp graphics and no upscaling
More like AMD gays can't imagine what it's like to have an upscaler that isn't dogshit.
proving my point: playing shitty, unsharp, upscaled shit instead of playing games how they're supposed to be played, at native resolution
plus you illustrate why nvidia is shit - its native pic is not "native", nvidia has always dropped render quality to cheat fps
plus:
https://www.amd.com/en/products/software/adrenalin/radeon-image-sharpening.html
I turn DLSS on even when I don't need extra performance because it looks nearly identical, if not better, and uses less power. I don't know what point you're attempting to make. I've seen DLSS on/off and choose to keep it on.
well, you are moronic
You shouldn't be upscaling at all, you should be DOWNSCALING, rendering the game at 150% resolution
and you should be buying a good GPU , but you didn't
you're probably one of those who says there's no difference between 1080p and 4k...
You seem like one of those who has never done an actual comparison of DLSS on/off and just likes making shit up. I have done a lot of testing myself; DLSS on is clearly the better option in most scenarios.
>posts a screenshot on which it's obvious the ground on the right side is unsharp and inferior to native
Running native doesn't mean you run without AA, moron
>obvious
Less obvious than the shimmering gate, which will still shimmer no matter what you do without DLSS. AA is just a worse version of DLSS
Textures clearly look worse with DLSS in that pic.
They look like shit in both versions. You would not notice the difference if you were actually playing the game instead of staring at a zoomed-in screenshot
>1440p
>looks like this
yes. 1440p DLSS does look like shit. So does native 1440p
what do you have against 1440p
1440p is quite blurry vs 4k.
DLSS 1440p is much worse than DLSS 4k.
Thankfully I enjoy playing video games instead of worrying about how many pixels are on the screen
If that was the case you'd still be using 720p
720p screens were never all that common for any extended period of time; 1080p became the norm pretty quickly once 16:9 settled in.
it doesn't matter. You said you don't worry about pixels, so why did you upgrade to 1440p? You're full of shit. You upgraded because it looked better, and 4k looks better again.
HDR and refresh rate are more important than resolution. They don't make good 720p monitors with those features
>high refresh rates and hdr don't exist at 4k
is this where we are? lol
1440p monitor with dldsr to render at 4k is kino though
why did you feel the need to put a red circle in there to distract from your flawed, awful-looking garbage technology?
The camera was moved slightly, which changes the reflection. Are you a moron?
Because I'm an actual PChad and I won't waste my money when I have a 3090ti
What games should I play with it?
broke and currently cucking on a constantly crashing amd radeon gpu
But I do have one, OP
Made a new rig. Previous was:
i5-4440, GTX 970, 16GB RAM dual channel, and it was getting by quite decently even up until now. New build: AMD 7800X3D, RTX 4070S FE, 64GB RAM, 4TB NVMe, and a new 1440p HDR G-Sync monitor.
I think I'll be just fine for another 10 years on 1440p.
Graphics are overrated in gaming anyway IMHO.
But I have to admit, this HDR thing I have now, that is nice and impressive. Other than that, fine, I'll take the extra FPS and pixels, but whatever.
Also, with chips already at 5nm, how much smaller and better is it going to get? I think the rush for better graphics in gaming is finally, finally coming to a sort of end, and maybe, maybe, other aspects can get more attention.
I'm not cattle
I do! Since day one, in Oct 2022. What do you think I am? An AMD GPU owner? LMAO!
No one is actually an AMD owner. AMD pays shills to post on forums because the ratio of AMD shills to people who actually own their GPUs is way off.
even reddit was agreeing with it today. amd gays just tell others to buy amd while they themselves get nvidia cards
Too expensive. Where I live the RTX 4090 is around 1800 dollars and my monthly salary is around 1150 dollars as a software tester
got a 1080p monitor
I'm poor. 🙁
>t. 1080 + i7-4790k
How did you afford those high end parts back then?
Shit was way better back then.
What games are worth playing on a 4090?
I spend all my time on the pc watching anime or playing indie shit. Cyberpunk worked fine on a 2080 last year. There is nothing more graphically intense than that unless you want to play a very shitty unoptimized ark.
The problem is the lack of games; we need more unreal engine 5 stuff for a 4090 to be worthwhile, and by that point I'd rather get a 5090, which also won't have many games.
It's not enough to play PC games at 1440p with 100+ FPS, so there's no point until the 5000s come out to see if maybe that will be enough.
>modded Minecraft
>modded Arma
>heavily modded older games like Skyrim
>other modded 'tism games
Even Rimworld gets crazy when you go overboard on mods
Aren't those more CPU-bound use cases?
What's your Superposition benchmark score 1080p extreme Ganker ?
Pic related is 7800X3D, RTX4070S (stock, stuff in background too, I don't overclock or anything)
> https://benchmark.unigine.com/superposition
14854
Post a screenshot, or allegations of doubt might find their way to you. And otherwise: what specs, or how much more money, does it take to get such a nice, significantly higher score than mine?
i winz
Damn Son!
my amd card
realized i have an fps cap on so maybe i should run without?
no better
it's fine. it's also interesting that the 4090 gets almost 2x the fps of a 6900xt.
but i don't think it's worth the markup
>update drivers
>win
am I the new king of Ganker?
I do have one, but I don't know how to get the most out of it outside of running all my games on maximum settings.
it's okay. it's just too bad i'm only trained to play multiplayer sloppa
For what? New games that are shit anyway? Why waste the money?
hundred replies zero games
yeah, totally worth getting a 4090
also can I recommend the playstation 5
>can i recommend the 1060
lmao @ console peasants
we'll get to 300 replies with no mention of games; this is Ganker
Why would you recommend a PS5 if you want games? No you can't recommend one.
do games only count when they're on another system? if that's the case then the last PC games were made about 10 years ago and all that remains is indie shit
>b-but
ps5 has less games than the xbox, cope snoy
You see the post above yours, that's my PC.
I'm saving up to unironically splurge on a 5090ti or whatever the best is with the most VRAM so I can generate AI pornography of VIDEO GAME characters faster. and with more detail. My 3070ti is good but it could be better
Is the AI porn thing video or images? I honestly don't know, haven't dipped my toe in AI yet.
With enough VRAM both. Surprisingly you can generate AI pr0n with like 4gb of VRAM (I have 8gb). Generating GOOD AI video is slow and requires a ton of VRAM. Static images you can do on weaker hardware.
I only fap to "my own stash" now; it's great.
And I'd post what I've genned thus far, but I'd be banned on sight by sehr jahnitor.
>haven't dipped my toe in AI yet.
You should try it. If you're not computer illiterate, it's easy. If you have a weak GPU (less than 6gb VRAM) you can install Forge:
https://github.com/lllyasviel/stable-diffusion-webui-forge
If your GPU is very powerful you should install Automatic1111, which is kinda what most people use, and has the best support.
https://github.com/AUTOMATIC1111/stable-diffusion-webui/releases/tag/v1.0.0-pre
REALLY shitty GPUs need to use ComfyUI, which is very powerful but a little more complex:
https://github.com/comfyanonymous/ComfyUI
And get models (the specific data the AI uses to generate from) from CivitAI. NSFW models (porn, furry, etc) require an account but they're there. I highly recommend trying PonyXL or AutismXL :^)
https://civitai.com/
There's obviously a gigantic world of extensions and workflows to link you to, but I won't overwhelm you :^)
Enjoy your crippling porn roulette addiction.
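For what it's worth, the webuis aren't the only way to run this locally. A minimal txt2img sketch using Hugging Face's diffusers library (my own addition, not something the anon above recommended; assumes torch and diffusers are installed, and CivitAI-style single-file .safetensors checkpoints load via from_single_file instead):
```python
# Minimal local txt2img sketch with diffusers (an alternative to the webuis above).
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
)
pipe.to("cuda")  # low VRAM? call pipe.enable_model_cpu_offload() instead

image = pipe(
    prompt="1girl, solo, beach, swimwear, outdoors, ocean",
    negative_prompt="lowres, ugly",
    num_inference_steps=28,
).images[0]
image.save("gen.png")
```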
>Generating GOOD AI video is slow and requires a ton of VRAM
A shitty card can generate like 10 images per minute in stable diffusion, so I guess that means 3 minutes for 30 frames which would be about 1 minute of video?
nice math, shart
Oh yeah, something I want to add if you're thinking of getting into AI:
Everything I recommended is FREE and OPEN SOURCE. You should only have to pay for your AI sloppa if you have a weak GPU and need to use a service, imo. Nothing wrong with that, just not preferable if you can do it locally on your own shit, right?
To add onto this: I have 8gb of VRAM and use Forge. This is specifically so I can use XL models, since XL models take more VRAM (and memory) to load. Forge is optimized for faster loading and making more use of your VRAM. Automatic1111 is perfectly acceptable for 90% of users. Forge is less supported but looks (and functions) similar to Automatic1111's UI; it's shrimply better with VRAM. I still recommend you try Automatic1111 first before Forge or ComfyUI.
t. knower
I really don't know. There ARE video extensions for generating video in stable diffusion. This is only one example:
https://github.com/Scholar01/sd-webui-mov2mov
Forge has a whole TAB for AI motion graphics.
BASED
How would one even use it though? Just type shit like "tifa in bikini on a beach" and wait for the image or something?
Or do you need to add some reference pics for that?
The only time I used AI was copilot to write some basic python scripts because I'm too lazy to learn programming.
Never used stable diffusion and whatever else.
>How would one even use it though? Just type shit like "tifa in bikini on a beach" and wait for the image or something?
Yep. For Forge and Automatic1111 there's a positive and negative prompt box. You put what you want in the image in the positive prompt box, and what you DON'T want the AI to generate in negatives.
The model I use is tag-based, so I put tags from boorus into the positive and negative prompts.
If I want Tifa on the beach, I might use these tags in the positive prompt:
>(tifa on the beach:1.1) | the parentheses and :number tell the AI to add emphasis to this part of the prompt during generation. The :number goes to 1.9
>beach
>swimwear (or whatever you want to see her wearing)
>outdoors
>ocean
>sfw
And in negatives:
>ugly
>topless
>(underwear) | again, notice the parentheses to add emphasis. I do NOT want the AI to confuse bikini with underwear and the parentheses tell the AI that.
That is a VERY loose example.
>Or do you need to add some reference pics for that?
No, that's what img2img is for. Typically a workflow might look like:
>txt2img to get base image
>send base image to img2img tab to upscale and refine.
>Never used stable diffusion and whatever else.
XL Models will surprise you. Otherwise, you'll still get okay-ish results, you just need to be a little patient. Make use of textual embeddings for the negative prompt (think of textual embeddings as bundles of negative prompts) and LORAs for the positives.
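Worth knowing: launch Automatic1111 (or Forge) with the --api flag and that whole txt2img-then-img2img workflow becomes scriptable over a local HTTP API. A rough sketch; the endpoint and field names are A1111's, but the payload values here are just illustrative:
```python
# Rough sketch: drive A1111/Forge txt2img over the local API (launch with --api).
import base64
import requests

payload = {
    "prompt": "(tifa on the beach:1.1), beach, swimwear, outdoors, ocean, sfw",
    "negative_prompt": "ugly, topless, (underwear)",
    "steps": 28,
    "width": 832,
    "height": 1216,
}
resp = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
resp.raise_for_status()

# Images come back as base64 strings; img2img lives at /sdapi/v1/img2img.
with open("tifa.png", "wb") as f:
    f.write(base64.b64decode(resp.json()["images"][0]))
```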
>Post gens
I would but I HAVE been attacked by Janny-kun before
Oh yeah, I forgot to add, the tags are comma separated. So they'd look like this in the positives:
>(tifa on the beach:1.1), beach, swimwear, outdoors, ocean, sfw
Apply the same to the negatives. Add or subtract emphasis with (:number). You can also do (breasts:0.6) and have "huge breasts" (no quotes) in the negatives for really small breasts. You can do "flat chest" in positives, "breasts" in negatives for flat characters. There's a trillion ways to do it. It's fun.
>Not: 1girl, solo, tifa on the beach, beach, skimpy, sling bikini
ngmi
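And if you ever build prompts from a script, the (tag:weight) emphasis syntax above is plain string formatting; a throwaway helper (hypothetical, nothing official):
```python
# Throwaway helper (hypothetical): build comma-separated booru-tag prompts
# with optional (tag:weight) emphasis, as described above.
def tag(name, weight=None):
    return f"({name}:{weight})" if weight is not None else name

positive = ", ".join([
    tag("tifa on the beach", 1.1),
    tag("beach"),
    tag("swimwear"),
    tag("outdoors"),
    tag("ocean"),
    tag("sfw"),
])
negative = ", ".join([tag("ugly"), tag("topless"), tag("breasts", 0.6)])
print(positive)  # (tifa on the beach:1.1), beach, swimwear, outdoors, ocean, sfw
```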
>whatever the best is with the most VRAM
Then you need to get to the datacenter catalogue and get $10,000 ready.
Just got me a 4070 super, pretty happy with it.
Because I bought a 4070 and that's fine
I haven't built a PC in 8 years and I'm buying a 5090.
I have a 3090
I am a third-worlder so all i could afford without putting a dent in my pocket was the RTX 3060 Ti
Good morning saar
I am not a canadian though
I suspect 50% of Ganker to be on:
> i7 4770K
> GTX 1060 maybe 3060?
> 16 GB RAM
And there is nothing wrong with that.
Just have fun Ganker
<3
>Explain yourself
why would I spend that much money when all AAA games that would need it are pozzed garbage?
my pc can't even handle the 6 gb 3050. it fricking freezes at random.
Because I don't play modern game slop and anything above 1080p 60fps is a meme.
>anything above what my poorgay shitbox can run is a meme
Yes
Starting with Pascal, the even-numbered nvidia gens are skip gens. The 10 series was an unprecedented leap in performance, so Nvidia made the 20 series shitty. The 30 series was a big performance gain over the 20 series, so the 40 series was a wet fart barring the 4090. Following the trend, the 50 series will be great and the 60 series will either be a very marginal improvement or worse.
Came from 970
And wanted to upgrade now, not wait longer. 4070S. Somewhat decent, so I upgraded.
Do I care about the 50-series maybe being 'great'? Not really; I'll look up some GPU charts again in another 10 years.
It's all so tiresome.
I'm just waiting for 5090
My 2080 I bought day 1 still runs every new game perfectly, but I'm worried that it'll just drop dead one day because it's getting pretty old for a GPU.
why is 4k score and fps higher than 1080p?
"optimized" is like low/medium settings
extreme is well...extreme settings
8k Extreme give me that AMD running Cyberpunk with path tracing experience
>72 degrees
i keked
>ps5 removing 8k from box art
YOU LIKE ANIME?
IM ANIME ALL OVER
BECAUSE IM EXTREEEEEEEEEEEEEEEEEEEEEEEEEEEEEME
>2024
>there are people still playing on 1080p
>the same resolution from 20 years ago
>1080p in 2004
my crt was 1440p newbie
i have a 3090, but the only games I could stomach playing for more than an hour that actually pushed the hardware were Helldivers 2 and Cyberpunk; everything else I've played since buying this thing could've run on my old 1070
>Cyberpunk
>pushing hardware
i don't fricking know what you're smoking, but Cyberpunk runs on a 1060 6gb on ultra at a stable 70-80 fps
path tracing, dummy; activating it makes Cyberpunk the worst-performing game there is
at what, 720p? my 1070 could barely handle it at 1080p medium
>buys a card that clearly says 1070
>tries to play at 1080p instead of 1070p
i'm not being paid to shill on Ganker, unfortunately
i'm open to taking the position though, fax me the details
https://investors.micron.com/news-releases/news-release-details/micron-samples-next-gen-graphics-memory-gaming-and-ai
2024 NOT OWNING RTX 50 GDDR7
NGMI
I can't wait for the 5060 to have 4gb of GDDR7.
>just bought a 4070 super
>5000 series has the big gains and my card will already be a generation behind
NOOOOOOOOOOOOOOOOOOOOOO
>buy a new 4070
>play Dredge
buy an ad shill
because I have a 3080ti and will be waiting for the 5080ti
xx90 cards are overkill
I can just buy a 5070 TI soon that will be the same performance for half the cost
Will this get the job done?
I don't want to run everything on Ultra settings anyway.
Weak performance. 6750 XT and A770 are going for the same price and perform better. 4060 is also faster but it's a vramlet.
4060 is better for games. Less VRAM, sure, but the GPU core is too weak to drive that much anyway.
Buy an ad, homie
I'm waiting for the 7070, then I'll get an 8080, then a 9090.
Why would I? A 4090 can't even hold a stable 60fps in the new Test Drive demo. lol lmao
Unless your old card died or something there is no reason to get that over waiting for the 5000 series.
Bros, I have a 3080 12 gb.
Realistically how many more years do you think I'll be able to hold off on an upgrade?
Was thinking until the 6000 series or whatever the AMD equivalent is.
But I'm not gonna lie, the 5000 series has me tempted to see if I can make the jump to 4k. Obviously only going to wait until benchmarks are out.
zero (0) games
>1080Ti
>Memetracing off
>Runs elden ring on high at 60fps
Why should I upgrade again?
>it's an "autist has nothing else to spend his neetbux on" episode
I have a RTX 2070?
im from india