nvidia will just shift the stack up a tier so lower cards will get the better sku like 5070 getting the full xx104, they will add more vram and keep prices exactly the same
and people will love it and say it's value and buy the shit out of it like they did with Ampere
nvidia has been doing the "flip flop" for over half a decade now
AMD is an american company run by a taiwanese woman.
Nvidia is an american company run by a taiwanese man.
Intel is an american-israeli company run by garden gnomes
Intel still has the best architecture. AMD is clever by using stacked L3 cache but Zen4 is still inferior to Raptor Cove, and honestly even Golden Cove, since Raptor Cove is just GC with extra L2 and no ring latency penalty when going between E-cores and P-cores. And don't forget that Intel is a full node behind. In Intel's defense, TSMC is sponsored by the Taiwanese government and gets big bucks from Apple, while Intel has none of these benefits, and to their credit they used to be two nodes behind until a while ago, so at least there's some progress lol
Intel's architecture is old and outdated.
The only thing they master is making it survive intense heat and power drain.
AMD chips can do more with less, but you can't put them through hell without killing them.
Intel only managed to keep up with AMD in recent years by allowing their chips to bruteforce the benchmark with sheer energy.
AMD easily wins any value and efficiency battle, but Intel knows that enough people will just buy the winning chip - and if that's Intel with 1-2% more performance at 30% higher energy cost and twice as hot, they'll still buy that.
Intel must be scared shitless. CPUs have reached a point where, outside of very, very specialized use cases, there's no real reason to upgrade even an 8+ year old CPU.
video cards are quickly approaching that point as well.
i'm guessing that the future will lean way more towards SaaS offerings to maintain the paypiggie revenue streams.
>a lack of airflow was routinely making the card hit the 110 junction max
What 7900 XTX do you have? Reference? Reference 7900 XTX had an issue with faulty vapour chambers causing the Tj to spike into oblivion.
At what wattage did you hit 110 Tj? Even with complete dogshit airflow I can't see the card getting that hot below 350W unless you have an ultra silent fan curve.
XFX merc 310; yeah i read about that heat issue, in my case it seemed like a legitimate airflow issue, after installing the fan it hasn't gone above 95c and all is well in the world.
yeah, for now.
they're losing the server market to AMD, consoles are all AMD, consumer purchases are slowing down, and they can't make a decent GPU to save their lives.
dunno, long term you don't see them being knocked down a peg or two?
>yeah, for now.
>they're losing the server market to AMD
2 more years 🙂
>consoles are all AMD
>they can't make a decent GPU to save their lives.
Neither of these has ever been a market for Intel in the past.
>dunno, long term you don't see them being knocked down a peg or two?
Because they aren't lmao. Only a delusional retard thinks AMD is going to make Intel go bankrupt or out of the game. Intel has always made all of its money via OEM and still continues to do so. AMD will never be anywhere close in comparison.
>and they can't make a decent GPU to save their lives.
I dunno wtf you're talking about but the Intel Arc cards are pretty good value cards right now. Probably the best bang-for-buck if you're on a tight budget.
I'm cautiously optimistic about Intel Battlemage.
i'd like to see intel do well with their GPU efforts too. Having some more competition and competition on price is nothing but an upside for consumers.
i could be wrong, but aren't they on par with like a 2060 performance wise?
>Intel must be scared shitless
Intel is FUCKED. Krzanich fucked up big time and set Intel foundries back 4 years. Swan did basically fucking nothing and Gelsinger is beyond incompetent (as a CEO, he's a brilliant engineer) by focusing on things that don't matter. Their entire GPU division is a massive money sink that will not see dividends, ever. AMD is destroying them in the server market because of MCM despite Intel having better uarch than them. Their shares took a massive nosedive in the past few years.
Unless they change their strategy they will never match AMD in the server market, and the desktop market is pennies in comparison. I have no idea why they entered the GPU business; it should be beyond obvious that no one will ever catch up to nvidia. Not even Apple, with their unlimited money hack, the best talent in the industry and priority access to TSMC's latest node, managed to.
Do you people even know what you're talking about?
Unless they change the memory bus and thus make it a fundamentally different GPU they can't give us 12GB on a hypothetical XT model. They could double the 8GB and give us 16GB on a hypothetical RX 7600 XT, but not 12GB.
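To put a rough number on why the bus width locks in the capacity: GDDR6 chips hang off 32-bit channels and commonly come in 1GB or 2GB densities, with clamshell mode doubling the chip count. A quick sketch with those assumptions baked in:

```python
# Rough sketch (assumes 32-bit GDDR6 channels and 1GB/2GB chip densities).
def vram_options_gb(bus_width_bits, chip_capacities_gb=(1, 2)):
    channels = bus_width_bits // 32            # one chip per 32-bit channel
    options = set()
    for cap in chip_capacities_gb:
        options.add(channels * cap)            # normal layout
        options.add(channels * cap * 2)        # clamshell: two chips per channel
    return sorted(options)

print(vram_options_gb(128))  # [4, 8, 16] -> a 128-bit card can do 8 or 16 GB, never 12
print(vram_options_gb(192))  # [6, 12, 24] -> 12 GB needs a 192-bit bus
```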
The only games that need more than 8GB of VRAM minimum are shitty games that aren't worth playing anyways. Graphics gayry is inversely proportional to how good/fun a game actually is.
>but what about muh Hogwartz Legacy, Cyberplunder and TLOU Part 2
If you care about those trashfires you belong on /r/gaming. Not Ganker.
All the games that shat the bed at 8 GB VRAM or lower were western AAA trash of the worst caliber.
The only games that need more than 4GB of VRAM minimum are shitty games that aren't worth playing anyways. Graphics gayry is inversely proportional to how good/fun a game actually is.
>Muh shared memory
All the games shitting the bed were on PS4 as well. That argument doesn't even matter. The games were just incredibly shitty ports and the reason why we keep getting those isn't the VRAM no longer being enough - It's publishers not giving their devs enough time to optimize anything - they just throw everything onto the market to hit the numbers for their next financial report.
While some ports are indeed inexcusably bad (looking at you, Jedi Survivor), I do think devs get more flak than they deserve. The fact of the matter is that optimizing for PC is much harder than for consoles, especially when Ngreedia and AMD keep being stingy with their VRAM and gamers refuse to upgrade their hardware because the newest hardware is too expensive.
Another party to blame here is the Unreal Engine 4. We shouldn't forget about that clusterfuck of an engine. The one thing that engine had going for it is that it made game development easier than ever, yet made optimizing games more difficult than ever. The Unreal Engine 4 is really starting to show its age, and simply cannot handle what devs are trying to do now in 2023, neither on PC nor on console.
Seriously, we often bitch about PC ports being garbage, but most of these games do not run any better on consoles.
Now of course the simple solution would be to just make games less graphically demanding, but then you'd have idiot gay zoomers complain that the game looks like crap, despite their shitty PC not being able to run the game on 'max' graphics anyway.
>AMD broke CPU stagnation and forced core counts and IPC to go up while maintaining affordable pricing, everyone won
>AMD does fuck all in the GPU space and it's still stagnant
>8GB
Are they retarded
You are the retarded for listening to the k*kes tricks about 8GB vram "is not enough"
Nice Dunning-Kruger replies
Aren't 8gigs OK if you have PCIe 4 for your RAM?
No. The newest GPUs don't even use PCIe 5.0, not even the RTX 4090. They work on a PCIe 5.0 motherboard because those boards are backwards compatible (for GPUs, not CPUs) but the GPU won't perform any differently on the PCIe 5.0 board versus the PCIe 4.0 board.
Watch this video if you want a super in depth explanation: https://www.youtube.com/watch?v=v2SuyiHs-O4
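For a ballpark of why the slot generation can't stand in for VRAM either (theoretical peak link speeds; the 288 GB/s VRAM figure is just an assumed example for a 128-bit GDDR6 card):

```python
# Ballpark comparison: PCIe x16 link speed vs typical on-card VRAM bandwidth.
PCIE_X16_GBS = {"3.0": 15.8, "4.0": 31.5, "5.0": 63.0}  # GB/s per direction, theoretical
VRAM_GBS = 288  # assumed: 128-bit GDDR6 at 18 Gbps

for gen, bw in PCIE_X16_GBS.items():
    print(f"PCIe {gen} x16: {bw:5.1f} GB/s vs VRAM ~{VRAM_GBS} GB/s ({VRAM_GBS / bw:.0f}x slower)")
```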
I have an 8gb 1080 and I can tell by how textures load in late that it isn’t enough anymore
Not that the anon you're replying to is correct (inherently), but the reason they take so long to load is every dev has completely removed all precaching, or they try to bake it in at menus and just... stop doing it when you load in, to load as you go (leading to HORRENDOUS popin)
more vram is beneficial in this, but by and large, it's beneficial at higher res. 8gb is enough for 1080p, but 1080p is the clueless normalfag's res of choice, now.
Guarantee you barely even knew what vram was 2 years ago
yeah, cause games weren't crazy vram dependent. now 12gb of vram is required to run 30fps
What changed about games that they need VRAM now?
Console game ports have ridiculously big textures for some reason.
VRAM size has very little to do with framerate. you either have enough or you don't. if you don't, textures stop loading or the game starts stuttering, NOT running consistently at a lower framerate.
in reality the issue is solved by reducing the texture setting, the render resolution, lowering shadow resolution, things like that. and before devs started relying on the 9th gen consoles as their new target hardware we never had any problems with 6GB VRAM, let alone 8GB. it's just lazy console ports again, multiplat gaming never changes.
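Some rough arithmetic on why those particular knobs (texture quality, render resolution, shadow resolution) are the ones that free up VRAM; the target count and formats below are assumptions, not measurements from any specific game:

```python
# Rough VRAM arithmetic with assumed formats; real engines vary a lot.
def render_targets_mb(width, height, targets=5, bytes_per_pixel=8):
    # a handful of G-buffer / HDR targets kept around per frame
    return width * height * targets * bytes_per_pixel / 1024**2

def texture_mb(side, bytes_per_texel=1.0):  # ~1 byte/texel with BC7-style block compression
    return side * side * bytes_per_texel * 1.33 / 1024**2  # +33% for the mip chain

print(f"render targets at 1080p: ~{render_targets_mb(1920, 1080):.0f} MB, at 4K: ~{render_targets_mb(3840, 2160):.0f} MB")
print(f"one 2K texture: ~{texture_mb(2048):.1f} MB, one 4K texture: ~{texture_mb(4096):.1f} MB")
```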
Name one game.
8 gb is still ok for 1080p
which is the point of this card
GTX 750ti was "still ok for 1080p"
Demands have risen. 8gb is not okay.
Studios put out unoptimized piles of shit so nvidia can market overpriced weak video cards all thanks to fucking dlss which is what they use to measure and market their newer video card's performance. The only two cards worth buying from that line up is the 4080 or 4090 and even then they're still overpriced to hell and back.
My rx480 had 8gb of vram 7 years ago
There's absolutely zero reason newer cards should release with the same amount of vram in 2023
no game came close to utilizing the full 8gb in 1080p 7 years ago
While I do agree there should be some scaling up, I think inflation is eating into their big margins and they still want to keep them. That's why they're skimping on VRAM.
deus ex Mankind divided and FFXV did
The base requirements move upwards, therefore the overhead should as well.
Yeah, but there are games (and not shitty disgusting ports like TLOU) that already require more than 8GB of VRAM for 1080p, like Forza Horizon 5, and I don't remember what other game I saw that exceeded 8... but if Forza, which is a game from a few years ago, requires more than 8GB, why is it unreasonable to think that the death of 8GB is near?
>2023
>1080p
Lol
Shills with lowlife manipulation tactics. Calculated people like you might lose their soul; hope it was worth it and you're "happy".
I can definitely tell the difference between 1080p and 1440p. Framerates after 60 Hz are a placebo though, so 60 Hz is the same as 144 Hz.
k, you get a pass. for now.
> 60 Hz is the same as 144 Hz.
Shut your trap you are blind
>Framerates after 60 Hz are a placebo though
you're either playing with a controller or you've got learning disabled levels of hand-eye coordination.
To be honest I cant tell the difference between 60hz and 144hz when gaming. Going from 60hz to 200+Hz might be a different story.
>60hz and 144hz are the same
Are you blind or something?
probably tested 144fps on a 60hz display OR the game didn't reach 144fps on a 144hz panel
or a very slow paced game with camera panning slowly
or just very obtuse and lying
to be genuinely unable to notice higher fluidity due to better refresh rate your brain would literally, not for emphasis, need to be broken and your kinetic vision so shit you'd barely be able to function irl
>60 Hz is the same as 144 Hz
Bait, but whether it's a meaningful improvement depends on the game. FPS or fightan? Sure. Third person cover poker? Not really.
yeah, for low and medium texture settings...
the consoles have 16gb, and most games are cross platform, 8gb is 2x less than what we should be getting, simple as that
>the consoles have 16gb, and most games are cross platform, 8gb is 2x less than what we should be getting, simple as that
You fail to mention that the 16GB of RAM that consoles have is shared RAM, meaning both the CPU and GPU use it, while in PC we have DRAM for the CPU and VRAM for the GPU.
You're not entirely incorrect, though, because the shared RAM in consoles is faster than the DRAM in PCs, and the shared RAM also means consoles only need to load assets and instructions into the RAM only once, where as with the PC they need to load a lot of assets/instructions into the RAM twice, once into the DRAM and once into the VRAM. Because of that, the PC needs more combined RAM (DRAM + VRAM) than the consoles to perform the same, but you wouldn't need 16GB of VRAM to get the same performance as consoles, as certain instructions are only needed by the CPU and other instructions are only needed by the GPU.
The "actual" vram of the consoles is 10gb. Hence why 8gb owners are completely fucked.
That's not completely accurate either. The shared 16GB of RAM on consoles can be divided between the CPU and GPU however the game engine sees fit. In practice the GPU often needs more RAM than the CPU, though, so it wouldn't surprise me if indeed over half of that 16GB of RAM on consoles is used by the GPU.
the problem is that both the cpu and gpu access the same pool of memory
on pc you have a bottleneck by having to put things into ram and then into vram; that's why ram overclocking makes such a massive difference when it comes to 1% lows and stutters
>The unit ships with 16 GB of GDDR6 SDRAM, with 10 GB running at 560 GB/s primarily to be used with the graphics system and the other 6 GB at 336 GB/s to be used for the other computing functions. After accounting for the system software, about 13.5 GB of memory will be available for games and other applications, with the system software only drawing from the slower pool
Sauce? Because none of this info adds up to what I learned/found about the PS5. Every source I found claims it has 16GB of GDDR6 SDRAM running at 448 GB/s with a 256-bit memory bus width, and then it has another DDR4 RAM chip of only 512 MB for background tasks. the 16GB of GDDR6 SDRAM is used by games, and how it divides up that RAM between GPU and CPU tasks is completely up to the game engine.
That's the wikipedia page of the xbox series x, so maybe the ps5 is indeed different
>so maybe the ps5 is indeed different
It is, only XSX has split memory bandwidth and some suspect it's the reason why it's not performing as well as PS5 despite being faster on paper. The bandwidth is honestly kinda shit even without the split.
512GB/s can't fully feed a 60 CU RX 6800 and it needs L4 cache yet Xbox Series X has 560/336GB/s without L4 feeding 52 CUs. And consoles are targeting resolution over framerate so the bandwidth matters more. 4070 can sorta get away with 500GB/s because nvidia has MUCH better memory compression than AMD and you're also not expected to run games at 4k with a 4070.
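Back-of-the-envelope on that claim (ignoring caches and compression entirely, so treat it as a rough ratio, not a real measurement):

```python
# Rough bandwidth-per-CU comparison; ignores Infinity Cache and delta compression.
def gbs_per_cu(bandwidth_gbs, cus):
    return bandwidth_gbs / cus

print(f"RX 6800 (512 GB/s, 60 CU):            {gbs_per_cu(512, 60):.1f} GB/s per CU, plus a big cache")
print(f"Series X fast pool (560 GB/s, 52 CU): {gbs_per_cu(560, 52):.1f} GB/s per CU, no big cache")
```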
https://www.eurogamer.net/digitalfoundry-2020-inside-xbox-series-x-full-specs
He's talking about the Series X.
Afaik the PS5 has 2GB of the 16 reserved for the OS.
Series X is split into a faster pool of 10GB and a slower pool of 6GB. Out of those 6GB I think 4 are presently available for games (I remember hearing a while back that MS had freed up several hundred megabytes for use in games on Series S and X).
Series S is split into 8GB of faster memory and 2GB of slower memory. The 2GB are used by the OS and the 8 of faster memory are used for games, so games don't use anything from the slower memory pool.
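Pulling the numbers from the posts above into one place (all approximate and as stated in this thread, not official figures):

```python
# Memory layouts as described in the thread; treat every number as approximate.
consoles = {
    "PS5":      {"total_gb": 16, "os_gb": 2.0},   # unified GDDR6 @ ~448 GB/s
    "Series X": {"total_gb": 16, "os_gb": 2.5},   # 10 GB @ 560 GB/s + 6 GB @ 336 GB/s
    "Series S": {"total_gb": 10, "os_gb": 2.0},   # 8 GB fast + 2 GB slow (OS)
}

for name, c in consoles.items():
    print(f"{name}: ~{c['total_gb'] - c['os_gb']:.1f} GB left for games (CPU + GPU combined)")
```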
>You fail to mention that the 16GB of RAM that consoles have is shared RAM, meaning both the CPU and GPU use it, while in PC we have DRAM for the CPU and VRAM for the GPU.
doesn't matter, as you said the gpu-to-ram bandwidth is too low on the pc to use anything out of system ram for the current frame, so it has to store the same info twice, once in vram, once in ram
and sure the console's system eats up 3gb or whatever, and you need to preload shit you don't need right now into another 3gb, it's still not 8gb. modern gpus need 12gb to match console ports, and since most of them run with upscaling and other gimmicks, native rendering on pc needs 16gb due to more frame buffer usage
16gb is the minimum middle range gpus should have right now
You more or less repeated what I said and yet you still fail to see the point.
I can guarantee you that unless you're dealing with an absolutely garbage port, you do not need 16GB of VRAM in order to run a game at exactly the same graphical settings as what the consoles provide.
Hell, even with shitty ports, you still don't need 16GB of VRAM if all you want to do is MATCH the looks and performance of a console. Look at Jedi Survivor, for example. Shit port, needs 16GB of VRAM to run it on max graphics 4k 60FPS, but that's not how the consoles run that game. The consoles run that game at ~720p 60FPS with upscaling to 4k. I can guarantee you, if you run Jedi Survivor on PC at 720p 60FPS, you don't need 16GB of VRAM.
>you do not need 16GB of VRAM in order to run a game at exactly the same graphical settings as what the consoles provide.
you mean silky smooth cinematic 20fps and upscaled 720p with DRS and checkerboard?
most pcs run at a higher res, which is a lot more vram usage, then you have just basic architecture overhead you want to have. truth is that no matter the engine or dev, next gen titles ported to pc run like shit on 8gb
be it the new RE4 remake on Capcom's in-house engine, unreal 5 titles like that dead space clone, flopspoken on square's engine, or the last of us remake, they all struggle on 8gb and you need to lower the settings (there are also cpu problems, but that's another shitshow)
Okay but that's all besides the point. This anon here
said 16GB of VRAM is the minimum we should get because that's what consoles have, which makes no sense and is based on absolutely nothing.
>based on absolutely nothing.
its based on a 3 year old console based on 2 generation old gpu architecture
if they have 16gb, why would the "next next gen" gpus only have 8gb?
I'll refer you back to my original reply as opposed to typing it all over again:
and yet you are wrong, as proven by every single big AAA cross platform release, not only is 8gb not enough, 12gb struggles at 4k
I am not wrong, and I already debunked this too, here:
16gb minimum is nonsensical and not based on any real number, it's completely arbitrary based on the fact that consoles have that much shared memory
you can very easily have a game on PC that with a lot of RT effects will run out of vram on a 16gb card
you can also have a console port that runs perfectly fine on a 10gb card with memory to spare
have people forgotten that gen 8 had 8gb ps4 and 5gb/8gb xbox one s/x? yet games had no problem running on 6gb and even 4gb cards for the longest time
it all depends on a game and the port, one cannot make a blanket statement like your gpu needs 16gb because that's what consoles have
memory bandwidth and latency, both ram and vram, will be the bottleneck far more often on PC than the vram pool
>8 gb is still ok for 1080p
No it's not. Stop lying you stupid poorfag.
>under 300
Finally, an actual replacement candidate for my RX570.
You need more if you are doing some extreme cgi stuff in professional capacity, or you are an AI enthusiast, but for games, going above that is pretty pointless.
>going above that is pretty pointless.
>He says while games are wanting 12+GB because of the PS5 and being unoptimized as shite.
I mean I love the price, but y'all are crazy if you think 8GB nowadays is enough. My 1060 is 6GB and even THAT is starting to feel long in the tooth.
What games need 12gb+ vram?
Poorly optimized games. Which is the entire point of upgrading your 8gb gpus
the last of us. and every other new game coming from derpsoles. they have +24gb shared graphics mem, meaning pc cards will all suck at the new games.
>they have +24gb shared graphics mem
Anon, consoles only have 16 GB total, and only ~14 of that is available for devs, meaning maybe 10-12 GB available as VRAM depending on the game.
>he doesn't know
also meaning, pc gamers will shit on new games because not enough ram meaning a HUGE downfall in sales on pc. this is gon be gud bois. it will be pc-geddon for derpsole games.
now is our chance to wreck this shit! no more shitty derpsole ports possibru! yesssss
hogwarts legacy performs badly with 8 GB.
>"professional" cgi or ai stuff
>using amd for that at all
lol
needs at least 16gb. this is a joke.
Someone in the comments on the article mentioned how other gpu manufacturers could increase the vram size on their model. Are there precedents that would give this claim credibility?
>need 16GB for movie games
Look if you want to play movie games just buy a console. Seriously.
>In 2023
>With BOTH companies conspiring together
>With resellers and flippers around every corner
You honestly think the internet wouldn't be cleared of a 16GB model in a week?
bro it's the weakest card in the lineup
I'm not retarded so 8gb is fine, your 4K gayshit might struggle though
coping poorfag
Camel, eye of the needle, etc.
>haha take that Nvidiots!!
>*driver crashes*
Works in my machine
I know this pain due to having the misfortune of owning a 5700xt. it's better to pay more than have your pc use feel like crap.
Sucks to be using windows.
>fall for the meme AMD card
>drivers fucking suck on windows
>fall for the linux meme, that makes running regular games a battle
>"old games are better anyway" cope meme
>put actual effort to run modern titles, and when they run, they run well because dxvk is a god
>but they actually fucking suck if compared to the old shit you was playing previously, with the difficulty of getting it to run being directly proportional to how the game sucks ass and disappoints you
>basically only play games that could easily run on the igpu, now being sure the AMD card was a waste of money
I can't parse this post through all the ESLness.
I can.
>linux meme, that makes running regular games a battle
The only difference between linux and windows compatibility these days is the amount of driver hacks for old apis on windows. Virtually all dx11+ games work out of the box now.
AMD is fine if you're a normie, but if you try to go deeper into gaming like emulation, or AI you will find that everything works but with some small annoyances or glitches, these annoyances start to build up slowly on your patience and system.
For me it's just not worth it, I used to own AMD cards myself but not anymore.
AMD Emulates fine friend.
Let's ignore all of those errors with Tears of the Kingdom that only happen on AMD.
Works on my machine.
>For European gamers this means €299
To me it looks like they were both planned to be 299 $ and € but finally added a $30 discount on US prices for some unspecified reason
>unspecified reason
the 7600 seems to perform only slightly better than the 6650xt, which already costs less than 300, and with the 4060 releasing nobody would buy amd if they cost the same
my 3060 Ti is better than that shit, who cares
Will one of these companies just release another sub-$300 GPU with more than 8 GB of RAM?
To this day the only GPU that fills that criteria is the RX 6700, and it only just recently hit sub-$300.
Never, it's their goal to actively avoid releasing a quality budget product. Customers want a repeat of the 1080ti, but manufacturers don't.
The 1080 Ti was such a great product, Jesus Christ. Served me well all the way up to recently, when I switched a 4k monitor and RTX 4080, and only because I had some extra money because I didn't go on any vacation trips in 2020 and 2021 because of the stupid scamdemic.
I want the old Nvidia back, the Nvidia who made the GTX 1000 series.
they need a generation of humbling to lower the prices to where they belong
4000 series cards aren't selling out instantly in an era of increased pc gaming and AI generated shit so hopefully they tone it down next generation but fuck nvidia
even the GTX 1080 was great. just swapped mine out for a 7900 xtx, because fuck nvidia. hell, even the GTX 1080 could play most of the games i played @ 4k/medium. 1080p it would still be completely adequate. (Hell, it's sitting in my eGPU enclosure for use on the laptop come vacation time later this summer)
>drivers
i've had absolutely no issues with game crashing or other software issues in the 2 months i've had it. (with one notable exception, a lack of airflow was routinely making the card hit the 110 junction max; and RDR2 barfed a few times as a result of the throttling. After getting a $20 case fan, it's been flawless.)
>a lack of airflow was routinely making the card hit the 110 junction max
What 7900 XTX do you have? Reference? Reference 7900 XTX had an issue with faulty vapour chambers causing the Tj to spike into oblivion.
At what wattage did you hit 110 Tj? Even with complete dogshit airflow I can't see the card getting that hot below 350W unless you have an ultra silent fan curve.
my sapphire 6600 is keeping me comfy bros
How does it compare to their other cards? I was planning on upgrading from a 6600xt
This is not a suitable upgrade from a 6600 XT.
We won't see that in the sub-$400 price-class until like 2026.
Gotcha. Thanks anon
Anon from the future it seems
considering its barely better than a rx6600 that cost 240 bucks, it makes sense. in fact id rather spend 60 bucks less, the power difference seems so minimal.
I don't know about PCs anymore. I built some little shitty lowrange computer a few years ago and don't know where to go now. May just restart because I'm spooked
B450 + R5 1600 + 1060 6GB is holding up well, but now SF6 is showing me cracks.
>SF6
So a PS4 game did you in, huh?
you can upgrade the cpu and gpu and have a pretty strong gaymen pc
the 1060 may hold, you'll need new CPU and that means new mobo and new ram unfortunately. If you don't get something used, it'll easily cost you $300
>replace the whole pc
no shit
>replace the whole pc
no, just the cpu and gpu. You can keep the cpu cooler, mobo, ram, psu and case
5800X3D for sure. GPU it depends, the 7600 might be good.
>vidya graphics haven't improved in 5 years
>barely improved over the 5 years before that
>but OMG NEW GRAPHICS CARDZZZZ!!!!1!
are you niglets just burning out your cards with overclock dick flexing, or what?
Bruh I need a 4090 to max out the latest shitty console port I wasn't going to play anyways.
Resolution and Hz size on the other hand...
But GPUs have generally been a terrible deal for nearly a decade now and it's only getting worse.
if you want a 1080p card just buy the arc 750 for $199
Thank you greatest ally Intelaviv.
Witches. Witches in video games.
the ai is too fucking strong
whats the Ram size for 7700 and 7800?
12, 16
>7800 16GB
yeah so unless it performs better than the 6800 and costs less I think I may just commit to the 7900XT
>7800 16GB
I just got goosebumps thinking about my old 7800 GT (256 MB VRAM) from 2005 rising from the ashes as a 16 GB beast. it's a shame nvidia is doing the x0y0 naming scheme now instead of xy00, or we could have a "RTX 7800 GT" for nostalgia's sake soon. I wonder if that will ever be a thing, nostalgia-themed GPUs, like those sports cars that look like a classic but have a sick modern engine under the hood.
could be the first gpu with a non-shit fps/$ ratio since before quarantine.
either way, i got a 6600 in late 2021 and i won't be upgrading until next gpu generation.
oh boy more ESL "gamers" on the horizon, can't wait
it's slower than 6600 xt
It's the same as the 6650XT but $100 less.
AMD keeps getting better and better every generation, the only thing left is to add proper CUDA and NVENC alternatives.
>amd so it's shit support from the get go and it's not even their fault
>8GB for shit slop games that ask for more and more
at least the price is nice but then again you remember point uno which makes any amd gpu instant loss, fucking nvidia.
people say vram isn't needed at 1080p, but so many games have shit textures everywhere because of optimization requirements...
I am planning on getting a 6800XT next month bros, anyone had experience with it?
This will be my first AMD card since I am tired of the greedy shitters Nvidia continues to be. I always hear memes about no drivers/shit drivers for AMD, is that true for 6800XT as well?
It seems like amd's previous gen cards had some driver issues, but the 6000 gen as a whole is pretty solid. I got the asus 6800xt a few months ago (upgraded from an rtx 2080) and it's pretty good, I've zero issues whatsoever
>asus 6800xt
I've heard Sapphire is the way to go when getting AMD, allegedly they only do AMD cards and are noted as the go-to for cards. Wonder if that is just marketing though.
Sapphire is just turboshit when it comes to fans, and their dedicated software manages to be even more shit. They're basically masters at cutting corners.
In my experience, Sapphire, Gigabyte, and MSI all make decent AMD GPUs.
XFX can be hit or miss (though personally all the XFX GPUs I've had were pretty solid), and Powercolor I've only ever heard bad things about, but I'm not sure.
Red Devil is hit or miss because I've been hearing the 7900XTX models are starting to heat up more. Hellhound is their best product but is limited to RGB colors; performance-wise it's great, I never hit temps over 45 C. I heard XFX non-reference is the best value, not sure about their other products.
I see, will check my options then. Thanks anon.
So what manufacturer do you recommend anon? I've googled far and wide and everyone praises Sapphire, not seeing many turboshitter claims from people who talk about it outside 4chins.
>only problem is windows update might randomly decide to replace your manually installed AMD drivers with old ones
Any way to circumvent this? I don't really plan on streaming anything, I just wanna buy a good card to future proof myself for awhile. Going from 1060 to this will be a big leap.
>Any way to circumvent this?
O&O ShutUp10, or "Ten", bing it for the irony.
free, allows you to turn off any windows "feature" you want to. the only hassle is that if you allow updates to install, some settings might be changed, so after a windows update you have to check your settings. it even notices such changes though and lets you quickly revert or accept them in bulk.
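Another commonly cited option, if you'd rather not run a third-party tool, is the "do not include drivers with Windows Updates" policy. A minimal sketch, assuming the standard policy registry key and an elevated prompt; double-check it against current Microsoft docs before relying on it:

```python
# Sketch: set the policy that tells Windows Update to skip driver packages.
# Assumes the usual policy key; run from an elevated (Administrator) Python.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0, winreg.REG_DWORD, 1)

print("Policy set: Windows Update should stop pushing driver packages.")
```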
>So what manufacturer do you recommend anon?
Asus, msi, gigabyte
I've had a 6800 non-XT for 3 or 4 months now. No problems here, it just works.
amd has and always will have issues with drivers. it's something you have to accept if you decide to go with amd
at this point nvidia has slightly worse drivers than amd, at least for last gen cards. only problem is windows update might randomly decide to replace your manually installed AMD drivers with old ones. this has happened to me once in two years with my RX 6600.
if you're streaming, just use software x.264 instead of the AMD encoder.
otherwise it's been all peachy.
>at this point nvidia has slightly worse drivers than amd, at least for last gen cards
AMDfag delusion knows no bounds, no gravity, nothing to keep it down to Earth.
AMD drivers are fine nowadays, especially on Linux if you care about that shit.
Using a 6600xt since November. Works fine. The only thing I've noticed is: if you do UV and see a black screen, it means you need to add some more millivolts. It won't crash the driver outright like nVidia would, so getting a stable UV requires actual usage for a while to see if it's perfectly stable.
>AMD Won
>only difference is $30
lmao are americans this poor
Reminder that Vega 56 was $200 in 2018, had 8gb of HBM2 RAM, and probably performs about the same as the 7600
Lol
Lmao
...if you can get the game to start and not crash or produce artifacts of some kind. pretty sure there are games that literally don't support the Vega series anymore despite supporting GPUs older than Vega. Vega was a weird phase for AMD.
Vega is still supported and probably will till the end of time since a lot of workstation GPUs use the architecture
>Pooga
is 1080ti to this a good upgrade?
probably not, most 1080ti's arent new or unused, its newer hardware and software so its probably gonna perform better. Mind you this is for low end budget gaming and essentials, anyone with a brain knows 8GB isnt enough for modern games today
Is it better than my 1660Ti.
>nooo 8gb is bad you NEED 128GB OR IT'S BAD
I run 1440p 144fps on a 3060ti without any issues whatsoever
have a nice day shills
256 bit bus width on the 3060ti compared to all the new cards being at just 128. It's actually better performance for less money compared to the 4060 series.
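The bus width only tells half the story; bandwidth is bus width times the memory's data rate. The 14 and 18 Gbps figures below are the commonly quoted GDDR6 speeds for these cards, so treat them as assumptions:

```python
# bandwidth (GB/s) = bus width in bytes * effective data rate in Gbps
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(f"3060 Ti (256-bit @ 14 Gbps): {bandwidth_gbs(256, 14):.0f} GB/s")
print(f"4060 Ti (128-bit @ 18 Gbps): {bandwidth_gbs(128, 18):.0f} GB/s (leans on a big L2 to compensate)")
```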
a lot of it is people expecting to play unfinished and broken goyslop day 1 with every bell and whistle turned on in the graphics menu
but i don't think you *have* to do that, needless to say
nowadays pc gaymers seem to call everything "obsolete"
Is the 7900 xt worth buying?
>6 years old card
>$399 on release
>2048-bit bus width
I'm still running a 980ti
Same. Except I don't have a job and that if it breaks I am going to kill myself
I'm running a 970 3.5 GB
I've got a 2070 Super at 1080p, what's the point of moving to a higher resolution?
If you enjoy higher pixel density / image quality and size. Modern games on highest settings may not always hit 144hz though.
>trusting a card that cheap inside your system
also curry tech
I know you are brown.
if its at least near the 6700XT in performance they might have a winner
unless you are still on a 1060 or 580 or older dont bother upgrading
>its ok to buy and support absolute horseshit if its an upgrade
thank you for sharing your eternal wisdom, almighty GPU oracle
no problem
>8gb
ill just wait for the 8600
Can anyone tell me how many FPS it takes on a 144 or 240 Hz monitor to actually see the difference?
Roughly 1:1. You won't notice a difference if your game runs at 90fps on a 60hz monitor; the only advantages of the extra framerate in that case are stability and slightly lower input lag. If you want to make full use of a 144hz monitor, your system needs to deliver a consistent 144fps or more.
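Put in numbers, the frame-time budget is just 1000 ms divided by the refresh rate, and you have to hit it consistently, not just on average. Trivial sketch:

```python
# Frame-time budget per refresh rate: every frame has to finish inside this
# window (consistently, not just on average) for the monitor to get a fresh
# image each refresh cycle.
for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms
```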
At 120 Hz on the PC and 120 Hz on the console, are there any differences?
Yes. The PC image will look crisper but the console will have better frametimes unless your PC is very overtuned.
In fluidity? No. But again, PCs and consoles are built from different hardware, which affects the stability and average performance of the system. Depending on how much money you throw at a PC, you won't have frame drops, compared to consoles which have to make the most of the tools they have and intelligently allocate resources. To answer your question: on paper they are the same, but in practice on a console you will have trouble holding 120hz all the time.
exactly the same assuming they both maintain 120fps and have consistent frame time
on console, on one hand you don't have shit running in the background, but on the other hand the cpu is shit so weird little things can cause big spikes
on pc the cpu is much beefier so spikes are lower, BUT they can be introduced more often because of background apps and the OS shitting itself, be it windows or loonix
also games that don't generate shader cache on pc on launch WILL stutter while almost every console game in existence ships with shader cache
so basically choose your poison but in 99% of cases it'll be identical and I am kinda splitting hairs here
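Since half this chain is about "frametimes", here's what that actually cashes out to: averages hide the hitches, so you look at the slowest 1% of frames. Rough sketch, assuming you've exported per-frame times in milliseconds from a capture tool; exact "1% low" definitions vary between tools, this one just averages the worst 1%.

```python
# Sketch: average FPS vs. 1% low from a list of per-frame times (milliseconds),
# e.g. exported from a frame-time capture tool. The 1% low here is the FPS
# equivalent of the slowest 1% of frames, which tracks visible stutter far
# better than the average does.
def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    one_pct_low_fps = 1000 / (sum(worst) / len(worst))
    return avg_fps, one_pct_low_fps

# Mostly 8.3 ms frames (~120 fps) with a few 30 ms shader-compile style hitches:
times = [8.3] * 990 + [30.0] * 10
avg, low = fps_stats(times)
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")  # average looks fine, 1% low exposes the stutter
```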
VRAM is matched to the power of the GPU. What's the point of more VRAM if the GPU can't run a game that needs 10 GB?
The VRAM is perfectly matched to the capability of this GPU.
The only reason it has 8GB is because it's not very powerful, no more than older cards. The point of this useless card is for nvidia to make some free money and test the waters with the
>it's a shit gpu but look at the framerate with DLSS on, goy? it's not that powerful, but if you just make an approximation of the pixels then it runs alright
thing and see how the consumers bite.
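For context on what "an approximation of the pixels" means in practice: DLSS renders internally at a fraction of your output resolution and upscales. Sketch below using the commonly cited per-axis scale factors; treat them as approximate, they're not official constants.

```python
# Rough sketch: internal render resolution for the usual DLSS quality modes.
# The scale factors are the commonly cited per-axis ratios, not official values.
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_res(width: int, height: int) -> None:
    for mode, s in MODES.items():
        print(f"{mode:>17}: {round(width * s)}x{round(height * s)}")

internal_res(2560, 1440)
# e.g. Quality at 1440p renders internally at roughly 1708x960, then upscales to 2560x1440.
```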
>The point of this useless card is for nvidia to make some free money
>nvidia
>he doesn’t know
So in 2023, a $300 GPU is still way weaker than the GPU of a fucking 2019 console???
>it's a shit gpu but look at the framerate with DLSS on, goy? it's not that powerful, but if you just make an approximation of the pixels then it runs alright
Retard, current gen games are unplayable on 8GB gpus. DLSS or not.
I wonder if it's good for rendering
Yeah but what is the performance?
>8gb
what a dumb question
Ah yes, a 1080p medium-high card in 2023.
Even at 1080p you will have serious trouble with an 8GB GPU.
Again, video games are designed for consoles, which means they target around 10GB of VRAM.
That GPU will have no trouble running any non-DS game at 1080p on max/120. Not very future-proofed, though.
>video games
Movie games.
Stop using Ultra settings you idiot
You stupid fuck, you think devs give a fuck about your poorfag gpu???
8gb is unplayable for current gen games.
Wrong. I watched benchmarks on YouTube and it still looks fine to me.
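If you'd rather check than argue, you can just watch how much VRAM a game actually allocates while it runs. Sketch below polls nvidia-smi, so it's nvidia-only and assumes the tool is on PATH (AMD exposes the same counter in the Adrenalin overlay); also keep in mind games often allocate more than they strictly need, so treat the number as an upper bound.

```python
# Minimal sketch: poll actual VRAM allocation while a game is running.
# nvidia-only (shells out to nvidia-smi); assumes it's on PATH. Ctrl+C to stop.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used",
        "--format=csv,noheader,nounits",
    ], text=True)
    return int(out.strip().splitlines()[0])

while True:
    print(f"VRAM in use: {vram_used_mib()} MiB")
    time.sleep(5)
```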
nvidia will just shift the stack up a tier so lower cards will get the better sku like 5070 getting the full xx104, they will add more vram and keep prices exactly the same
and people will love it and say it's value and buy the shit out of it like they did with Ampere
nvidia has been doing the "flip flop" for over half a decade now
>AMD
Sorry I don't buy pajeet shit
AMD is an american company run by a taiwanese woman.
Nvidia is an american company run by a taiwanese man.
Intel is an american-israeli company run by garden gnomes
Intel still has the best architecture. AMD is clever with its stacked L3 cache, but Zen 4 is still inferior to Raptor Cove, and honestly even Golden Cove, since Raptor Cove is just GC with extra L2 and no ring latency penalty when going between E-cores and P-cores. And don't forget that Intel is a full node behind; in Intel's defense, TSMC is sponsored by the Taiwanese government and gets big bucks from Apple, while Intel has none of those benefits. To their credit, they used to be two nodes behind until a while ago, so at least there's some progress lol
Intel's architecture is old and outdated.
The only thing they've mastered is making it survive intense heat and power draw.
AMD chips can do more with less, but you can't put them through hell without killing them.
Intel only managed to keep up with AMD in recent years by letting their chips brute-force the benchmarks with sheer power.
AMD easily wins any value and efficiency battle, but Intel knows that enough people will just buy the winning chip. And if that's Intel with 1-2% more performance at 30% higher energy cost and twice the heat, they'll still buy it.
Intel must be scared shitless. CPUs have reached a point where, outside of very, very specialized use cases, there's no real reason to upgrade even an 8+ year old CPU.
video cards are quickly approaching that point as well.
i'm guessing that the future will lean way more towards SaaS offerings to maintain the paypiggie revenue streams.
XFX merc 310; yeah i read about that heat issue, in my case it seemed like a legitimate airflow issue, after installing the fan it hasn't gone above 95c and all is well in the world.
>Intel must be scared shitless.
They still make like 3x the amount of revenue AMD does
yeah, for now.
they're losing the server market to AMD, consoles are all AMD, consumer purchases are slowing down, and they can't make a decent GPU to save their lives.
dunno, long term you don't see them being knocked down a peg or two?
>yeah, for now.
>they're losing the server market to AMD
2 more years 🙂
>consoles are all AMD
>they can't make a decent GPU to save their lives.
Neither of these have ever been a market for Intel in the past.
>dunno, long term you don't see them being knocked down a peg or two?
Because they aren't lmao. Only a delusional retard thinks AMD is going to make Intel go bankrupt or knock it out of the game. Intel has always made all of its money via OEM and still does. AMD will never be anywhere close in comparison.
>and they can't make a decent GPU to save their lives.
I dunno wtf you're talking about but the Intel Arc cards are pretty good value cards right now. Probably the best bang-for-buck if you're on a tight budget.
I'm cautiously optimistic about Intel Battlemage.
i'd like to see intel do well with their GPU efforts too. Having some more competition and competition on price is nothing but an upside for consumers.
i could be wrong, but aren't they on par with like a 2060 performance wise?
>Intel must be scared shitless
Intel is FUCKED. Krzanich fucked up big time and set Intel's foundries back 4 years. Swan did basically fucking nothing, and Gelsinger is beyond incompetent as a CEO (he's a brilliant engineer) because he keeps focusing on things that don't matter. Their entire GPU division is a massive money sink that will never pay dividends. AMD is destroying them in the server market because of MCM, despite Intel having the better uarch. Their shares took a massive nosedive in the past few years.
Unless they change strategy they will never match AMD in the server market, and the desktop market is pennies in comparison. I have no idea why they entered the GPU business; it should be beyond obvious that no one will ever catch up to nvidia. Not even Apple managed to, with their unlimited money hack, the best talent in the industry and priority access to TSMC's latest node.
But doesn't the 4060 have 16GB?
No. 4060($300) is 8GB. 4060 Ti($400) is 8GB. There is a 4060 Ti with 16GB but that's $500.
Okay but is it better than my old ass 1070 that still is running games perfectly fine at 144 fps and 1080p?
not really no
That is 100 credits too expensive still
This is the RX 7600
Not an RX 7600 XT.
Wouldn't surprise me if they give us an XT model a little later that's like 30-50 bucks more expensive with 12GB of VRAM.
Do you people even know what you're talking about?
Unless they change the memory bus and thus make it a fundamentally different GPU they can't give us 12GB on a hypothetical XT model. They could double the 8GB and give us 16GB on a hypothetical RX 7600 XT, but not 12GB.
They can just bin one of the ram dies. 8+4=12 bud
>lets just have two ram segments, what could possibly go wrong
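The bus-width point is just arithmetic: GDDR6 hangs off 32-bit channels with 1 GB or 2 GB chips, so a 128-bit card gets four chips, which means 8 GB, or 16 GB if you double up chips clamshell-style. Getting 12 GB needs a 192-bit bus or a segmented config like the one being joked about above (the 970's 3.5+0.5 GB split being the infamous example of how that ends). Quick sketch:

```python
# Back-of-the-envelope: which VRAM capacities a given bus width supports with
# standard GDDR6 (32-bit channels, 1 GB or 2 GB chips), ignoring exotic
# mixed-density or segmented configurations.
def symmetric_vram_options(bus_bits: int) -> list[int]:
    channels = bus_bits // 32                 # one memory chip per 32-bit channel
    options = {channels * gb for gb in (1, 2)}
    options.add(channels * 2 * 2)             # clamshell: two 2 GB chips share a channel
    return sorted(options)

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> {symmetric_vram_options(bus)} GB")
# 128-bit -> [4, 8, 16], 192-bit -> [6, 12, 24], 256-bit -> [8, 16, 32]
```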
The only games that need more than 8GB of VRAM minimum are shitty games that aren't worth playing anyways. Graphics gayry is inversely proportional to how good/fun a game actually is.
This
>but what about muh Hogwartz Legacy, Cyberplunder and TLOU Part 2
If you care about those trashfires you belong on /r/gaming. Not Ganker.
All the games that shat the bed at 8 GB VRAM or lower were western AAA trash of the worst caliber.
The only games that need more than 4GB of VRAM minimum are shitty games that aren't worth playing anyways. Graphics gayry is inversely proportional to how good/fun a game actually is.
>Muh shared memory
All the games shitting the bed were on PS4 as well. That argument doesn't even matter. The games were just incredibly shitty ports and the reason why we keep getting those isn't the VRAM no longer being enough - It's publishers not giving their devs enough time to optimize anything - they just throw everything onto the market to hit the numbers for their next financial report.
While some ports are indeed inexcusably bad (looking at you, Jedi Survivor), I do think devs get more flack than they deserve. The fact of the matter is that optimizing for PC is much harder than for consoles, especially when Ngreedia and AMD keep being stingy with their VRAM and gamers refuse to upgrade their hardware because the newest hardware is too expensive.
Another party to blame here is the Unreal Engine 4. We shouldn't forget about that clusterfuck of an engine. The one thing that engine had going for it is that it made game development easier than ever, yet made optimizing games more difficult than ever. The Unreal Engine 4 is really starting to show its age, and simply cannot handle what devs are trying to do now in 2023, neither on PC nor on console.
Seriously, we often bitch about PC ports being garbage, but most of these games do not run any better on consoles.
Now of course the simple solution would be to just make games less graphically demanding, but then you'd have idiot gay zoomers complain that the game looks like crap, despite their shitty PC not being able to run the game on 'max' graphics anyway.
Modern games look like crap anyway, because of bad art direction and smearing TAA everywhere to cover up dithering hacks.
>AMD broke CPU stagnation and forced core counts and IPC to go up while maintaining affordable pricing, everyone won
>AMD does fuck all in the GPU space and it's still stagnant
>$269
>8GB GDDR6
>128-bit bus
>x8 lanes only still
AMD looked at the 4060 and found a way to make a product just as bad.
>6700 is $269 on amazon right now and performs better and has 10gb vram.
shut the fuck up you're supposed to remember RDNA2
Looks like if you wanted an RX580 that can do DX12, this is it. I'm OK with it.
why is AMD's numbering so fucking retarded? holy shit, i buy nvidia just because AMD's branding fucking sucks ass