https://www.eurogamer.net/digitalfoundry-2023-diablo-4-is-a-great-pc-port-except-you-need-a-16gb-graphics-card-to-match-ps5-quality
PC sisters...
fake news /sadge
...the source is right there in the OP.
>the source is this fake news site
>eurogamer and digital foundry
>fake news
Meds. Now.
>eurogamer and digital foundry
>fake news
yeah
I'm not clicking that shit because it's:
>False
>Clickbait
That being said, I anticipate the issue boils down to tech illiteracy. Are both games running at the exact same internal resolution, with the exact same shadows, post-processing, AA, and all other settings otherwise at parity? I would bet not, because the same kind of people who are under the impression that the PS5 plays games at 4K (like my coworkers) also fail to understand that internal resolution and upscaling are very different things.
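To put rough numbers on the internal-resolution point: here's a toy sketch of how few pixels an upscaled "4K" image can actually shade. Both resolutions below are made up for illustration, not measured from either version of the game.

```python
# Internal vs output resolution: illustrative numbers only, not Diablo 4's
# actual render resolutions on any platform.
output = (3840, 2160)    # the "4K" the TV reports
internal = (2560, 1440)  # hypothetical internal render resolution before upscaling

ratio = (internal[0] * internal[1]) / (output[0] * output[1])
print(f"actually shading {ratio:.0%} of the output pixels")  # prints "44%"
```

So a console "playing at 4K" can be shading well under half the pixels its output resolution implies, which is exactly the settings-parity question the comparison hinges on.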
Oh, I also forgot to include: OP is a gay.
I have literally zero issues with the game on my 8GB card on max settings and 4K textures outside of stuttering when other players show up and the netcode shits itself. This article is bullshit clickbait
Didn't mean to quote
you're running out of VRAM dude. plenty of people have tested it. it sucks ass.
I have a 3090 and it stutters.
yeah it stutters normally but the vram stutters are like that only worse and more frequent.
What resolution though?
Journos going hard for bait
>Um actually a 1660s is more powerful than a 3070
nani
Every fricking "news headline" here is clickbait shit that OP wants you to react to for attention. Just sage all around and filter.
clickbait tbqh
you definitely need 16GB of VRAM to run ultra textures without horrible stutters. Is the PS5 running ultra? I wouldn't have thought so.
>Is the PS5 running ultra? I wouldn't have thought so.
I believe it is. Because Diablo 4 is overall not a very graphically intensive game, it leaves a lot of room in the unified memory for the textures.
What's up with all the schizoposting calling this "fake news"?
Have people here genuinely never heard of eurogamer and its youtube arm digital foundry?
PCisraelites can't stand the fact that their RGBLT rigs can't compete with consoles anymore
>RGBLT
>Red Green Bacon Lettuce Tomato
That Canadian guy has a sandwich?
More like consoles have reached a level where someone's half-assed build from 5 years ago can't match the console version of a game actually optimized for the hardware.
Plus, the PS5 has 16 GB of unified RAM, meaning it can use otherwise-unused RAM as VRAM, and 16 GB of VRAM is a lot more than most modern GPUs have.
so when are zoomers gonna figure out that you can lower settings
Isn't the issue something regarding the shared 16GB of memory on consoles, so lazy ports just demand 16GB graphics cards? I remember some people saying that was the case with Gollum.
This is because the Nipponese are masters of folding over VRAM, they fold it 1000 times over compared to the paltry number PC has.
You laugh, but MT Framework was Carmack-tier witchcraft during its time.
>consoles have more VRAM/RAM
>which means devs target that hardware
WTF AEIIIII AAAAAAAHHHHH NOOOOO
There isn't a console on the market that has 16GB of VRAM. The PS5 has 16GB of shared RAM, of which some portion goes to system resources, then regular RAM usage, and only then VRAM. It's very unlikely for a PS5 game to utilize more than 12GB as VRAM.
It's also very obvious when you look at a recent example like the RE4 remake. On a 10GB card, the highest setting that runs stably has better textures than the PS5 version of the game. If the PS5 version allowed the game to use more than 10GB of VRAM, why does it have worse textures?
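Back-of-the-envelope version of the memory-split argument above. All figures here are assumptions for illustration: Sony doesn't publish an exact OS reserve or CPU/GPU split, and the per-game numbers vary.

```python
# Rough PS5 unified-memory budget sketch. The os_reserve and cpu_side figures
# are assumptions, not official specs.
total_unified = 16.0                # GB of shared GDDR6
os_reserve = 3.5                    # commonly cited system reserve (assumption)
game_budget = total_unified - os_reserve
cpu_side = 2.0                      # game logic, audio, streaming buffers (assumption)
gpu_side = game_budget - cpu_side   # what's left to play the role of "VRAM"

print(f"game budget: {game_budget:.1f} GB")          # prints "game budget: 12.5 GB"
print(f"usable as 'VRAM': ~{gpu_side:.1f} GB")       # prints "usable as 'VRAM': ~10.5 GB"
```

Under those assumed numbers, the "VRAM" ceiling lands around 10-12 GB, not 16, which is the point being made about the 16GB-card claim.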
How? I was playing the open beta on a 3070 Ti at 1440p high with 60 frames. I was only playing alone, but it ran well for me.
sounds like bad optimization, because the piss5 has 16GB TOTAL shared memory
your pc has 16GB of RAM + 8GB dedicated to the GPU
PS5 has 16GB of shared memory and utilizes its storage as a low-latency buffer instead of traditional data parking; the storage throughput is fast enough to instantaneously flush and repopulate the entire shared memory pool at any given moment, effectively lifting the limitation of memory entirely
the recent influx of titles eating all your VRAM isn't bad optimization, it's just a console port
>Diablo4's PC port
lol no wonder Diablo is so shit it's made for smartphones and then ported to PC nowadays
What if OP is the homosexual author of that article?