Thinking about a potential GPU upgrade for my gaming needs.

But do any of the new releases actually require you to have a 40-series GPU? I'd like to upgrade for Cyberpunk or upcoming games like the FF XVI PC port, and I've got my doubts the 1080 will hold up.

  1. 3 weeks ago
    Anonymous

    340€ for a 4060 seems reasonable and would be a HUGE upgrade from your horribly outdated 1080
    basically any half-decent card would mean a major increase in performance over your current gpu

    • 3 weeks ago
      Anonymous

      >340€ for a 4060 seems reasonable
      That's how much a 6700XT costs.
      >HUGE upgrade
      +40% or so isn't a HUGE upgrade.

    • 3 weeks ago
      Anonymous

      Only upgrade if you fell for the ray tracing and 4K meme; you can play any modern game on high settings at 60 fps, 2K resolution, with a GTX 1080. Just think about it: is any game really gonna be that much better at 144 fps, or at 4K vs 2K? The answer is no, it won't.

      WHOA, A WHOPPING 18% MORE FPS? found the retard.

  2. 3 weeks ago
    Anonymous

    you'll need a 40-series because they support the latest DLSS and frame gen, which all modern devs will use because they're shit at their job

    • 3 weeks ago
      Anonymous

      All RTX cards support the newest 3.5 version of DLSS.

      • 3 weeks ago
        Anonymous

        except FG is a 40-series-only feature.

  3. 3 weeks ago
    Anonymous

    You can also go with a 16GB A770. The drivers at this point are extremely workable, if not outright good, for pretty much any game, and you'll get more bang for your buck. If you're running an Intel processor, it also works more in tandem with it. Of course, you're probably still gonna have an issue every now and then, but you're not feeding money into Nvidia's retarded as fuck market ploy. Up to you though, that's not a bad price for the 4060 and you're in desperate need of an upgrade.

    • 3 weeks ago
      Anonymous

      >You can also go with a 16gb A770
      I wanna play video games, son.

      • 3 weeks ago
        Anonymous

        Not sure how anyone can seriously recommend the Intel GPUs.
        Sure, I am all in for more competition against garden gnomevidia and AyyMD, but don't tell people to buy a new GPU line without telling them what they are getting into.
        You wanna support Intel's little project for the benefit of all, and you know the downsides and risks? Sure. Great. Go for it.
        But telling people to just buy an Arc GPU is either stupid or evil.

  4. 3 weeks ago
    Anonymous

    Why the fuck would you buy an xx60 card? Just get a 3080 for likely the same money and better performance.

    • 3 weeks ago
      Anonymous

      what are you talking about?
      a 3080 is way more expensive and only marginally stronger
      you get a 4060 Ti for around 500€ while a 3080 Ti will cost you 2 to 3 times as much

      • 3 weeks ago
        Anonymous

        The 4060 Ti's 128-bit bus width seriously bottlenecks the card. It's a price trap Nvidiots are falling into.

        • 3 weeks ago
          Anonymous

          To put this in perspective, the 1080 Ti has a 352-bit bus. Obviously the 4060 Ti's cores are faster, but its raw memory bandwidth is actually lower. It's an upgrade, but not as big as you'd think considering 4 generations of """"advancement""""
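
          Back-of-the-envelope, peak VRAM bandwidth is just bus width times memory speed. Quick Python sketch, assuming stock memory clocks (18 Gbps GDDR6 on the 4060 Ti, 11 Gbps GDDR5X on the 1080 Ti), so treat the figures as ballpark:

          def bandwidth_gb_s(bus_width_bits, transfer_rate_gbps):
              # peak bytes/s = (bus width in bytes) * (transfers per second)
              return bus_width_bits / 8 * transfer_rate_gbps

          print(bandwidth_gb_s(128, 18))  # 4060 Ti: 288.0 GB/s
          print(bandwidth_gb_s(352, 11))  # 1080 Ti: 484.0 GB/s

          Ada's big L2 cache claws some of that back, to be fair, but the raw number really is lower.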

      • 3 weeks ago
        Anonymous

        I said 3080. Not 3080ti.

  5. 3 weeks ago
    Anonymous

    Just get a 4080. Buy the 4080, and in 6 years you can upgrade again.

  6. 3 weeks ago
    Anonymous

    you need to wait for the 5000 series, the 4000 series is already vastly underpowered

  7. 3 weeks ago
    Anonymous
    • 3 weeks ago
      Anonymous

      He's not entirely wrong, assuming both cards are sold at MSRP, so $300 for the 4060 and $270 for the 7600. But the RX 7600 can be found for as low as $230 these days: at that price, the 7600 is obviously the better product.

      Also, at $300 the 4060 is so close to the 6700XT, you might as well drop $30 more for that card instead. The 4060 is shit even compared to Nvidia's older products, you can get a 3060 12GB for around the same price.

      The 4060 is just garbage. The 4060Ti is even worse, and the 4060Ti 16GB is a complete joke. There are only TWO Nvidia cards worth buying this generation: 4070 (the 7800XT is better of course, but the 4070 is decent and worth considering if you hate AMD) and 4090.
      At any other price range, AMD is so far ahead only a complete moron or brainwashed fanboy would buy Nvidia.
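
      If you wanna sanity-check value claims like these, the math is just performance divided by price. Quick Python sketch - the relative perf numbers here are placeholders I made up for illustration, plug in whatever benchmark aggregate you actually trust:

      cards = {
          # (price_usd, relative_perf) - perf values are placeholders, NOT real benchmarks
          "RTX 4060":   (300, 100),
          "RX 7600":    (230, 95),
          "RX 6700 XT": (330, 115),
      }
      for name, (price, perf) in cards.items():
          print(f"{name}: {perf / price:.2f} perf per dollar")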

      • 3 weeks ago
        Anonymous

        He is wrong in every case. The admin in charge of userbenchmark is known to be a massive AMD hater. Basically every hardware forum - even Intel ones - has him blacklisted and will instantly ban you if you post anything from him.
        >don't buy AMD GPUs
        >Why?
        >because their benchmarks are fake and only marketers like them
        >also
        >Ryzen sucks and the 5800X3D is literally worse than some Intel midrange CPU from 3 years earlier.
        >why?
        >because AMDs numbers are fake fuck you

        • 3 weeks ago
          Anonymous

          Nothing he says is wrong, he just got attacked by Advanced Marketing Devices shills for calling out lies such as the Ryzen 2000 series having equivalent gaming performance to Intel's 9th gen, which it didn't. Do you really want CIA proton spybotware on your PC?

          • 3 weeks ago
            Anonymous

            Sure. AMD is literally paying all the tech reviewers on every site and every youtuber to get ahead in benchmarks, and neither Nvidia nor Intel gives a shit about it. Only userbenchmark knows the truth. Every other reviewer is a shill.

        • 3 weeks ago
          Anonymous

          I know it's Userbenchmark. But of all the bullshit he came up with, suggesting the RTX 4060 over the RX 7600 was somewhat reasonable.
          The 4060 is bad at $300 for the reasons I explained in my previous post, but the 7600 was even worse at $270.

          • 3 weeks ago
            Anonymous

            You live in a fantasy world, mate.

            • 3 weeks ago
              Anonymous

              ...? I'd rather buy a 4060 for $300 than a 7600 for $270. At least I get DLSS.
              The 7600 should've been a $230 card right from the get-go, which is how much it's selling for nowadays: NOW it's decent. Not good (it should've replaced the RX 6600 at $199 imo), but decent.
              At launch it was more expensive than AMD's own RX 6650XT. Same performance. You were paying $50 more for... uh... AV1 encoding I guess?

      • 3 weeks ago
        Anonymous

        Yeah, he's right. I was heavily considering a 4080, but it just screamed heavily gimped at me. So I got an XTX. At this point, I'll jump ship only if and when Nvidia gets their head out of its ass. Small chance of that though, it takes genuine hubris to claim artificial stock control is the same as Moore's law.

      • 3 weeks ago
        Anonymous

        I thought the 4070Ti was also a good card for the money? I just bought one last week so it better be

        • 3 weeks ago
          Anonymous

          It gets its ass kicked by the $499 7800 XT, but compared to the 4060 or 4060 Ti it's a better deal.

          • 3 weeks ago
            Anonymous

            Wrong. The 4070Ti competes with the $760 7900 XT.
            The $500 7800 XT competes with the 4070 and stomps all over the 4060/Ti by a massive margin.

    • 3 weeks ago
      Anonymous

      Lisa Su fucked userbenchmark's mom.

  8. 3 weeks ago
    Anonymous

    >But do any of the new releases actually require you to have a 40s series GPU?
    No they don't. I'm an Nvidia user but the 4000 series is a huge scam at current pricing. The only one worth going for right now is the 4090 and that's the card you go for if money isn't an object and you just need the best. The other cards are just terrible purchases. Frame generation isn't really going to help you if you're not already getting good framerates, it's a "win more" gimmick that does its best job by making you go from 70fps to 120 smoothly, but it'll work poorly if you're trying to go from 40 to 70.
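
    Napkin math on why framegen is "win more", assuming interpolation-style FG that has to hold back one real frame (crude model, real latency depends on the whole pipeline):

    def framegen(base_fps):
        frametime_ms = 1000 / base_fps
        shown_fps = base_fps * 2           # one generated frame per real frame
        latency_ms = 2 * frametime_ms      # crude: render time + the held-back frame
        return shown_fps, round(latency_ms)

    print(framegen(70))  # (140, 29) -> smooth AND responsive
    print(framegen(40))  # (80, 50)  -> looks smooth, still feels like 40fps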

    The 3000 series is worth buying second hand for 1080p + light 1440p now that prices are lower than ever. But I'd wait for the 4000 series prices to drop; a used 4070 will probably be great value. Or just go with AMD's 6700 XT/6800 XT, they're a good buy at current prices.

    • 3 weeks ago
      Anonymous

      What's the catch with AMD cards? I'm considering a 6800 XT but I've never used one before.

      • 3 weeks ago
        Anonymous

        Every single one I've ever owned has died/croaked. IMO they're of poor quality. That's just my personal experience and opinion though.

    • 3 weeks ago
      Anonymous

      We're never going back to 2-for-$5 Big Macs, and we're likely not going back to $699 top-end GPUs.

    • 3 weeks ago
      Anonymous

      It's why I went with a Sapphire XTX to replace a 2070. Amazing card. Quieter than my water-cooled CPU at full load, several hours into playing Starfield at Ultra everything at native res without FSR.

      • 3 weeks ago
        Anonymous

        obviously performance is important but those nitro cards look great as well imo

  9. 3 weeks ago
    Anonymous

    Real question is, why the fuck are you still gaming at 1080p? Of course new games require the 4000 series, hell, they require the 5000 series.

    • 3 weeks ago
      Anonymous

      You don't need 1440p/4K.

      • 3 weeks ago
        Anonymous

        SHALL NOT BE INFRINGED

    • 3 weeks ago
      Anonymous

      I don't own a 4K monitor, and even if I did, there's no point in a 24" panel at that resolution. You wouldn't really notice any major increase in detail. When I get a bigger, better monitor, I'll upgrade from my A750. It does what I need it to do with gusto.

    • 3 weeks ago
      Anonymous

      My monitor only supports up to 1080p and it works perfectly fine. Buying a new one while this one is still perfectly functional feels stupid.

    • 3 weeks ago
      Anonymous

      70% of players still use 1080p
      games today aren't good enough to warrant sinking lots of money into hardware for most people

      • 3 weeks ago
        Anonymous

        This, and even if they were, PC is quickly becoming a tertiary development concern after mobile. Respawn fixing Jedi Survivor but only for consoles is fucking criminal.

        • 3 weeks ago
          Anonymous

          It was mostly fixed on PC long ago, anon. The performance has matched the recommended requirements since patch 2. People are still whining because it's activism, to stop them from pulling the same shit again.

          The game still shits itself if you try to raytrace on Jedha, but that happens on consoles too.

    • 3 weeks ago
      Anonymous

      My GTX 680 died, so I bought a "placeholder" RX 580 at the end of 2018. I was ready to upgrade both my monitor and GPU with a midrange-to-highend Ampere or RDNA2 card... and then COVID+ETH happened.
      Current cards are overpriced, so I'd rather not buy one. But it makes no sense to upgrade just my monitor, considering how shit my GPU is. So I'm JustWaiting(tm) for the next generation.

      I wasted my money on a Steam Deck instead.

      >This, and even if they were, PC is quickly becoming a tertiary development concern after mobile. Respawn fixing Jedi Survivor but only for consoles is fucking criminal.

      Jedi Survivor used to run like shit everywhere. It's just that the average console user doesn't notice when the resolution drops to 720p and/or the framerate tanks to 40 FPS.
      Wanna know WHY it ran so badly on consoles? Because raytracing was enabled by default, and you couldn't disable it in any way. The game fucking dropped to 600p in a desperate attempt to maintain a playable framerate... but hey, look at how realistic those reflections are. That's how much "care" they put into the console version.
      The latest update lets users disable RT, and from what I've seen it runs at 60-ish FPS now.

      • 3 weeks ago
        Anonymous

        I have a 1070 and I really want to play Jedi Survivor. I'm cheap as fuck and will probably wait for the PS4 release since I don't know if the patches have been good.

        • 3 weeks ago
          Anonymous

          They've been pretty good on PC AND PS5. It's gotten to a point where it actually matches its specs.
          I've seen it playable on the Ally, and that's a reasonable benchmark I think; that 7840U/Z1 Extreme chip is pretty middle of the road right now (and a lot of games seem to be targeting it for at least 720p medium, since the handheld PCs are all using it, plus the Steam Deck).
          No clue about how the PS4 will handle it though.

        • 3 weeks ago
          Anonymous

          The latest patch finally lets you turn off gaytracing and instantly boosts fps by 50%. Playable now, at least.

        • 3 weeks ago
          Anonymous

          >1070
          You're going to have to make quite a few compromises, but it's not unplayable. A 1660 Ti (similar to a 1070) can do about 40-50 fps at 1080p high, without FSR on.

    • 3 weeks ago
      Anonymous

      I have a 24" 1080p 240hz monitor and I only play competitive shooters. There are no good 24" 1440 240hz monitors with good IPS panels.

      • 3 weeks ago
        Anonymous

        >i only play competitive shooters
        >at 1080p
        Go away.

  10. 3 weeks ago
    Anonymous

    Get an Intel A770

    40XX nshitya or 7XXX AMKEKS are literally overpriced fire hazards

  11. 3 weeks ago
    Anonymous

    Still too expensive for the 4060

    You can get a 6700 XT, which is like two years older, for the same price or less, and it delivers better performance in most titles.
    The 4060 is one of the worst GPUs Nvidia has ever made. Only the 4060 Ti is even more retarded because it can literally lose to the 3060 Ti in some games. A literally worthless "upgrade"

  12. 3 weeks ago
    Anonymous

    >Buy AMD
    >No path tracing
    >Buy Nvidia
    >Not enough VRAM under $1000
    You get fucked either way. I've settled for waiting until the next generation of cards.

    • 3 weeks ago
      Anonymous

      RDNA 3 GPUs are "okay" for raytracing at this point, with the exception of Cyberpunk, which has a massive Nvidia bias in that regard. Outside of Cyberpunk the difference in raytracing performance is like 6-8% at worst within the same price range.
      Which is tolerable compared to the previous generation, where AMD raytracing was basically not usable at all. And the fact that AMD still comes with a decent amount of VRAM even on more budget GPUs was enough for me to make the switch.

      • 3 weeks ago
        Anonymous

        >AMD isn't as far behind in raytracing as people believe. Or rather, you get similar RT price/performance compared to Nvidia, but better raster price/performance of course.
        >Only in Nvidia-sponsored tech demos does AMD get demolished. In actual games, they're close.
        >What AMD lacks is DLSS. It started as a complete joke (DLSS 1.0-1.9), it became decent two years ago (DLSS 2.0-2.2)... but now it's a real killer app. It's hard to ignore, and I say this as someone who used to laugh at this absolute meme of an upscaler not that long ago.
        >In order to be competitive, AMD has to sell much better products for the same amount of money as Nvidia. Which is what they're doing these days, but it's not sustainable in the long run. They have to come up with a proper competitor, and soon.
        >What's worse, it seems DLSS is improving faster than FSR these days.

        7800 XT doesn't have any major advantage over 6800 XT in raytracing from what I've seen. The games with light raytracing like RE4 aren't really an issue with AMD cards. The disadvantage will only show up in fully path traced games like Quake 2 RTX, Portal RTX, Serious Sam RT, Half-Life RT, Cyberpunk 2077 in Overdrive mode, Alan Wake 2, etc. In those games, it's not even close. AMD has great value for raster performance, but the RT performance isn't there when it comes to games that make full use of every aspect of RT. But then Nvidia dicks you over at the lower price tiers with insufficient VRAM.

        • 3 weeks ago
          Anonymous

          That's what I meant with "Nvidia-sponsored tech demos". No one actually -plays- Portal RTX or Quake RTX, these are just marketing. In actual games with reasonable RT effects (Spider-man, Metro Exodus, recent Resident Evil games...), AMD isn't that far behind.

          Basically, it's like tessellation ten years ago. In actual games with reasonable amounts of tessellation, Nvidia and AMD were neck and neck. Increase tessellation to insane, unreasonable levels and Nvidia would win.
          But it didn't really matter because no one would play games with tessellation set to 64x, not even Nvidia owners. Because it was dumb.

          Let's take Cyberpunk for example. Percentage-wise, the 7800XT gets demolished by the 4070, right? But neither card can run the game at a decent framerate, so it's a meaningless win in practice.

          • 3 weeks ago
            Anonymous

            >Basically, it's like tessellation ten years ago. In actual games with reasonable amounts of tessellation, Nvidia and AMD were neck and neck. Increase tessellation to insane, unreasonable levels and Nvidia would win.
            Except here, you can actually see a difference.

            • 3 weeks ago
              Anonymous

              ORT is just another Nvidia tech demo like Portal RTX and Quake RTX.
              AMD hardware isn't supposed to run it. It literally only exists to shill the 4090.

              Once other games start using path tracing we'll see how well AMD can handle it.

    • 3 weeks ago
      Anonymous

      AMD isn't as far behind in raytracing as people believe. Or rather, you get similar RT price/performance compared to Nvidia, but better raster price/performance of course.
      Only in Nvidia-sponsored tech demos does AMD get demolished. In actual games, they're close.

      What AMD lacks is DLSS. It started as a complete joke (DLSS 1.0-1.9), it became decent two years ago (DLSS 2.0-2.2)... but now it's a real killer app. It's hard to ignore, and I say this as someone who used to laugh at this absolute meme of an upscaler not that long ago.
      In order to be competitive, AMD has to sell much better products for the same amount of money as Nvidia. Which is what they're doing these days, but it's not sustainable in the long run. They have to come up with a proper competitor, and soon.

      What's worse, it seems DLSS is improving faster than FSR these days.

      • 3 weeks ago
        Anonymous

        AMD still has a bad reputation as far as raytracing goes thanks to Cyberpunk being THE RAYTRACING TITLE that people like to benchmark, and its raytracing was funded and implemented by Nvidia.
        Same with pathtracing, really. That was also implemented for Nvidia only, and future titles will likely have AMD performing close to Nvidia when Nvidia isn't the one sponsoring them.

        Basically every RT game that isn't Cyberpunk has AMD performing near Nvidia on their recent GPUs.

    • 3 weeks ago
      Anonymous

      Ray tracing is dead in the water.

      • 3 weeks ago
        Anonymous

        yeah that's why literally every meaningful title is releasing with rt features.

        • 3 weeks ago
          Anonymous

          Who gives a shit when it reduces the FPS by half? Only an absolute retard would use RT.

          • 3 weeks ago
            Anonymous

            Who cares? If it looks better, that's what actually matters. This isn't even a matter of opinion; you're just being a fucking retard making your games look worse for no reason. Glad devs ignore you though.

          • 3 weeks ago
            Anonymous

            >"when it reduces the FPS by half?"
            >he didn't play Metro Exodus

            • 3 weeks ago
              Anonymous

              >one game
              the absolute state of gay-tracing shills

              • 3 weeks ago
                Anonymous

                The game proves what people who grew up in the 90s and 2000s have known all along: normal rasterized graphics are actually very performance intensive, and they don't scale well. Graphics cards get raped by having to do all that work with shadows and SSAO and transparencies; that's why lowering shadow resolution from 2048 to 512 in Skyrim was always the most popular FPS trick, a massive performance boost for little visual loss. Not to mention that developers work themselves to the bone making all of that look real, because none of it has actual physics and it takes lots of handmade tricks to hide that.
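
                For reference, that Skyrim trick is just an ini edit. Going from memory here, so double-check the exact name, but it's along these lines in SkyrimPrefs.ini:

                [Display]
                ; down from 2048 - big fps gain, slightly blockier shadows
                iShadowMapResolution=512
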
                Instead, why not build a game around tracing? Devs won't have to crunch for months to hand-place lights everywhere; tracing simulates real physics and applies it to the game world without having to resort to tricks, drastically shortening development time. See https://youtu.be/NbpZCSf4_Yk?si=YJ9x_ekha8g5b3m6&t=1408
                And since you've freed the graphics card from having to rape itself with rasterization, it can implement better-looking graphics, with the resulting performance being very similar to the rasterized version - potentially even better, because path tracing scales much better than raster does.

                It's insane to me that for years we've had people praising FEAR's lighting system and how it made the environment feel alive instead of static and boring, and now that we have the technology to do a REAL VERSION of that in games there's a bunch of luddite retards (probably still stuck on 750TIs) saying it's a gimmick.

              • 3 weeks ago
                Anonymous

                Because FEAR's lighting made you buy a GPU that cost hundreds (sometimes up to a thousand!) dollars more just to cut your FPS in half unless you use the latest in AI gimmickry to add a smear filter to your visuals in motion.
                Right?

              • 3 weeks ago
                Anonymous

                it did, you used to have to constantly upgrade in the 90s/early 2000s because of how fast things were progressing

              • 3 weeks ago
                Anonymous

                And now we're back to that. Yaaaaaaaaay!
                Only instead of progressing, we're regressing because devs can't code worth a shit because they're all diversity hires and Indians!! YAAAAAAAAAAY!!!!!

              • 3 weeks ago
                Anonymous

                Yeah. These days it's a lot harder to justify PC upgrades.
                The only thing companies like Nvidia or AMD can do is invent new bullshit to cut your frames in half.
                >you didn't beat Cyberpunk if you didn't beat the game with psycho raytracing/path tracing
                Even low-end GPUs from 4 generations ago can still play all the games, assuming no RT is involved. And games released these days hardly look better than anything released 10 years ago. You need bullshit modern Unreal Engine features to kill frames. That, or demanding massive VRAM due to trash optimization.

              • 3 weeks ago
                Anonymous

                >Because FEAR's lighting made you buy a GPU that cost hundreds (sometimes up to a thousand!) dollars more just to cut your FPS in half
                As I said, the game I posted is proof you don't need to kill your framerate or buy an expensive GPU. That game recommends an RTX 2070 for 1080p 60 fps, and you can buy a used one for under 200 bucks. Do you want to go in circles like this?

              • 3 weeks ago
                Anonymous

                You specifically mentioned "luddites with 750s" so it's plainly obvious you're a paid marketer shill.

              • 3 weeks ago
                Anonymous

                Good job on not addressing any of my main arguments. Last (You) you'll get from me.

              • 3 weeks ago
                Anonymous

                Got caught shilling!
                >aDdReSs mY PoInTs!!
                Okay.
                Your main point is that raytracing will make "FEAR-like" lighting return to impress everyone again, and only luddites with 750s are mad about it.

      • 3 weeks ago
        Anonymous

        Why?

  13. 3 weeks ago
    Anonymous

    No, a 3070 or 3080 can run everything right now. Maybe not 4K/ultra, but that should be obvious.
    If you want a new card that will likely carry you up to and through most of this generation, the 4070 is the way to go. The PS5/xbone2 have 16gb of unified RAM, but underlying processes (constant recording, party chat, other OS-level shit) eat away something likely nearing 2gb or 4gb of it. So the 12gb of vram the 4070 has means you're likely going to be able to run most games at 1440p for the remainder of this gen with zero issues, and a good number of games at 4K for a few years. (Rough math at the end of this post.)

    If you want to upgrade for Cyberpunk and FFXVI and want to use a handful of the raytracing features they have, then yes, a 3080 or 4070 is probably your best bet. The 3080 is better, but the 4070 is cheaper and has MUCH lower power draw.
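
    Rough math on the console RAM point, for what it's worth. The OS reservation figure below is the commonly cited ~2.5gb for the Series X (PS5 is reportedly in the same ballpark), so treat it as approximate:

    unified_ram_gb = 16.0   # PS5 / Series X unified pool
    os_reserved_gb = 2.5    # commonly cited Series X figure; approximate
    print(unified_ram_gb - os_reserved_gb)   # ~13.5 GB for code, assets and "VRAM" combined

    That ~13.5gb covers the game's system memory AND video memory together, while a 4070 pairs its 12gb of VRAM with separate system RAM. So it's roughly console parity on the graphics side.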

  14. 3 weeks ago
    Anonymous

    What's the appeal of the 7800XT if its performance is the same as the 6800XT?

    • 3 weeks ago
      Anonymous

      RDNA 3, which means it will support all the fancy AMD features that RDNA 2 may or may not get at some point.

      The chiplet design means there is still room to grow as far as raw performance goes.
      AI acceleration is actually a thing here, unlike the previous gen, so if you really wanna get that Stable Diffusion stuff going without snail pace - you can do that now.
      Stuff like FSR3, Anti-Lag, Anti-Lag+ or Radeon Super Resolution.

    • 3 weeks ago
      Anonymous

      Its performance is better than a 6800xt's.
      It also has AI cores to leverage FSR3's features at a hardware level; all non-RDNA3 GPUs and competitor GPUs will run FSR3 in software. Also AV1 support.

  15. 3 weeks ago
    Anonymous

    Don't buy a 4060 you fucking retard. RT performance will be shit so you'll have it disabled anyway, and it has low VRAM. Buy a 6700 XT, dumbass.

  16. 3 weeks ago
    Anonymous

    >no nvidia gpu with 3060ti specs but 12gb like the non ti version
    I sleep

  17. 3 weeks ago
    Anonymous

    am I supposed to upgrade my 1080ti yet? it has been 6.5 years

    • 3 weeks ago
      Anonymous

      no
      i would say next generation could be the cutoff, at that point a 5050 might beat it

      • 3 weeks ago
        Anonymous

        I'm skeptical. The 128-bit bus on the 4060 Ti feels like intentional sabotage to trap lower-tier buyers with a card they'll have to upgrade in 2 years. You'll have good options if the 4070 drops in price, but I don't think Nvidia will ever release a legitimately valuable card at a reasonable 50/60 price.

        • 3 weeks ago
          Anonymous

          >You'll have good options if the 4070 drops in price, but I don't think Nvidia will ever release a legitimately valuable card at a reasonable 50/60 price
          yeah, i'm absolutely not saying you should buy the 5050, just that it might finally beat a 1080 ti

    • 3 weeks ago
      Anonymous

      I'm on a 1070 non-Ti and I'm finally feeling the squeeze with brand new titles; I've had to start dipping under 1080p. I figure the next gen of cards will be what I upgrade to (if prices are good). If they keep prices high, I'll look for a secondhand 4090 or something.

  18. 3 weeks ago
    Anonymous

    >was looking forward to the 7800XT
    >has like 100W more consumption
    >FSR 3 half a year away
    It better be 40% faster than the 4070, otherwise there's no reason to get the 7800XT since pricing is similar

    • 3 weeks ago
      Anonymous

      The 4070 is like 50-70 bucks more expensive here.
      It also loses against the 7800 XT but only by like 5-10%.
      Raytracing performance is around the same with the 4070 being like 2-3% better on average.

      The 4070 Ti is a lot better than the 7800 XT but the 4070 Ti is also like 800-900 bucks so fuck that. Not worth the 15-20% up in performance. For 900-950 you can already get a fucking 7900 XTX and be set for the next 8 years.

  19. 3 weeks ago
    Anonymous

    going from a 1080 to a 4060 is not worth the upgrade. The gap between these 2 cards is not big.

    Not even joking when I say there's a bigger gap between a 4060 and a 4070 than between a 1080 and a 4060. The 4060 is a major scam my dude.

    • 3 weeks ago
      Anonymous

      If you're upgrading from a 1060 to a 4070, it's a big upgrade. If you're upgrading from a 1080, you need to get a 4090 to even see a huge upgrade. If you're buying a GPU just for video games, just get AMD - or Nvidia if you have the money.

  20. 3 weeks ago
    Anonymous

    I just think it's funny how a 1080 Ti is still able to play basically every game just fine, simply for the fact that it has 11 GB of VRAM. Six years later and still doing it. 60 FPS in Cyberpunk with medium settings without any FSR/DLSS bullshit.

  21. 3 weeks ago
    Anonymous

    >userbenchmark

  22. 3 weeks ago
    Anonymous

    Userbenchmark is so funny.
    Too bad they stopped uploading stuff on youtube.

    >show ryzen on one side and Intel on the other side
    >benchmark tool shows Ryzen being better
    >they put a different number right in the middle of the screen saying Ryzen is actually slower than Intel
    >dislikes disabled

    • 3 weeks ago
      Anonymous

      He had a bad AMD product once. Please understand.

      • 3 weeks ago
        Anonymous

        People say the owner of the site is actually a pissed-off ex-AMD employee who got sacked.

        Surprised AMD isn't suing him, considering the constant fake news. Everyone agrees that he is full of shit, and the site is blacklisted in every tech discussion forum.
        The fact that he is still one of the first Google results despite all this should be enough for them to go after his ass for defamation.
        I can't just make a site and claim that Coca Cola causes turbo cancer.

        • 3 weeks ago
          Anonymous

          Being a top Google result doesn't surprise me. If not for defamation and slander, Google would go out of business.
          But for AMD to not care? That is strange.

          • 3 weeks ago
            Anonymous

            Yeah. Even Nvidia and Intel fanboys agree that the stats on his site are fake bullshit and ban people on their platforms who use it for arguments.
            When even shills from the competition say that you are taking it too far you should seriously start questioning your life choices.

            • 3 weeks ago
              Anonymous

              When did it start happening? I remember back in 2015 or so it wasn't so bad, or at least not so blatant. Was it always this way?
