4070 is now a 4080

Novidiot gaymers, prepare to get scammed again
https://videocardz.com/newz/nvidia-geforce-rtx-4090-rtx-4080-16gb-12gb-max-tgp-and-gpu-clocks-specs-have-been-leaked


  1. 2 years ago
    Anonymous

    didn't they do this with the 1080 as well? it used a GP104, which is normally used for the 60/70 class cards.

    • 2 years ago
      Anonymous

      And the 2080. And the 2080 Super. And the mobile 3080/Ti

      If you don't buy the flagship, do not expect Nvidia to give you the top chip

      And don't forget that there were TWO Pascal Titans, released 6 months apart and only one of them had the full die like a Titan should

      God bless AMD for giving nvidia the kick in the ass they needed. And I hope Intel finds success too because duopoly is only 2nd to monopoly in terms of being cancer

      • 2 years ago
        Anonymous

        >God bless AMD for giving nvidia the kick in the ass they needed
        Evidently it didn't fricking do anything.

        • 2 years ago
          Anonymous

          People forget how HUGE nvidia is as a company. Even if AMD does everything right it would take them years to bring Nvidia to their knees.

        • 2 years ago
          Anonymous

          Nvidia has too much momentum, after being dominant for so long. Most people don't autistically follow hardware news and benchmarks.
          It takes normies a few generations to get with the program that nvidia isn't the "only choice" anymore.

          • 2 years ago
            Anonymous

            We are halfway there. Navi has competed well, and if AMD continues to pump out strong products + nvidia keeps being greedy with pricing, the sticker shock will get to normies. Nvidia caught a lucky break with the chip shortage

    • 2 years ago
      Anonymous

      they've been doing that since kepler 600 series, where have you been?

  2. 2 years ago
    Anonymous

    >They literally just bumped the video cards up a tier to charge more money without announcing a price-hike
    Holy shit.
    >They don't have a 4070 anymore
    >The card most people would care about as the best "Value" option as an upgrade
    >For now you need to go for an overpriced 4080 that nobody realistically needs
    >or cope with the 4060 a tier lower whenever they decide to launch it

    • 2 years ago
      Anonymous

      Ah yes, the same thing intel did with i3/5/7/9
      I forget which gen it was, but they moved the feature set of the i7 into the i9 (hyperthreading), and started calling chips with an i5 feature set i7s and pricing them as i7s.
      Jews the whole lot of em.

  3. 2 years ago
    Anonymous

    Society is collapsing.

  4. 2 years ago
    Anonymous

    I wonder if this is going to cause hilarious obscure driver issues with the rename.

  5. 2 years ago
    Anonymous

    only idiots buy a new series when it comes out. just get a cheap 3060, it's more than enough for any game worth something.

    • 2 years ago
      Anonymous

      Wanna know how I can tell you're a poorgay with no 4k display?

      • 2 years ago
        Anonymous

        >see 4k display
        >I am blind or it is a meme, I see no difference
        >zooms insist that it is "like liquid gold in your eyes"
        I am not 15, homosexual.

        • 2 years ago
          Anonymous

          Ignore all zoomers that waste money on top shit because their favorite streamer endorses it.

          • 2 years ago
            Anonymous

            Exactly. They are the only ones excited for 4k, when the only real advancement was from 480p to 720p or even 1080p.

        • 2 years ago
          Anonymous

          To be honest, if you really don't see the difference between 1080p and 4k, then all that money is a waste for sure. There may not be as significant a jump as from 720p to 1080p, but let's not pretend 4k is not noticeable.

  6. 2 years ago
    Anonymous

    >will just buy 3070 and sit on it for 5 more years as I did with 1070 anyway

    • 2 years ago
      Anonymous

      >upgrades from a 6 year old 8gb vram card to a 8gb vram card
      >expects it to last 5 years
      lol

      • 2 years ago
        Anonymous

        >he doesn't know that Nvidia is slowly crippling old cards' performance with each driver update

        im happy with 1070's performance 🙂

      • 2 years ago
        Anonymous

        yeah, what an idiot, I love playing the newest feminist PoC simulator at 4k 120 fps

      • 2 years ago
        Anonymous

        you homosexuals are so dumb, unless you're going to do 4k 120hz, 8gb vram is fine

        • 2 years ago
          Anonymous

          Only in games designed for PS4/Xbone

          • 2 years ago
            Anonymous

            do you seriously think "next gen" consoles are more powerful than a gtx 1080? lmao

            • 2 years ago
              Anonymous

              you're entirely right in the point you're making, but also, Control is a terribly optimised piece of shit performance-wise and there are far better examples that you could use

        • 2 years ago
          Anonymous

          11GB of VRAM is insufficient at 1440p without downgrading textures to early 8th gen levels. Have you even played a recent game? Is this the part where you cope by stammering out a "w-well, new games suck anyway"?

          • 2 years ago
            Anonymous

            >"w-well, new games suck anyway"?
            But they do suck

            • 2 years ago
              Anonymous

              >you can play new games with an old GPU!
              >no you can't, do you even play new games?
              >no they suck

              • 2 years ago
                Anonymous

                >you can play new games with an old GPU!
                Yes you can.

              • 2 years ago
                Anonymous

                Maybe, but not The Way It's Meant To Be Played™

              • 2 years ago
                Anonymous

                >It's the end of the world if any new pozz game doesn't run at 144fps, at 4k!
                You can run shit even at 720p, homosexual.
                >BUT INFERIOR SETTINGS
                Nobody cares, if the game is pozz.

              • 2 years ago
                Anonymous

                hardware discussion in a nutshell

          • 2 years ago
            Anonymous

            I play modern games at 1080p 50FPS on my gtx 970 3.5GB. Changing the monitor to 50hz gives you a free 20% performance boost, and it looks just as smooth as 60hz

          • 2 years ago
            Anonymous

            Not true. A game will use as much VRAM as it can get whether it needs to or not. Does it stutter with 8GB of VRAM? No? Then shut the frick up.

    • 2 years ago
      Anonymous

      >he doesn't know that Nvidia is slowly crippling old cards' performance with each driver update

      • 2 years ago
        Anonymous

        >driver update
        why do you do this?

        • 2 years ago
          Anonymous

          because he's trying to pretend he's not an NPC, but in reality he's the biggest one of all

  7. 2 years ago
    Anonymous

    >That wattage increase

    What the frick? Am I going to need a 1000w psu now!?

  8. 2 years ago
    Anonymous

    why cant they just release the bing bing wahoo video game hardware without israeli frickery. god damn

    • 2 years ago
      Anonymous

      capitalism ruins everything it touches

  9. 2 years ago
    Anonymous

    i heard the 4070 was barely more powerful than the 3070, so does this mean the "4080" is going to be less powerful than the 3080?

    • 2 years ago
      Anonymous

      The "4070" should still be more powerful than the 3080
      There will be a main 4080 with 16 GB VRAM and a meme 4080 with 12 GB for scamming purposes

      • 2 years ago
        Anonymous

        >and a meme 4080 with 12 GB for scamming purposes

        Primarily for prebuilts so they can show an 8 instead of a 7 or 6, because 8 is higher.

      • 2 years ago
        Anonymous

        The 4070 TimeSpy Extreme performance looks like it's about equal to the 3090 FE

    • 2 years ago
      Anonymous

      Fricking mentally ill moron

  10. 2 years ago
    Anonymous

    >still using my 1060 6gb
    I know i'll be forced to upgrade eventually but I haven't seen a single video game worth playing yet that would require it.

    • 2 years ago
      Anonymous

      i'm upgrading solely because i want to try Stable Diffusion, AAA video games fricking suck

      • 2 years ago
        Anonymous

        Shit that's actually a tempting reason now that you reminded me of it.

      • 2 years ago
        Anonymous

        Shit that's actually a tempting reason now that you reminded me of it.

        Isn't stable diffusion working perfectly on any 4 gb vram gpu? From my understanding, more power just means faster work and more vram means better quality (and therefore a 3060 is best)

        • 2 years ago
          Anonymous

          >Isn't stable diffusion working perfectly on any 4 gb vram gpu?
          only Nvidia cards, but yeah you can get it running even on 2GB VRAM

          • 2 years ago
            Anonymous

            >512x512

        • 2 years ago
          Anonymous

          Working, yes, but the gains in rendering speed are immense.

    • 2 years ago
      Anonymous

      You won't need to until the PS4 is out of the market, and currently it isn't

  11. 2 years ago
    Anonymous

    yeah i guess i will run my 1050ti till the 5050ti drops

  12. 2 years ago
    Anonymous

    the more I see what games are coming the less I want to upgrade.

  13. 2 years ago
    Anonymous

    Remember me?

    • 2 years ago
      Anonymous

      The chosen one.

    • 2 years ago
      Anonymous

      criminal

  14. 2 years ago
    Anonymous

    Why haven't you taken the AMDpill?

    • 2 years ago
      Anonymous

      no nvenc or nvidia broadcast

    • 2 years ago
      Anonymous

      AMD is just Nvidia with all the same issues, but to a less extreme degree
      Apple Silicon is enlightenment
      >fast
      >optimized
      >insanely efficient, laptops don't even lose performance when pulling the plug
      >runs both x86 and ARM
      >has the best OS, linuxgays and winjeets can cope and seethe

      • 2 years ago
        Anonymous

        can it run vidya doe?
        >t. mac pro m1 haver

        • 2 years ago
          Anonymous

          ok you got me there
          It can sort of game; its primary issue is the lack of native mac games + microsoft won't sell windows on ARM to consumers, since they signed an exclusivity deal with qualcomm (but that deal expires soon)

          The chips are amazing, there's just no software to back it up right now. But hopefully that changes in the future

      • 2 years ago
        Anonymous

        >homosexuals shilling apple of all things
        Holy shit

        Society is collapsing.

        is right

      • 2 years ago
        Anonymous

        >optimized
        Unfortunately not for gaming.

        • 2 years ago
          Anonymous
          • 2 years ago
            Anonymous

            That's one of the very few games that actually runs fine on the M2, and even then a 6800U laptop is cheaper while performing almost the same, and it runs laps around the M2 (or just wins by default, since at least it runs) in 99% of other games.

          • 2 years ago
            Anonymous

            I look at this graph, and all I see is that with the same power consumption there is not much of a difference in performance, probably only due to the better node they bought exclusivity for.

            • 2 years ago
              Anonymous

              >nodez are unfair!!
              And the mac is emulating x86 while the PC runs native code. Results are what matters

              • 2 years ago
                Anonymous

                And both are unacceptable levels of performance, showing that gaming on laptops without a gpu is a waste of time. Sure, the cherrypicked performance per watt graphs make them look nice, but when you look at what you're actually getting it's laughable.

              • 2 years ago
                Anonymous

                >nodez are unfair
                I didn't say that, you moronic apple shill.

      • 2 years ago
        Anonymous

        >Apple Silicon
        They just order the best chips from taiwan and don't push them into the zone of diminishing returns.

    • 2 years ago
      Anonymous

      Took it last year

    • 2 years ago
      Anonymous

      Mostly the poor ray tracing performance. AMD cards are powerful and have lots of vram, which is great for future proofing, but the ray tracing performance is a disappointment.

  15. 2 years ago
    Anonymous

    I need VRAM for A.I. images.

  16. 2 years ago
    Anonymous

    >still using my 2060 super because none of my games demand more for a 1440/120 experience
    >will probably upgrade to a 3080 when prices collapse and sit on that for 5 years
    if not for advancements in rtx features, i'm sure even the 2080 ti could last most gamers until 2025

    • 2 years ago
      Anonymous

      I got a 2080S and I'm still content to wait. I can run Cyberpunk and Dying Light 2 with RTX and DLSS maxed at good framerates

  17. 2 years ago
    Anonymous

    Greedvidia as usual

  18. 2 years ago
    Anonymous

    What's the point of the name change except to increase MSRP?

  19. 2 years ago
    Anonymous

    I think I'm going AMD this generation, Nvidia needs to be knocked down a few pegs before they become intel.

    • 2 years ago
      Anonymous

      >nvidia needs to be knocked down a few pegs before they become intel
      it's too late, they already are Intel, except without the hubris

  20. 2 years ago
    Anonymous

    I'm still sitting on my 1660 super and see zero reason to upgrade.

  21. 2 years ago
    Anonymous

    newbie to PC gaming, live close to a microcenter and was considering a prebuilt from them. Are they a meme like other prebuilts? I've had mixed opinions given to me so far.

    • 2 years ago
      Anonymous

      depends on how fair the pricing is. you'd have to post an example

    • 2 years ago
      Anonymous

      You can at least trust they will give you a pc with a default windows install without bloatware added. As for case and parts choices, you will have to police them, because they will prioritize moving old inventory out on you if you let them. My microcenter tried to give me a prebuilt with a 3700x charged at full new price even though amd 5xxx cpus were out already and they had them in stock.

    • 2 years ago
      Anonymous

      I'm late but best prebuilt I've ever owned was from microcenter. No issues at all after 6 years. Their prices might be fricked nowadays though.

    • 2 years ago
      Anonymous

      I'm late but best prebuilt I've ever owned was from microcenter. No issues at all after 6 years. Their prices might be fricked nowadays though.

      >Live just an hour and a half from a Microcenter
      >Have family and friends that live out near it
      It's a wonderful place

    • 2 years ago
      Anonymous

      put the parts into pcpartpicker.com
      if there is about a $100 surcharge per $400 of parts, it's about normal for a prebuilt
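That rule of thumb works out to roughly a 25% surcharge budget. A minimal sketch (the helper name and example prices are made up for illustration, not any real tool):

```python
# Rule of thumb from the post: ~$100 surcharge per $400 of parts (25%)
# is about normal for a prebuilt.
def prebuilt_markup_ok(parts_total: float, prebuilt_price: float) -> bool:
    """True if the surcharge stays within ~$100 per $400 of parts."""
    allowed_surcharge = parts_total * (100 / 400)  # 25% of the parts cost
    return (prebuilt_price - parts_total) <= allowed_surcharge

print(prebuilt_markup_ok(1600, 1950))  # True: $350 surcharge, $400 allowed
print(prebuilt_markup_ok(1600, 2100))  # False: $500 surcharge
```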

  22. 2 years ago
    Anonymous

    >this bullshit again
    I will not, I repeat, NOT pay more than $250 for a GPU. It has to be AT LEAST a 60-class card too.
    >hurr durr you won't ever upgrade then
    Then I will not upgrade. I don't need gay tracing and shit for bing bang wahoo.

  23. 2 years ago
    Anonymous

    cool might buy it, but more likely my gtx970 will get me through another gen because games don't actually need anything more than a 970 anymore
    lmfao at any moron that has bought a card in the last 8 years

    • 2 years ago
      Anonymous

      >my gtx970 will get me through another gen
      We ran out of luck with that card: that shit is never gonna happen again, one of its kind. Praise the 970 and its analogue output.

      • 2 years ago
        Anonymous

        >3.5gb

        • 2 years ago
          Anonymous

          I still use 970 to this day. Still alive and kicking. Its fine because I only play Dota 2 anyway lol.

        • 2 years ago
          Anonymous

          It's by no means perfect or even good by today's standards. But it still plays recent games with high settings at 1080p. The only problem I've ever had was with Deathloop, which is the only current gen game I've deemed worth trying, and with just a few tweaks I managed to play it just fine. That card is 8 years old and I don't think the stars will ever align again for a scenario like this one (pandemic, recession, lack of new gen games, and a good enough gpu).

      • 2 years ago
        Anonymous

        I still use 970 to this day. Still alive and kicking. Its fine because I only play Dota 2 anyway lol.

        This. I only upgraded to a 2070 from a 970 because i thought 1080p 60hz would be outdated far quicker.

        i still see no point in a 40 series gpu

  24. 2 years ago
    Anonymous

    >it can go up to 366W
    Holy shit who would want this? Not only is that just absurd, but there's already an increase in electricity costs across the globe, especially in western Europe which makes up at least 20% of their sales. Are they just so shit that they can no longer manage performance and power usage?

    • 2 years ago
      Anonymous

      You haven't seen shit if you think that's bad

      • 2 years ago
        Anonymous

        >660W
        This thing must cost at least a few hundred bucks to run every month even at American rates. Euros are gonna go to jail for using this. And how the frick are you going to cool that system? Don't they realize that electricity produces heat? I'm not sure if liquid cooling will fix that issue
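Whether that's really "a few hundred bucks" depends entirely on hours and rates; a back-of-envelope sketch (8 h/day of load and $0.15/kWh are assumed figures, not from the leak):

```python
# Rough monthly electricity cost for a GPU at a given draw.
# Assumptions (not from the leak): hours of load per day, US-ish $/kWh.
def monthly_cost_usd(watts: float, hours_per_day: float = 8,
                     usd_per_kwh: float = 0.15) -> float:
    kwh_per_month = watts / 1000 * hours_per_day * 30
    return kwh_per_month * usd_per_kwh

print(round(monthly_cost_usd(660), 2))      # 23.76 at 8 h/day
print(round(monthly_cost_usd(660, 24), 2))  # 71.28 even running 24/7
```

European rates several times higher than that would push the numbers up accordingly, but the card alone is nowhere near hundreds per month.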

        • 2 years ago
          Anonymous

          >And how the frick are you going to cool that system?
          I wish I was joking
          https://videocardz.com/newz/leaked-lenovo-geforce-rtx-4090-legion-gpu-is-a-true-quad-slot-design
          And just you wait for the 4090Ti, rumored to be [spoiler]800W[/spoiler]

          • 2 years ago
            Anonymous

            what an unholy abomination

          • 2 years ago
            Anonymous

            I cannot wait to see people snap their pcie slots right off with these things
            if nvidia sells these things without support brackets they have no conscience

            • 2 years ago
              Anonymous

              That's an AIB model. Nvidia founders editions will stick to the 450W spec and come with an appropriately sized cooler for that wattage, so no bracket.
              Including a support bracket for the card in the pic is Lenovo's problem here

          • 2 years ago
            Anonymous

            >13 heat pipe design
            WOOOOWIIEEEE THAT'S ONE HOT BOYYY

        • 2 years ago
          Anonymous

          >660w
          no fricking way, how do you even cool that
          >completely different die for two 4080 models with no indication to distinguish them except vram size
          okay this one seems like typical nvidia behavior but
          >516w on an 80-class card
          what the actual frick?

          >660W
          What is this, a literal furnace? Great I finally have a card that can burn down my house if things ever go wrong.

          It's really funny how zoomers forget how much power people were consuming when doing SLI setups which is what any enthusiast did. You guys are poorgays that think you are worthy of enthusiast tier hardware kek

          • 2 years ago
            Anonymous

            SLI was actual fricking cancer and only for benchmorons. Almost no game was just plug and play with an SLI setup and even when it worked, there would be shit like scan line issues or stuttering or just crashes.

          • 2 years ago
            Anonymous

            and multi-GPU was fricking stupid, it meant work had to be taken away from normal driver improvements in favor of manually hacking use of multiple GPUs into each and every game. Oh, your game wasn't popular enough to be deemed worth it for driver mGPU? Tough luck, one of your cards is now a $400 paperweight for that game.

            And despite all of this, it still suffers the same issues that make mGPU shit
            >shit framepacing and stutters
            >huge power hog
            >compatibility problems
            >expensive, proprietary bridges which are just fancy wires under the hood and have no right to cost $120

            But now Nvidia wants to bring mGPU power consumption to single card builds

      • 2 years ago
        Anonymous

        >660w
        no fricking way, how do you even cool that
        >completely different die for two 4080 models with no indication to distinguish them except vram size
        okay this one seems like typical nvidia behavior but
        >516w on an 80-class card
        what the actual frick?

      • 2 years ago
        Anonymous

        >660W
        What is this, a literal furnace? Great I finally have a card that can burn down my house if things ever go wrong.

      • 2 years ago
        Anonymous

        so the 4090 is basically a 3090ti that has had its silicon absolutely pushed to the limit
        getting 12900k vibes

        • 2 years ago
          Anonymous

          Not really. The 3090 ti was already pushed to the limit. 4090 is the next generation of Nvidia gpu core design.

      • 2 years ago
        Anonymous

        >rumored
        Useless.

  25. 2 years ago
    Anonymous

    >12gb 4080
    lol bullshit, half of these rumors are moronic and make no logical sense
    why the frick would they make two models of the 80 card before they even consider making a 70 or 60 card

    • 2 years ago
      Anonymous

      But they did make the 4070
      They're just calling it 4080 12 GB so that they can charge more $$$ for it

    • 2 years ago
      Anonymous

      >what is RTX 3080 10GB and 12GB
      >what is GTX 1060 3GB and 6GB
      >what is like 5 different versions of the GT 1030
      >what is the RTX 3080 mobile 8GB and 16GB (not to be confused with the 3080Ti mobile which is only 16GB)
      enjoy the SKU ladder goy!

      • 2 years ago
        Anonymous

        But they did make the 4070
        They're just calling it 4080 12 GB so that they can charge more $$$ for it

        >the 50 ti is now just a 60 card
        >the 60 card is a ti now
        >the 70 card is a 80 card
        >the 60 ti is a 70 card
        good lord

    • 2 years ago
      Anonymous

      think of it this way: if they called the 4080 12gb a 4070 instead, people would expect xx70 tier pricing, i.e. $500 or less. Now that it's called 4080, they can charge $700 or more with ease. Not to mention scalpers will love the fact that they can get even better margins on what's supposed to be a high end card, so all those 4080 12gb models will go flying off the shelf, all because it's called 4080 and not 4070

      it's all marketing frickery

  26. 2 years ago
    Anonymous

    Jesus christ I just want a decent deal on a GPU. I'm still using a GTX 970... There's a 3060ti on Newegg for $420 right now. Maybe I should just fricking buy it

    • 2 years ago
      Anonymous

      That's only $20 over MSRP, do it right away
      >but RTX 4000
      Nvidia is leading with the 4090 only according to rumors, and it won't be until well into 2023 that the 4070 launches. The 4060/Ti will take even longer, especially since Nvidia will do everything they can to sucker people into those top 3 SKUs

  27. 2 years ago
    Anonymous

    who gives a frick
    you're not gonna get one anyway

  28. 2 years ago
    Anonymous

    scamvidia

  29. 2 years ago
    Anonymous

    Not surprising, they've slowly been doing this for years.

    • 2 years ago
      Anonymous
      • 2 years ago
        Anonymous

        >GTX 590
        >TWO fermi chips on the same board
        why was nvidia allowed to sell such a massive fire hazard?

        • 2 years ago
          Anonymous

          You joke, but the supposed 4070-masquerading-as-a-4080 in the OP apparently has the same power draw as that dual chip fermi GPU.
          Imagine if they made a modern dual chip GPU, it'd put space heaters to shame.

  30. 2 years ago
    Anonymous

    why are poorgays complaining about power consumption?
    that's just the price you have to pay for better performance, it's normal

    • 2 years ago
      Anonymous

      >Use a UPS
      >oops, now your UPS can't be used, because there is no UPS that can sustain a pc pulling 900 watts for even 15 minutes
      based!

      • 2 years ago
        Anonymous

        You will still have the option of undervolting back to reasonable levels. A stock 3060 ti is a 200W card and my undervolt makes it a 100-120W card at max, but I hardly ever see max anyway. The performance loss from this aggressive undervolt is only 10% too.
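Using the figures above (a sketch; the "100 perf units at stock" baseline is an assumption, the ~200 W stock / ~110 W undervolted / ~10% loss numbers come from the post):

```python
# Perf-per-watt comparison for the quoted 3060 Ti undervolt:
# stock ~200 W, undervolted ~110 W at ~10% performance loss.
def perf_per_watt(perf_units: float, watts: float) -> float:
    return perf_units / watts

stock = perf_per_watt(100, 200)       # 0.50 units/W at stock
undervolted = perf_per_watt(90, 110)  # ~0.82 units/W undervolted
print(round(undervolted / stock, 2))  # 1.64x the efficiency for 10% less perf
```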

      • 2 years ago
        Anonymous

        A UPS isn't made for you to keep using your PC, moron, it's made to buy time so you can shut down in a safe manner.

        • 2 years ago
          Anonymous

          My pc draws like 300w under load, so yes i can play for an hour on my ups
          moron

      • 2 years ago
        Anonymous

        Just don't live in a third world shithole where you need a ups.
        Simple as

        • 2 years ago
          Anonymous

          >Just move to a first world place where they call you names and everybody hates you for not being full white
          no

          • 2 years ago
            Anonymous

            You have never stepped foot for even a second in a first world country. Hint, it's nothing like the moronic shit you see on here or the rest of the internet.

            • 2 years ago
              Anonymous

              Ganker is the true final reality show, anon. I have even seen americans disgusted to touch the hand of brown people, irl, you can't convince me otherwise.
              I even have a place where people from other countries can stay for a few nights and few of them are not racist.

              • 2 years ago
                Anonymous

                And I've lived in the USA my whole life, in the "worst" state for racism, Georgia, while being black. The only place that treated me in a half racist way was a hotel shuttle driver in Wisconsin who refused to drive my family to a wedding until his manager talked to him.

          • 2 years ago
            Anonymous

            >>Just move to a first world place where they call you names and everybody hates you for being white
            FTFY

            • 2 years ago
              Anonymous

              That's fiction.
              Everybody is white, you white kid.
              >BUT LE MEDIA
              The media lies and both the right and left hate blacks and spanish homosexuals.

              • 2 years ago
                Anonymous

                >everybody is white
                what is California
                what is Rust Belt
                what are all the big agglomerations

              • 2 years ago
                Anonymous

                why are you in those places if you are white

              • 2 years ago
                Anonymous

                I accept your concession

              • 2 years ago
                Anonymous

                If you say so, I am never going to muttmerica anyway

              • 2 years ago
                Anonymous

                Depends on where you are. I occasionally surf at Huntington Beach and it's like 70% white people, 20% asians, and the rest are hispanic/latino or whatever.

              • 2 years ago
                Anonymous

                Even London isn't majority "white" lol, big western cities are diverse multi culti tutti frutti

              • 2 years ago
                Anonymous

                there are more whites than any other race, that's why it's called "minority". gays

              • 2 years ago
                Anonymous

                "whites" don't have a future demographically in "white" countries (or anywhere really)
                as I've said, in many big western cities "whites" are just another minority

              • 2 years ago
                Anonymous

                >there are more whites than any other race, that's why it's called "minority"
                Let's suppose whites are the majority with 40% of population, that leaves 60% to non-whites. So on the streets you see more non-whites than whites.

    • 2 years ago
      Anonymous

      >it's normal to need a 1200w PSU just so you don't get random shutdowns from power spikes

      >from a single GPU

      no

    • 2 years ago
      Anonymous

      use your brain anon, what other reasons besides having to pay more overall are there? you can do it, you're a big boy

    • 2 years ago
      Anonymous

      Do you think coolers just magically destroy the heat they absorb from chips? No, it has to go somewhere, and that somewhere is into your room
      >I have AC
      If you use whole house AC, every room but the PC room is too cold if you blast it, but the AC can't keep up with the heat output if you don't blast it.
      And not all houses/rooms are able to use a standalone AC, for a number of reasons that are not always related to "poorgay"

      • 2 years ago
        Anonymous

        just get a fan or open your window. next are you going to tell me that your room has no windows and you don't have enough outlets for a fan?

        • 2 years ago
          Anonymous

          >having 800W heater in a room
          >with no AC
          >in summer
          >just open a window bro
          I wish you lived like that for the dumbassery you speak.

      • 2 years ago
        Anonymous

        >If you use whole house AC
        why would anyone do this

    • 2 years ago
      Anonymous

      transient spikes and heat buddy

    • 2 years ago
      Anonymous

      >it's normal
      actually the opposite is true, the rise in power usage is abnormal, that's why so many people are complaining about it

    • 2 years ago
      Anonymous

    yea idk what all this shittalk is about, the most powerful card will be 450w, it's not even that much

  31. 2 years ago
    Anonymous

    They're gonna announce a xx95 series soon right?
    just move the old xx90 series up one tier and charge more money like how intel did with their i9

  32. 2 years ago
    Anonymous

    That means they know these things will be extremely limited, and the ones that make it out in public will be scalped to oblivion.

    They are withholding the rest of the lineup until AyyyMD shows their hand.

    Expect a 4080 "Super" and so forth.

    • 2 years ago
      Anonymous

      With the amount of corporate espionage that's going on, I'm certain nvidia already knows what AMD is going to offer and just makes these moronic super high power cards to keep the "gaming crown". Hopefully their 70 and 60 class cards are less moronic. Since AMD is rumored to still use 6nm for their lower class cards, nvidia might actually be more efficient in the low end if they also use 4nm.

  33. 2 years ago
    Anonymous

    Just buy a 30xx now then when it's cheap. 40xx is just a refresh.

    • 2 years ago
      Anonymous

      This is cope. RTX 40 series is THE generation to get this decade

      • 2 years ago
        Anonymous

        >implying
        The greatest AI stuff ever done is just around the corner, and you will be stuck crying on 12GB of VRAM unable to use any of it.

      • 2 years ago
        Anonymous

        30 series, if it had actually sold at real prices outside of day 1, would have been that. 40 series has no official prices yet, but it's likely going to be even higher than the current 30 series prices, since they will be on the shelf together and Nvidia won't want a 3080 and 4080 costing the same, or the 30 series would stop selling.

        • 2 years ago
          Anonymous

          this is what killed my interest in the 4000 series

          they can't get rid of the 3000 series fast enough and they won't drop the prices to make way for 4000 series cards so what we'll have is 2 year old cards at MSRP and 4000 series cards sold at a premium

  34. 2 years ago
    Anonymous

    I can only imagine what the 6080 is going to look like.

    • 2 years ago
      Anonymous

      Let's just hope they have multi-die working by then, otherwise they'll keep making ever-bigger dies that they'll have to run at higher and higher frequencies and voltages just to beat AMD.

  35. 2 years ago
    Anonymous

    There's no games that necessitate an upgrade

  36. 2 years ago
    Anonymous

    What drives companies to pull shit like this?

    • 2 years ago
      Anonymous

      >What drives companies to pull shit like this?
      dumb consoomers who kept buying cards relentlessly while they were still 2-3x the price

    • 2 years ago
      Anonymous

      shareholders
      >you only made 582 gorillion last quarter! we expect 900 gorillion at least!

  37. 2 years ago
    Anonymous

    >Buying the card immediately after the generation's big leap forward.
    Why? They always suck.

  38. 2 years ago
    Anonymous

    I don't care about 4k, so I will simply buy a 3000 series card and be happy.

  39. 2 years ago
    Anonymous

    Daily reminder that these new cards require a new PSU with the new ATX 3.0 standard. Otherwise, expect your PC to shut down every time you play a game, plus expect a housefire, because current PSUs cannot handle even 3090 loads and spikes.
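    For anyone actually sizing a PSU around this, a rough rule of thumb looks like the sketch below. The 2x spike factor and the 250W rest-of-system figure are assumptions (transient excursions on recent high-end cards have been measured at roughly double board power, which is the kind of excursion ATX 3.0 supplies are specced to ride through); real requirements depend on the specific card and PSU.

```python
# Back-of-envelope PSU sizing sketch. spike_factor=2.0 is an assumption,
# not a spec value for any particular card.

def min_psu_watts(gpu_tdp_w, rest_of_system_w=250, spike_factor=2.0):
    """Worst case: the GPU spikes while the rest of the system is at full load."""
    return gpu_tdp_w * spike_factor + rest_of_system_w

for tdp in (285, 320, 450):
    print(f"{tdp}W GPU -> want roughly a {min_psu_watts(tdp):.0f}W-capable PSU")
```

    An ATX 3.0 unit rated for the steady-state load can absorb these excursions by design, which is the whole point of the new standard.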

  40. 2 years ago
    Anonymous

    All I want is a decent 4060 (Ti)
    >12GB
    >192-bit bus
    >Under 220W
    >Under $400

    Likelihood: 0%
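    For what it's worth, the memory bandwidth implied by that wishlist spec is easy to work out. The per-pin speeds below are common GDDR6 bins, not anything leaked for an actual 4060:

```python
# Memory bandwidth from bus width and per-pin data rate:
# bandwidth (GB/s) = bus width (bits) * data rate (Gbps per pin) / 8

def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

for speed in (14, 16, 18):  # typical GDDR6 speed bins
    print(f"192-bit @ {speed} Gbps -> {bandwidth_gb_s(192, speed):.0f} GB/s")
```

    So a 192-bit card lands in the 336-432 GB/s range depending on the memory bin, which is why bus width matters as much as the GB number on the box.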

  41. 2 years ago
    Anonymous

    will probably buy a 3080ti in november if prices drop down enough

  42. 2 years ago
    Anonymous

    >9080 gaming room

  43. 2 years ago
    Anonymous

    >change name
    >now have to pay more money for the same shit
    kek

  44. 2 years ago
    Anonymous

    What do I do if I just want to play some f2p shit and maybe an AAA now and again? My 1070 still runs okay but I want to be prepared. Not in terms of exact parts but like, what do I need overall? How much should I budget? 2k USD? 3? It's solely vidya I'll be playing. Not looking for 4k or anything, just 1080p60. It's all I really want or need.

  45. 2 years ago
    Anonymous

    I'm not a poor gay. I make plenty of money. I could buy a $4000 PC today if I wanted to upgrade. But there are no new games that are worth a new system. Seriously, where the frick are the games? I guess Dragon's Dogma 2 will be nice. I'll just stick with my 3080 and keep playing these 5 year old unoptimized, spaghetti coded pieces of shit.

    • 2 years ago
      Anonymous

      You can play DD2 with a 1070 most likely. Nips can't make graphically demanding games to save their lives.

      11GB of VRAM is insufficient at 1440p without downgrading textures to early 8th gen levels. Have you even played a recent game? Is this the part where you cope by stammering out a "w-well, new games suck anyway"?

      I just fricking played RDR 2 on ultra settings at 1440p with my 3080 12 gb and it never went over 5 gb vram usage. quit lying Black personboy.

  46. 2 years ago
    Anonymous

    How bad is the VRAM going to be on the 4070? I want to get into VR.

    • 2 years ago
      Anonymous

      10gb to cuck you

      If AMD's stuff is good they'll bump it to 12gb and the 4090TI will have 32gb and the 4080TI will have 20gb

  47. 2 years ago
    Anonymous

    My 1660 ran Elden Ring and will probably run Starfield. Guess I won't upgrade for now.

  48. 2 years ago
    Anonymous

    So uhh could the 4080 16gb do 4k120hz?
    t. OLED TV owner

    • 2 years ago
      Anonymous

      Sure, but since modern games all have shit optimization you will need to run everything at low settings.

    • 2 years ago
      Anonymous

      I run anything i want 4k120 with a 3060ti
      I don't give a shit about cyberpunk or whatever modern shit though

    • 2 years ago
      Anonymous

      my laptop with basically a 6700XT does 4K120 on an LG OLED
      Do research into how demanding your games are and what you need to run them at 4K120

  49. 2 years ago
    Anonymous

    bastards will kill my wallet

  50. 2 years ago
    Anonymous

    The rumoured cards were way too powerful. They probably couldn't get the watt spikes under control either. Given energy costs more than gold now, they figured it best to drop the original 4080. Good thing for me too 'cause my 3070 will last longer now tbh. It does suck, however, that they will charge more for what they were previously gonna sell as a 4070, but that's njudea for you.

  51. 2 years ago
    Anonymous

    [...]

    sorry I am not black, so you are missing the shot by a long distance here.

  52. 2 years ago
    Anonymous

    that's cool but I'm still sticking to my 1080ti

  53. 2 years ago
    Anonymous

    Maybe if amd weren't such a fricking garbage dump of a company we wouldn't be forced to buy nvidia

  54. 2 years ago
    Anonymous

    [...]

    I thought you people also hated browns? now they don't exist, only whites and blacks? you make no sense.

  55. 2 years ago
    Anonymous

    I don't care about the 4000 series but can I at least expect the price of the 3060 to drop in the coming months?

    • 2 years ago
      Anonymous

      no
      You will pay gorillions for shit hardware and you will be happy

  56. 2 years ago
    Anonymous

    Why should I care what they call it? I'll buy one (or not) based on its price and its performance relative to the alternatives (previous gen, AMD).

  57. 2 years ago
    Anonymous

    >upgraded to a 3080Ti last month
    guess I'm sticking with this bad boy for the next 5-6 years

  58. 2 years ago
    Anonymous

    >Laughs in 3070

  59. 2 years ago
    Anonymous

    I'll just buy a cheap 3060 or a 3070 when 4xxx drops.

    • 2 years ago
      Anonymous

      they won't be so cheap if the performance increase is a measly 10%

  60. 2 years ago
    Anonymous

    >tfw got a 6900xt at retail price when they came out
    >still get 90+ fps on max settings on everything

    No point in upgrading until I can't maintain 60 fps

  61. 2 years ago
    Anonymous

    30xx is a 40xx beta test
    now we will see true 4k / raytracing

  62. 2 years ago
    Anonymous

    What's the AMD equivalent of a 3060 ti/3070?

    Are they doing better in power consumption/heat?

    • 2 years ago
      Anonymous

      Tbh get the 6800 non-XT. Both the 3070 and 6700XT are at 230W while the 6800 is at 250W, but it performs 25-35% better than the 3070 and also beats the 3070ti, which uses 290W.

    • 2 years ago
      Anonymous

      RX 6700XT/RX 6800
      they are priced (well at MSRP, real world price is a shitshow) higher but also do notably better in rasterization + more VRAM

  63. 2 years ago
    Anonymous

    sirs ive got an RTX 3090 that i bought to test. i can send it back within 5 days and get the full price back. should i return it? im thinking yes

    • 2 years ago
      Anonymous

      Spending $1000 on a card when in 2 months you'll get a better card for the same or slightly cheaper seems bad. I'd say yes

  64. 2 years ago
    Anonymous

    Maximum greed

  65. 2 years ago
    Anonymous

    i ordered a 3080 ti

    deep down I wanted a cooler 3070 with lower temps and fewer watts

  66. 2 years ago
    Anonymous

    This isn't the 4070. The 4070 had a TDP of 285 watts and lower specs. It will probably release next year. You won't get one of these cards before then anyway, as it will be a paper launch until they get rid of the 30 series.

  67. 2 years ago
    Anonymous

    >3080Ti
    >32GB 3600
    >Ryzen 7 5800X3D
    >1440p 165Hz
    >4K TV for couch single player

    It's almost like i care for those fricking new scammer cards.

    • 2 years ago
      Anonymous

      and all that power to play the shitposting game

      • 2 years ago
        Anonymous

        I'm very good at it

  68. 2 years ago
    Anonymous

    Let me know when I can get a card that can do 4k at 60 fps in AAA games, 4k 144 fps in competitive games, doesn't draw too much power, and is $300.

    Otherwise, not interested.

  69. 2 years ago
    Anonymous

    >970
    >4670k
    >still no game i need to upgrade for
    Maybe that harry potter game will be good.

    • 2 years ago
      Anonymous

      I'm on a 4790k and gtx 970 at 1080p/60fps and I've yet to run into a game that I can't run on max settings

      • 2 years ago
        Anonymous

        I can't max resident evil 2 remake, anon

  70. 2 years ago
    Anonymous

    Don't care about any of this, I will not buy a GPU with more than 200W TDP

  71. 2 years ago
    Anonymous

    israelitery aside, are we hitting the limits of thermodynamics or is Nvidia just lazy?

    • 2 years ago
      Anonymous

      Both
      >16K shaders (ALUs)
      >2.5GHz-3GHz
      That was always going to draw insane amounts of power, no way around it

      However, nvidia builds massive monolithic chips. Those chips are highly likely to suffer from silicon defects, which means the amount of voltage (thus power) needed to push them to those high clock speeds is increased. AMD is moving to chiplets, where they basically split the single chip into many, smaller sub-chips with each part of the GPU on them. smaller chiplets = much easier to get a strong bin/low defect chip = less voltage required = less power consumed.
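      The binning argument can be illustrated with a toy Poisson defect-yield model. The defect density and die areas below are made-up round numbers for illustration, not foundry data:

```python
import math

D = 0.001  # assumed defect density in defects per mm^2 (illustrative only)

def die_yield(area_mm2, d=D):
    """Poisson model: probability a single die of this area is defect-free."""
    return math.exp(-d * area_mm2)

# One 600 mm^2 monolithic die: a defect anywhere scraps the whole GPU.
mono = die_yield(600)

# The same 600 mm^2 split into four 150 mm^2 chiplets: a defect only
# scraps the chiplet it lands on, and good chiplets can be binned
# together across the wafer, so the usable-silicon fraction is the
# per-chiplet yield rather than the all-four-good probability.
per_chiplet = die_yield(150)

print(f"monolithic usable fraction:  {mono:.1%}")        # ~54.9%
print(f"per-chiplet usable fraction: {per_chiplet:.1%}") # ~86.1%
```

      Same total silicon area, but the chiplet design throws away far less of the wafer, and the surviving chiplets can be binned for lower voltage.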

  72. 2 years ago
    Anonymous

    >4070 is now the 4080 12GB
    >4080 16 GB the same
    >4090 is untouched
    could be worse

  73. 2 years ago
    Anonymous

    Can't imagine anything more cringe than spending $1000+ on a PC part to play your little video games with

    Get a life you fricking loser

    • 2 years ago
      Anonymous

      you're on the videogames board of Ganker, you're either on the wrong website or you failed as a normalgay and are still coping about it

    • 2 years ago
      Anonymous

      sounds like poorgay cope

  74. 2 years ago
    Anonymous

    kek, they have done this so many times in prior gens that you are probably paying XX90ti prices for what actually would have been a low-midrange piece of shit
