No more nVIDIA GPUs.

>Nvidia is no longer a graphics company

https://www.digitaltrends.com/computing/nvidia-said-no-longer-graphics-company/

>Greg Estes, the vice president of corporate marketing at Nvidia, said: "[Jensen] sent out an email on Friday evening saying everything is going to deep learning, and that we were no longer a graphics company. By Monday morning, we were an AI company. Literally, it was that fast."

  1. 5 months ago
    Anonymous

    Gaming = AMD / Radeon / ATI
    This is a fact since 2004

    • 5 months ago
      Anonymous

      >Gaming = AMD / Radeon / ATI
      >This is a fac-ACK

      • 5 months ago
        Anonymous

        works on my machine

      • 5 months ago
        Anonymous

        The biggest annoyance is that all my custom tuning settings get erased and I have to load my exported profile whenever there's a driver timeout. Happens about once a week, which I assume is because my undervolt isn't completely stable.

    • 5 months ago
      Anonymous

      Radeon isn't even good now, to say nothing 10 years ago

      • 5 months ago
        Anonymous

        R9 280x 3GB is way better than the shit nvidia was selling in 2014.

        • 5 months ago
          Anonymous

          No it wasn't. I owned one. It was literally a rebrand of the HD 7970. It didn't even get FreeSync support.

    • 5 months ago
      Anonymous

      >graphic card that is great at games but can't do AI
      >graphic card that is pretty good at games, and also does AI
      ATI has no chance

    • 5 months ago
      Anonymous

      fippy bippy

    • 5 months ago
      Anonymous

      Based. Pic related is still the greatest video card ever made and nobody will ever change my mind.

    • 5 months ago
      Anonymous

      nothing epitomizes nu-gaming so well as two pajeet shill farms fighting a digital turf war over the soulless megacorps which hired them.

  2. 5 months ago
    Anonymous

    what do the brackets mean in this context: [Jensen]
    I've seen it plenty before on other articles but I can't deduce it

    • 5 months ago
      Anonymous

      The writer filling in or replacing words from a quote to provide context

      • 5 months ago
        Anonymous

        that's what it means yes but the sentence doesn't make sense that way

        The person didn't directly say Jensen, but that is who he is referring to.
        [...]
        >Linux gaming

        like replace Jensen with "he" and the sentence still makes sense. But I've seen other examples where paraphrasing isn't possible.

        In a real conversation, you'll mention a person's name once and then continually refer to that person as "they", "them", "him", "her", etc. But if you are trying to quote that person from halfway through that conversation and don't need the full context, you replace these pronouns with the person's name [Like This].

        It's basically adding context.

        maybe I'm just schizoing but yes, that's what it means and how it works. but I can swear I've seen examples of something like that not being possible while the writer still uses brackets

        • 5 months ago
          Anonymous

          or replace Jensen with "Huang"

          or "Long Dong Huang"
          or "Ching Chong Huang"

        • 5 months ago
          Anonymous

          You are schizoing. Here's an example of how it works.

          The interview: "Moot? Yeah, I know him well. He's a bit of a homosexual. Did you know he got fired from Google for owning 12 dakimakuras?"

          The quote: "Did you know [Moot] got fired from Google for owning 12 dakimakuras?"

          • 5 months ago
            Anonymous

            the people in my walls thank you for that example
            >"Did you know [Moot] got fired from Google for owning 12 dakimakuras?"
            somehow this makes no sense to me unless I think about it 5 times
            like if you read the article, paragraph or whatever, it's quite obvious we are talking about moot. putting the context in brackets makes it a bit redundant and *for me* more difficult to follow.
            also I just realized, it's a recent phenomenon that I've seen the brackets being used (imo) inappropriately. I'm 100% sure I've read some AI-generated articles, which is why it feels wrong to me

            >enter thread
            >10 out of 17 replies are discussing grammar

            it's a garbage thread anyways, the anons helping me are doing a great thing (unlike this thread)

            • 5 months ago
              Anonymous

              Most people aren't going to read the entire article. And most articles aren't going to print an entire interview verbatim.

              • 5 months ago
                Anonymous

                >Most people aren't going to read the entire article
                yeah, that makes sense, they just read the header and that's all

                > also I just realized, it's a recent phenomenon that I've seen the brackets being used (imo) inappropriately,
                nope, been used for decades.
                You probably only just started noticing it.

                I don't mean the use of brackets, I mean the inappropriate use of them

                anyways I'm gonna shut up now, if I see it again I'll just make a thread about GPUs or something

            • 5 months ago
              Anonymous

              > also I just realized, it's a recent phenomenon that I've seen the brackets being used (imo) inappropriately,
              nope, been used for decades.
              You probably only just started noticing it.

            • 5 months ago
              Anonymous

              >also I just realized, it's a recent phenomenon that I've seen the brackets being used (imo) inappropriately

              well, I don't know what context you have been seeing it in, but from my experience, I've seen people use brackets as their way to insert their own editor's notes. kinda like the shit you see in anime fan translations.

              • 5 months ago
                Anonymous

                >brackets as their way to insert their own editor's notes
                maybe that's it, and that's why it's so confusing to me. it clearly wasn't just paraphrasing a contextual person/name like the proper usage

        • 5 months ago
          Anonymous

          without the bracket, the og sentence would read
          >He sent out an email on Friday evening saying everything is going to deep learning
          and you wouldn't have a clue who is being talked about

          • 5 months ago
            Anonymous

            >and you wouldn't have a clue who is being talked about
            Wrong. Autists know.

            • 5 months ago
              Anonymous

              I plug my controller to port 2, controlling Sally, and look in the box

        • 5 months ago
          Anonymous

          Then maybe you should bring that up where you think it's happening, and not expect anyone to care about your dumb ass swearing you've seen it... somewhere.

    • 5 months ago
      Anonymous

      The person didn't directly say Jensen, but that is who he is referring to.

      You should be using AMD on Linux anyway.

      You either support freedom or you're Black person cattle.

      >Linux gaming

      • 5 months ago
        Anonymous

        >Linux gaming
        Minecraft runs so much better on Linux that it's not even funny.
        Plus Linux runs Crysis as well as Windows, your argument is invalid.

        • 5 months ago
          Anonymous

          Linux runs highly popular games from more than 10 years ago? No way. . .

          • 5 months ago
            Anonymous

            It is 2023, GNU/Linux based Gaming OSes often run new game releases better than the compatibility layer in Microsoft's legacy OS for business.

        • 5 months ago
          Anonymous

          That's great but I'll still only be able to use 10% of my library. Call me when it's closer to 80%. Emulating windows all the time is more moronic than just using windows.

          • 5 months ago
            Anonymous

            >Emulating windows all the time is more moronic than just using windows.
            Ironically modern Windows is simply 'emulating' Windows in the exact same way that GNU/Linux based Gaming OSes are 'emulating'.
            It is exactly why performance is already often on-par, and why at least for my library I have a single digit number of games that won't work.

          • 5 months ago
            Anonymous

            >Call me when it's closer to 80%.
            *calls you*
            (Gold means it works perfect but you need to adjust something, silver means it works good enough with minor issues)

            • 5 months ago
              Anonymous

              >Gold means it works perfect but
              So it doesn't run perfectly, and there is always the risk that a scene will not play and you need to go back, waste time troubleshooting, and load back your save just for that one scene. Why even go through all that when you could just run Windows like a normal person? There are literally no upsides.

            • 5 months ago
              Anonymous

              Another way to interpret that
              Platinum - better than Windows 7
              Gold - Better than Windows 10
              Silver - Better than Windows 11

              The fact people pretend Windows is still Windows and that games just work is laughable.

              • 5 months ago
                Anonymous

                Works on my machine

            • 5 months ago
              Anonymous

              Do Linuxafgs really think this image is able to convince anyone that Linux is good enough for gaming?

              • 5 months ago
                Anonymous

                They gotta grasp at every straw they can. Here is the reality btw. They basically install a contrarian's OS to mostly play normalgay games.

              • 5 months ago
                Anonymous

                That's so fricking pathetic holy shit. And the worst part is that they do it for free too. Reminds me of a particular group of individuals.

              • 5 months ago
                Anonymous

                Ranking everything in the Steam Catalog is a massive effort, and just because it hasn't been ranked doesn't mean it doesn't work.
                It only looks bad until you realize nobody is checking compatibility with the legacy compatibility layer in modern Windows at all.
                There is no similar resource, perhaps because there is no maintenance of it. No point checking compatibility if nobody can fix it when it's broken.
                Developers on modern Windows are left to work around compatibility issues on their own.

                It's the whole reason we now have the comical situation where developers and users drop DLLs developed for Linux gaming into their Windows games to get them working.
                Hell, even Intel basically gave up and just used DXVK for their Windows driver.
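
                The "drop in DLLs" trick is literally a file copy: put DXVK's Direct3D DLLs next to the game's .exe so Windows loads them ahead of the system ones. A hedged sketch (the directory layout and the choice of d3d11.dll/dxgi.dll are assumptions for illustration; real DXVK releases ship several DLLs in x32/x64 subfolders):

```shell
#!/bin/sh
# Sketch: copy DXVK's D3D11/DXGI DLLs into a game's install folder.
# Windows resolves DLLs from the exe's own directory first, so the
# copies shadow the system implementations.
install_dxvk() {
    dxvk_dir="$1"   # extracted DXVK DLLs (e.g. dxvk-x.y/x64) -- hypothetical path
    game_dir="$2"   # directory containing the game's .exe
    for dll in d3d11.dll dxgi.dll; do
        if [ -f "$dxvk_dir/$dll" ]; then
            cp "$dxvk_dir/$dll" "$game_dir/"
        fi
    done
}
```

                Deleting the copied DLLs rolls the game back to the stock driver path.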

              • 5 months ago
                Anonymous

                That's so fricking pathetic holy shit. And the worst part is that they do it for free too. Reminds me of a particular group of individuals.

                It makes sense when you think about it. Linuxgays spend more time on ERPing on Ganker and Discord than they do actually playing vidya.

              • 5 months ago
                Anonymous

                [...]
                It makes sense when you think about it. Linuxgays spend more time on ERPing on Ganker and Discord than they do actually playing vidya.

                That's so fricking pathetic holy shit. And the worst part is that they do it for free too. Reminds me of a particular group of individuals.

                >the mere existence of linux causes this much seethe
                why? I can feel the anger behind your keyboards. if you don't like it just don't use it

              • 5 months ago
                Anonymous

                That's so fricking pathetic holy shit. And the worst part is that they do it for free too. Reminds me of a particular group of individuals.

                >noooo they didn't test every shovelware assetflip money laundering scheme game
                How many games in your library aren't listed?

              • 5 months ago
                Anonymous

                >b-but ...
                KEK

              • 5 months ago
                Anonymous

                How many?

            • 5 months ago
              Anonymous

              >perfect, but
              lol

              • 5 months ago
                Anonymous

                Your PC works perfect but you need to plug it in first.

        • 5 months ago
          Anonymous

          >Minecraft

    • 5 months ago
      Anonymous

      he didn't ask for this

    • 5 months ago
      Anonymous

      In a real conversation, you'll mention a person's name once and then continually refer to that person as "they", "them", "him", "her", etc. But if you are trying to quote that person from halfway through that conversation and don't need the full context, you replace these pronouns with the person's name [Like This].

      It's basically adding context.

    • 5 months ago
      Anonymous

      the writer is adding context. the name wasn't in the quote, but without the name, the quote wouldn't make much sense on its own. let me give an example
      >but [Anon] can't deduce it

    • 5 months ago
      Anonymous

      it's when you are adding words to a quote to give context or make it make sense.
      if they leave out a word due to a typo and the quote doesn't make sense, you can add [some words] so it's coherent; other times it's just to be clear who/what it's talking about from an excerpt
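
      The convention being described can even be sketched mechanically. A toy example (the function name and the regex approach are my own for illustration, not any publication's actual tooling) that swaps a pronoun in an excerpt for a bracketed name:

```python
import re

def add_bracket_context(excerpt: str, pronoun: str, name: str) -> str:
    """Replace the first occurrence of a pronoun in a quoted excerpt
    with the referent's name in square brackets, mimicking the
    editorial convention for quotes lifted out of context."""
    # \b anchors keep "he" from matching inside words like "the".
    pattern = r"\b" + re.escape(pronoun) + r"\b"
    return re.sub(pattern, f"[{name}]", excerpt, count=1)

# The interview line, quoted out of context:
print(add_bracket_context("He sent out an email on Friday evening", "He", "Jensen"))
# → [Jensen] sent out an email on Friday evening
```

      Real editors do this by judgment rather than regex; the point is only that the bracketed text stands in for a referent that was already clear in the full source.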

  3. 5 months ago
    Anonymous

    You should be using AMD on Linux anyway.

    You either support freedom or you're Black person cattle.

  4. 5 months ago
    Anonymous

    it's over

  5. 5 months ago
    Anonymous

    all the consoles use amd cards anyway

    • 5 months ago
      Anonymous

      >nogaems5 and Flopbox use AMD GPUs
      >majority of games are designed from the ground up for consoles, with PC as a side note
      >AMD GPUs on PC constantly have driver issues that the consoles don't
      how does that work?

      • 5 months ago
        Anonymous

        Just like there are drivers for AMD hardware exclusive to GNU/Linux based OSes, game consoles aren't running the Windows drivers.

      • 5 months ago
        Anonymous

        >driver issues
        Tech-illiterate morons parrot this from watching too many shills on youtube.
        Reminder that 99% of people are utter morons.

        • 5 months ago
          Anonymous

          Yeah, it's so fun never knowing if the next driver update is going to be better or worse. I remember how VR performance got worse over time with the 6900xt, which was less than half the price of the 4090 I ended up getting. Do you remember the 5700? The 480? No CUDA support means their GPUs are permanently DOA for my PC, but for people who don't use VR or more than one monitor, the price is worth sticking with a stable driver for the games you play and never updating them.

          • 5 months ago
            Anonymous

            You don't remember shit, tech-illiterate homosexual.
            If you don't know how to read and install/uninstall drivers or make a proper multimonitor setup, you're nothing but a casual user. Your opinions are worthless.

      • 5 months ago
        Anonymous

        The majority of AMD driver complaints come from early adopters of a new GPU generation. After six or so months most of those issues are resolved, and it's up to the developer not to shit the bed in a game update, which at that point, though rarely, can also affect NVIDIA GPUs. Way more common for live service games in general.
        >inb4 "holy AMD cope"
        I own an RTX 4070 lol

  6. 5 months ago
    Anonymous

    don't care
    still buying the 4070

    • 5 months ago
      Anonymous

      Mine's arriving Wednesday.

      • 5 months ago
        Anonymous

        don't care
        still buying the 4070

        >$550 gpu with 12GB vram
        ROFL just like cuckolds that bought 3070 8GB and 3070ti 8GB

  7. 5 months ago
    Anonymous

    So AMD will just boost their prices to NVIDIA levels. Or do they mean to focus all in on DLSS?

  8. 5 months ago
    Anonymous

    >enter thread
    >10 out of 17 replies are discussing grammar

    • 5 months ago
      Anonymous

      grammar is more important than whatever bullshit nvidia is doing.

    • 5 months ago
      Anonymous

      so 2k for the 5090?

      we are on reddit my fellow Ganker user

  9. 5 months ago
    Anonymous

    >Novidia

    Microshit is next; they already implemented AI in the new Windows, and it will probably be all AI by the next iteration. They'll extend these services to Xbox so that any normie can game using the power of their network to stream games from their own PCs. This way they don't have to develop consoles anymore and can simply charge you more for subscribing and buying digital games (renting). They probably already partnered with Nvidia for it.

    • 5 months ago
      Anonymous

      Sadly, that's a best case scenario for where the gaming industry and the personal computer space will be in five years.

    • 5 months ago
      Anonymous

      If they truly give up on gaming, they're going to be bitten when inference compute requirements drop dramatically, the way technology always does. It's already happening that you can run reasonable LLMs locally.

      >Novidia
      lel

    • 5 months ago
      Anonymous

      Oh boy, just wait until Microsoft unveils Windows in the cloud; I don't think you realize how close it is. It'll work too, I still remember how fast people just took to Office 365.

  10. 5 months ago
    Anonymous

    I'm not gonna read this article. Is this real or made up? They would cut profits heavily if no one is buying their hardware.

    • 5 months ago
      Anonymous

      It would be interesting to see them pull a hard pivot out of the discrete hardware market, but it's going to leave one hell of a vacuum in that space. I do find it kind of hilariously petty how he mentions Intel's ARC efforts and AMD in the article. He seems to fail to mention that AMD completely ate their lunch in the integrated market, since their chips are powering all three of the current consoles, and likely will for the foreseeable future, especially since AMD seems to be more amenable to open drivers, warts and all, and an unsurprising number of thin clients use AMD thanks to their solid iGPUs providing better performance per watt compared to Intel's offerings.

      It seems to be legit, but it's also unsurprising. Despite the weight laid on them as a GPU manufacturer, they lack one major factor that both of their rivals have, a CPU division, and if they think the future is going to be AI and cloud computing, then it's a smart move. It's a shame I've seen this whole thing repeated a half dozen times from memory since the rise of broadband internet, and it never seems to end well. Remember Stadia? OnLive? Any of the other big interactive streaming services that were going to revolutionize the market, only to fall flat? How many people actually use Xbox, Playstation, or Geforce's mobile game streaming services? Do they really think that AI is going to somehow magically fix the problems that the technology has historically had since its inception? It'll be interesting to see how it plays out in any case. On the off chance they do pivot out of discrete GPU manufacturing, it'll probably put AMD in their position, with Intel becoming the new definitive entry-level budget-friendly option as their knowledge base for GPU programming and driver coding improves.

      • 5 months ago
        Anonymous

        Microsoft seems like it may be preparing to throw Nvidia a bone by making one last push to get consumer-grade Windows hardware onto ARM, with an Nvidia Windows-on-ARM system seeming likely. But that system would completely lose the existing PC gaming library, and would likely only play a limited number of Xbox games (or all the new Xbox games, if Nvidia gets the next Xbox). To be fair, PC gaming has never been a real focus for Microsoft; Windows is actually bad at 'Windows gaming', and Microsoft has been openly talking about dropping what it sees as 'legacy' support for years now. This is why Valve has been so freaked out about getting GNU/Linux based OSes adopted for PC gaming.

        • 5 months ago
          Anonymous

          That's very interesting to know, and does make a lot of sense. I know MS was very bullish on ARM for a while, since as you said they're trying to move past the legacy instruction set of x86. It would be a shame to see MS exit the gaming market entirely; even though I've only owned an xbox once for about a month, GP is a boon for PC gamers who want to try out a lot of games for a low rental price. If things are headed that way, it seems like Valve is going to need to move up whatever plans they have for standardizing SteamOS for desktop to ease the process of migrating PC gamers over to a new standard. As it stands, much as I love GP, it might be worth it for me to try switching over in full to something like ChimeraOS, especially since I've recently jumped back from team green to team red for my GPU.

          • 5 months ago
            Anonymous

            The rumors I've heard say that the big holdup on a general public release of SteamOS is Nvidia. Valve knows that the minute they officially release SteamOS 3, a bunch of users (and review sites) are going to run off and immediately install it on their PCs, and most of those PCs have Nvidia hardware.
            A lot of that hardware is still pre-Vulkan, Nvidia's own driver support is terrible, and NVK (the open source Nvidia Vulkan driver) isn't up to speed and only works on Nvidia's newest hardware.

            So basically don't expect Valve to ship out an installable SteamOS for general users any time soon. Maybe expect some more Steam Deck style products. One rumor that seems credible is a new ~$500 Steam Machine using a more powerful AMD SOC that can act as a living room gaming PC, wireless VR base station, or desktop PC. Although you can get the same basic user experience with ChimeraOS or a number of other modern GNU/Linux based Gaming OSes.

            • 5 months ago
              Anonymous

              Yeah, that tracks. Nvidia has historically been awful with Unix-based anything. That second rumor could be very interesting though, and seems very plausible if they want to make a second push for Steam Machines while also advancing the VR space that they've made a decent investment in.

  11. 5 months ago
    Anonymous

    Please God be true...

    • 5 months ago
      Anonymous

      AMD becoming a graphics monopoly would be a bad thing.

      >Novidia

      Microshit is next; they already implemented AI in the new Windows, and it will probably be all AI by the next iteration. They'll extend these services to Xbox so that any normie can game using the power of their network to stream games from their own PCs. This way they don't have to develop consoles anymore and can simply charge you more for subscribing and buying digital games (renting). They probably already partnered with Nvidia for it.

      Grim future.

      • 5 months ago
        Anonymous

        Because the drivers for AMD hardware have been developed in the open, Intel has been able to rapidly improve.
        Nvidia is just facing reality, MI300 shows that there is no future for discrete GPUs. They don't want to anchor their share price to the discrete Windows gaming card market.
        Especially when Microsoft is almost certain to exit the consumer market, and Nvidia is badly behind on modern GNU/Linux based Gaming OSes.

        • 5 months ago
          Anonymous

          >Microsoft is almost certain to exit the consumer market
          The shit you read here...

          • 5 months ago
            Anonymous

            People couldn't believe IBM would stop making the IBM PC either and only sell systems to business.
            What do they really have to lose? It isn't like you ever paid them for Windows.

          • 5 months ago
            Anonymous

            I think he's possibly referring to the upcoming cloud version of Windows. If Microsoft only sells cloud subscriptions of Windows in the future, Nvidia won't be selling high-end consumer GPUs anymore. Unless Linux becomes really big for gaming that is.

          • 5 months ago
            Anonymous

            it's the wet dream of every linux user: they believe microsoft will go cloud-only and linux will take over, when in reality I could see Apple taking over the PC industry, as they already have partnerships with a ton of vendors and manufacturers.
            Linux is still fighting over which package manager is better, and is too busy being stubborn about adding a competent UI across every distro since they think Terminal=GOD.
            A few months ago MS and Nvidia entered a stronger partnership, so whatever Nvidia decides to do will run better on Windows and be ported to other platforms later, or not at all.

            • 5 months ago
              Anonymous

              >when in reality I could see Apple taking over the PC industry
              That would be a problem when Apple no longer produce 'IBM compatible PCs', or support any standard APIs. Their use for PC gaming has been plummeting. It is why more users now run Steam on GNU/Linux based OSes than MacOS.

              >Linux is still fighting over which package manager is better
              Developer focused systems yes, because that is how GNU/Linux based OSes progress. For end-user systems the current accepted solution is an immutable base system, used to run containerized applications. ChromeOS, SteamOS, SilverBlue, they all basically work the same way even if the way they were built and the exact container solution differs. Same way Android looks like the user friendly distributions of the era that saw it fork off.

              • 5 months ago
                Anonymous

                >if
                >but
                >when
                That's a lot of ifs. linux is free and can't be monetized properly; Apple has the upper hand here. I get that you like linux, but there is a point where you have to realize that unless it becomes one simple OS without so many distros, with a competent UI for everyone to use, basically Windows 2, and 99%+ of games are at Platinum status, nobody is going to care or want to deal with the problems.

              • 5 months ago
                Anonymous

                >linux is free and can't be monetized properly
                >Apple has the upper hand here
                How exactly? GNU/Linux is a Free UNIX-like OS, MacOS is a BSD, which is also a free UNIX-like (genetically UNIX) OS.
                MacOS no longer runs on hardware with native binary compatibility with PC games, leaving that to other BSD distributions.
                It also lacks Vulkan, which is the full power API for modern 3D graphics.
                Unlike Metal, which is limited to phones, and Direct3D, which is limited to consoles.

                >unless it becomes 1 simple OS without so many distros,
                From consumers perspective this has already happened. No mainstream user buying a phone, tablet, or handheld cares that Android, ChromeOS, or SteamOS are all Linux distributions.
                This isn't even true for Apple's OS which has at least 4 distinct versions not even counting other BSDs, or Windows which has a half dozen or more.

                >nobody is going to care or want to deal with the problems.
                Seems unlikely, since people already deal with constant problems with modern "Windows" OSes, which run all applications through a problematic, basically unmaintained compatibility layer on a system that is clearly an afterthought for the company.
                Only real difference is that Valve is doing the hard work of testing games for compatibility and rating them. Like 75% of the things that get games down-ranked are things that also happen on Windows.

              • 5 months ago
                Anonymous

                Maybe when Windows 13 comes out your dream will come true.

              • 5 months ago
                Anonymous

                I remember when Windows was 95%, MacOS was maybe 4% and all the Linux OSes and BSDs were under 1%.
                Now with Linux OSes at 43%, NT OSes somewhere around 30%, and the rest being Darwin or other BSD it looks like my dream is pretty close to being true.
                Maybe real 'desktop' use of Linux is still low single digits, but real 'desktop' use of Windows isn't much better.

                The last stats I saw showed that most Windows users were already running the 'cloud advertising' versions. Really only a matter of time now. How much longer can people hold out on an unmaintained platform not supported by Microsoft?
                Barring a surprise open source release, Windows will soon be as dead as Classic MacOS. Desktop users will have no choice but to find something else.

              • 5 months ago
                Anonymous

                You assume that in 10 years people will still be running machines the same way they do right now. MS and Nvidia are not just sitting around frozen for 10 years; they already agreed to move the industry forward in partnership.
                Like I said, if linux continues to fumble its approach to end users it's never going to take off. The reason Android and other mobile operating systems are widely used nowadays is that they are brain-dead-easy, touch-based systems; even toddlers are able to use a phone. So if Linux decides to shift entirely to this model, then yeah, your dream might come true sooner rather than later.

              • 5 months ago
                Anonymous

                >You assume that in 10 years people will still be running machines the same way as they use it right now
                Not everyone no. The vast majority of people do not want or need fully capable desktop operating systems. I expect them to be using the latest and greatest user-friendly systems, and even if they are GNU/Linux or BSD under the hood nobody who uses them cares.
                However for the people who need a serious computer, that has stayed remarkably the same. UNIX-like systems are still recognizable.

                >Android and other mobile operating systems are widely used nowadays is because they are brain dead and easy to use touch based systems, even toddlers are able to use a phone
                Exactly, so if UNIX-like OSes completely own the high end, and can also be made so easy to use even a toddler can understand them, what space exactly is there left for Microsoft to sell an OS to OEMs for a premium?
                The only thing Microsoft have going for them are business customers with legacy software to run, and they'll be better served by a hosted solution.

                >MS and Nvidia are not just sitting around frozen for 10 years,
                I don't expect them to, I expect them to do what is best for their shareholders, the same way IBM did. The market has simply evolved in a way that doesn't leave them much room for their current business models within the consumer space.

              • 5 months ago
                Anonymous

                >However for the people who need a serious computer, that has stayed remarkably the same.
                So Windows? Azure, AWS, etc. exist, things evolve and change.

                >if UNIX-like OSes completely own the high end
                You're forgetting the part where Nvidia and Microsoft are leading the race. If anything, cloud-based mobile devices will be easier to adapt to.

                >the same way IBM did
                This assumes AI and deep learning don't take off at all, everything crumbles, and we go back 10 years to when people didn't care about any of that. That isn't true even today, because DLSS and path tracing already rely on AI, and that is what Nvidia will be focusing on. So I don't really think this will be the case.

              • 5 months ago
                Anonymous

                >things evolve and change.
                No doubt, but in case you didn't notice, two of the things you listed are hosted UNIX server environments, and the third looks like it is on its way to being a hosted compatibility-layer service.
                PC gaming can survive because it can now be run on free software. Nobody is going to care about it, in the same way that almost nobody cares about vinyl records or printed books and newspapers.

                >forgetting the part where Nvidia and Microsoft are leading the race
                Are they though? Microsoft owns a remnant of a former empire that is crumbling. 95% to 30% over the last 20 years is not a good trend line.
                Nvidia has a majority share within the discrete GPU market, but there may not be a mainstream consumer market for them much longer. Not so much because having powerful GPUs won't matter, but because for performance you are going to want CPU, GPU, and memory all within a single integrated solution.
                Nvidia can't offer that, at least not in a way that will matter to PC gamers who need AMD64 and legacy intel x86 binary compatibility.

                >This assuming AI and deep learning doesn't take off
                How? AI and deep learning are far more likely to reach the mainstream consumer market as a hosted web service, running on hardware that looks like the AMD MI300a, than on a consumer-grade Nvidia 4090.

              • 5 months ago
                Anonymous

                >linux 43%, NT 30%
                In what timeline and universe?
                The one where the year of the linux desktop happened in 2003?

              • 5 months ago
                Anonymous

                He's counting mobile shit and anything that runs any sort of linux flavor, I bet he thinks iOS counts as linux too.

              • 5 months ago
                Anonymous

                Of course not, iOS is a Darwin BSD distribution.

                The main difference between UNIX-like OSes on ARM and UNIX-like OSes on AMD64 is binary compatibility.
                That is why NT never took off on ARM: it lost all its software. It is less of a big deal for open UNIX-like OSes because most of their software is open source and easily ported.

                Steam for Linux is really the first time Linux has been really tied to an architecture. Of course breaking Steam is also the reason why Microsoft and Apple both want to move to ARM.

              • 5 months ago
                Anonymous

                This universe, where the desktop market has shrunk over the years to a small niche market.
                Actual desktop users of Windows who care about using a desktop OS are a tiny minority.
                Not much larger than users who build their own systems.

              • 5 months ago
                Anonymous

                People would rather use Chrome or Mac OS over your piece of crap nobody cares about.

              • 5 months ago
                Anonymous

                I bet a significant number of those unknowns are Linuxes

              • 5 months ago
                Anonymous

                Yep. Even within Microsoft's hand-tuned 'desktop' category, which counts every Windows OS as 'desktop' while shuffling the majority of UNIX off into other categories, they still have to split GNU/Linux based OSes into three.

              • 5 months ago
                Anonymous

                ChromeOS is a GNU/Linux distribution, and if you dig into the OS X numbers there are a lot of legacy Macs that aren't necessarily going to be replaced with ARM Macs. Kind of like how, if you dig into the Windows numbers, the illusion of continued Microsoft Windows dominance in desktops is dispelled.
                The vast majority of Windows systems now aren't traditional desktop systems running real desktop OSes. Barely anyone runs desktop Windows anymore.

              • 5 months ago
                Anonymous

                Not him, you are absolutely delusional. ChromeOS is anything but GNU. It's absolutely moronic to say it is

              • 5 months ago
                Anonymous

                Sorry anon, but the OS family is called GNU/Linux. Doesn't have to be Trisquel level adherence to Stallman for that to be true.
                ChromeOS is a GNU/Linux, just like Android is Android/Linux.

              • 5 months ago
                Anonymous

                >but the OS family is called GNU/Linux
                No it's not, moron. Here let non-free google teach you something
                https://unix.stackexchange.com/questions/269487/is-chromium-os-a-gnu-linux-distro

              • 5 months ago
                Anonymous

                Sorry anon. Random people on the internet aren't authoritative on this subject.
                Also badly out of date, ChromeOS has moved significantly closer to the rest of the GNU/Linux ecosystem in the last 7 years. It used to be less standard than it is today.

              • 5 months ago
                Anonymous

                May as well claim that Windows is GNU
                The link has a clear explanation of why ChromeOS isn't GNU, while the best you can do is assert that it is with no evidence or logic
                Because it isn't

              • 5 months ago
                Anonymous

                >The vast majority of Windows systems now, aren't traditional desktop systems running real desktop OSes. Barely anyone runs desktop Windows anymore.
                I have read this statement at least 8 times by now, and I still do not understand what the hell you are saying.
                Are you trying to say that traditional box-and-monitor desktops have been declining, and that this decline has resulted in more home desktop Linux systems?
                Because if you are, I hate to break it to you, but that phenomenon is not exclusive to Windows. iMac sales have trailed the MacBook since its inception.
                It does not matter in the slightest how many servers, gateways, and embedded devices run Linux when its presence in the consumer space is still a joke, at best.
                Also
                >lumping android in with desktop gnu
                You know damn well an android device with gplay services is about as much a linux device as an ARM mac is a BSD machine.

              • 5 months ago
                Anonymous

                So we can agree that there are 'desktop' operating systems and there are operating systems that aren't 'desktop', right?
                What I am saying is that there are a lot of NT distributions that if they were anything other than Microsoft products would not be classified in the 'desktop' category.
                The fundamental issue here is that NT is lumped, everything else is split. Often for arbitrary reasons.
                When you dig into the numbers the divisions in the categories seem designed to reinforce the 'Windows' count within what they define as 'desktop'.
                Where desktop is very narrowly defined for Linux systems, but very broadly defined for NT.

                >as much a linux device as an ARM mac is a BSD machine.
                I'm glad we are agreed then that ARM Linux and ARM Darwin are both properly included with the rest of the OSes that use their kernel.
                At least if you are going to try to claim that every version of Windows, including Windows ARM counts as 'desktop'.

              • 5 months ago
                Anonymous

                Ah ok I see now.
                You are 100% insane, trying to say laptops shouldn't count as part of Windows' market share. Gotcha.

  12. 5 months ago
    Anonymous

    I guess I'm buying Intel then?

  13. 5 months ago
    Anonymous

    gabenOS is our only hope

  14. 5 months ago
    Anonymous

    as long as this makes them put 120GB of memory on their next consumer chips without improving any other part of the architecture -beside larger memory handling- i'm very cool with it
    the further we go in time the less relevant these meme graphical improvements get. unlike AI

  15. 5 months ago
    Anonymous

    >The email where Huang declared Nvidia an AI company came after the development of AlexNet.

    >AlexNet was the neural network architecture that won a computer vision challenge in 2012

    wow, it's fricking nothing. Guess they found out about an internal email and decided to write an article rehashing what everyone already knows.

  16. 5 months ago
    Anonymous

    of course. that's what's been making them a ton of money lately

  17. 5 months ago
    Anonymous

    Honestly it's a good thing that they're not going all-in on GPUs now. Graphics have reached a plateau of diminishing returns; the RTX 4090 is basically overkill for every videogame on the market and will probably last two decades before there's demand for new GPUs.

    • 5 months ago
      Anonymous

      >and will probably last two decades
      lmfao

      • 5 months ago
        Anonymous

        I mean the GPU will be able to run games well enough for the next two decades, as for the hardware's actual durability who knows.

    • 5 months ago
      Anonymous

      >4090 RTX is basically overkill for every videogame
      It barely runs the new Avatar game at 30 fps maxed out

      • 5 months ago
        Anonymous

        Blame the devs, who are literally throwing optimization out the window and bloating their games with every single possible useless feature.

  18. 5 months ago
    Anonymous

    they're still making and selling GPUs you brainlets

    • 5 months ago
      Anonymous

      For the moment, but if you are old enough you remember when computers were put together from multiple chips, and then those chips got consolidated into fewer chips.
      The future looks like AMD's new AI hardware. GPU, CPU, and Memory, all in a single package. AMD can do that, Intel can too, Nvidia can't.
      They're going to get squeezed from both ends because they can't make AMD64 CPUs.

      • 5 months ago
        Anonymous

        >Nvidia can't.
        lol

        • 5 months ago
          Anonymous

          Perhaps it would be better to say
          >Nvidia can't without significant investments and development, which they don't appear to be doing.
          Getting to a point where Nvidia has an off-brand AMD64 CPU that can compete with AMD's own hardware is going to be extremely difficult.
          Not even Intel can make a good knock-off chip these days and is essentially coasting on their past success and long term OEM contracts.

          Since CPU architecture basically doesn't matter in the high-performance GNU/Linux computing market, and Intel still has Windows OEMs by the balls, an Nvidia AMD64 chip would basically just be for PC gaming, which makes it a bad investment.
          Sure they could emulate, but that is going to come at the expense of both performance and power, neither of which will go over well with gamers.

          • 5 months ago
            Anonymous

            I wonder where they're gonna get investments
            https://www.reuters.com/technology/nvidia-sets-eye-1-trillion-market-value-2023-05-30/

          • 5 months ago
            Anonymous

            Nvidia can't make x86 CPUs. No amount of money will change this, as Intel doesn't license out x86, and AMD can't license out AMD64 since it relies on Intel's x86. ARM is a different matter.

      • 5 months ago
        Anonymous

        You're overreacting. This news only means that Nvidia's main focus is shifting to AI. Doesn't mean they're abandoning GPUs.

        • 5 months ago
          Anonymous

          No I just have a longer perspective than you kids. Discrete GPUs are going to become more expensive and more exclusive. We are really only a generation or two away from the RTX **60 hardware getting completely priced out by integrated solutions. When that happens Nvidia will find themselves without mainstream hardware, and that will drive their prices up even more, and could lead them to exit the consumer market.

          • 5 months ago
            Anonymous

            >Discrete GPUs are going to become more expensive and more exclusive. We are really only a generation or two away from the RTX **60 hardware getting completely priced out by integrated solutions.
            You idiots were saying this when the first APUs debuted. There will be a 5090, a 6090 and a 7090 and I will upgrade to each of them from my current 4090.
            You will never see an APU with big GPU outside of conslows.

            • 5 months ago
              Anonymous

              There almost certainly will, but as the low-end continues to be eroded Nvidia's top end hardware will continue to be moving up in price.

              >Nvidia can't make a product that can answer AMD's new AI hardware

              We'll see

              If they do manage it, it would be bad news for PC gamers. They would go the same path AMD are where they sell every high-end board to their AI customers, and don't bother with a really high-end board for PC gaming. Even AMD's top-end cards now are all just cut-down UNIX workstation hardware.

              • 5 months ago
                Anonymous

                There was nothing stopping it from happening in the past 10 years, so why do you believe it will happen now? There are no new incentives that weren't there before.
                Top-end hardware keeps selling well. The 4090 and 3090 sold better than the 2080 Ti. People were saying the same thing about $1000 phones, but expensive phones have seen an increase in volume despite slumping overall sales due to market saturation.

              • 5 months ago
                Anonymous

                Because while Nvidia can absorb the loss of hardware sales to APUs up to a point, they will eventually find that they don't have enough low-end consumer hardware sales to justify the cost of driver development, even counting sales of high-end hardware to consumers.
                Basically exactly where AMD found themselves about 10 years ago.

                Whether or not Nvidia can successfully leverage openness to lower their driver costs and remain relevant has yet to be seen.
                Phones benefit from the fashion accessory effect. Discrete GPUs sit inside a case usually chosen for being cheap and gaudy.

          • 5 months ago
            Anonymous

            The new APUs with 780M graphics are already better than a 1060 3GB.
            But AMD is always aiming at just slightly below their current lowest discrete GPU, which right now is that RX 6400 shit.

          • 5 months ago
            Anonymous

            >meanwhile, in the real world, the best integrated graphics on the market is still significantly worse than a fricking 750-ti

            • 5 months ago
              Anonymous

              Except that the best integrated graphics on the market is the MI300a, which is practically the highest-end thing you could buy.
              Also the 750 Ti is pre-Vulkan hardware; there are plenty of even low-end integrated GPUs that outclass it given the high-performance drivers standard on modern OSes.

              • 5 months ago
                Anonymous

                Wha? No! The 750 Ti isn't pre-Vulkan; it runs Vulkan just fine on every Vulkan version it's compatible with.

              • 5 months ago
                Anonymous

                And where can you buy that?

        • 5 months ago
          Anonymous

          It also means that ai companies are getting top quality silicon while the gaymers that are dumb enough to buy nvidia shit will get cutdowns and rejects with botched cores and memory controllers.
          >why is my 4090 using 500W and producing more heat than the quadro 6000
          Because you are a cuckold.

          • 5 months ago
            Anonymous

            That is already happening. 4090s are the cut-down rejects of Nvidia's datacenter hardware.
            AMD has similar products, but notably even their cut-down hardware for that market is still sold to that market.
            Nvidia can't make a product that can answer AMD's new AI hardware.

            I did just see a story that Nvidia and Intel are once again talking about getting into bed together. So that is still a possibility.
            Intel has been trying to take Nvidia over for years.

            • 5 months ago
              Anonymous

              AMD has a chiplet design, so it's more suitable for cutting down.

            • 5 months ago
              Anonymous

              >Nvidia can't make a product that can answer AMD's new AI hardware

              We'll see

    • 5 months ago
      Anonymous

      People said the same shit about Creative Technology when they became more interested in stereos before they just stopped making EAX-capable Sound Blasters.

      • 5 months ago
        Anonymous

        I'm sure the leader in AI that has been focused on it for the past 15 or so years is going to drop out 🙂

  19. 5 months ago
    Anonymous

    They want total control of AI

  20. 5 months ago
    Anonymous

    I fricking knew it
    I've been saying for a while now that the 40 series is the last true "gaming" GPU.
    The 50 series and beyond will be tailored for AI workstations and the price will jump to an absurd degree. I don't doubt for a minute that a 5090 (or equivalent) will be $3,500 or more, because it isn't meant for consumer use, only AI/workstations.

  21. 5 months ago
    Anonymous

    So I'm supposed to just wait for AMD to be competitive in shit like Blender? That was the only reason I went with Nvidia over the cheaper AMD option

  22. 5 months ago
    Anonymous

    Makes sense. AMD can't compete for shit. The only reason RDNA2 was competitive is that Nvidia used Samsung's process, which was hot garbage.

  23. 5 months ago
    Anonymous

    Sad. Still never buying the 40xx series though.
    I'm skipping intel next major build too. Don't like the IME backdoor.
    I don't expect AMD would be much better though.

  24. 5 months ago
    Anonymous

    What does this mean for gaming-oriented GPUs!? Is it all up to AyyMD and Intlel?

    • 5 months ago
      Anonymous

      means that nvidia can now slap a $10000 price tag on a GPU with the excuse of being 'AI centered' and get away with it, with piss poor support for future rasterized games aside from DLSS and RTX

      • 5 months ago
        Anonymous

        >with piss poor support for future rasterized games aside from DLSS and RTX
        Good. By the time PS6 and Xbox Whatever launch we should have mandatory exclusively ray traced lighting in games. But raster is never going away because it's just a fancy way of telling a modern compute shader what to do. You will always have more complex compute to do outside of matrices for AI so you will always have classic compute and thus raster.

        • 5 months ago
          Anonymous

          Delusion.
          Gaytracing will remain a tacked on gimmick and games will just use some "lite" version of it.

          • 5 months ago
            Anonymous

            This is the reality of raytracing. Most demos still involve the same white room with spheres because raytracing polygon meshes is still an expensive process to do in real time. Most of these raytracing tech demos can run on decade-old hardware with some modifications.

          • 5 months ago
            Anonymous

            Most new games are coming out with RT. Unreal has RT, Snowdrop has RT, Northlight has RT, Red Engine has RT (RIP), Frostbite has RT, etc. And they've had RT since before RT-capable conslows came out. Before the end of this console generation, most technologically advanced games will have a PT mode like Metro Exodus, Cyberpunk 2077, and Alan Wake II. You can bet your ass that the next game from 4A will have PT

            • 5 months ago
              Anonymous

              Killzone Shadow Fall had ray tracing. That's a PS4 game btw.

              https://www.eurogamer.net/digitalfoundry-the-making-of-killzone-shadow-fall

  25. 5 months ago
    Anonymous

    GOOD,
    video games are dead anyway

  26. 5 months ago
    Anonymous

    who cares? there hasn't been a reason to buy anything except AMD since they released big navi

    • 5 months ago
      Anonymous

      Big Navi was finally competitive with Nvidia and people were buying them, but AMD shat the bed hard with RDNA3 because it's an architecture built around margins, not performance.
      They knew Nvidia was going back to TSMC; instead of the chiplet meme they should've made a 160 CU beast and doubled down on adding more L4 instead of reducing it like they did with RDNA3

      • 5 months ago
        Anonymous

        Still it seems like even if it is a somewhat disappointing follow up to RDNA2, those two architectures are AMD's most successful consumer hardware in a while.
        What you're asking for sounds exactly like what AMD is doing in the AI market. The reason they aren't selling it to gamers is that UNIX Rocm AI Workstation users are willing to pay a lot more for every AMD board they can get their hands on.

  27. 5 months ago
    Anonymous

    It's only a matter of time before some fricks manage to make a streaming service that cracks using DLSS on literally everything

    Render the game, send the feed and the necessary depth/motion info compressed with MUH AI, and then an AI-capable processor on your end just upscales and smooths the feed. Reduces latency by sending a smaller feed, reduces server rack load, and theoretically smears perceptible compression into a smudged AI mess
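    The client half of that idea is basically just temporal upscaling. Here's a toy sketch (all names made up for illustration): nearest-neighbour upsampling stands in for a learned upscaler, and a single global motion offset stands in for real per-pixel motion vectors.

```python
import numpy as np

def nearest_upscale(frame, factor):
    # Nearest-neighbour upscale of an HxW frame; stand-in for a learned upscaler.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def temporal_reconstruct(low_res, motion, history, factor=2, blend=0.9):
    """One step of a TAA-style temporal upscaler on the receiving end:
    upsample the current low-res frame, reproject the previous high-res
    output along the motion offset, and blend, favouring the history."""
    upsampled = nearest_upscale(low_res, factor)
    dy, dx = motion  # global motion in high-res pixels (real engines send per-pixel vectors)
    reprojected = np.roll(np.roll(history, dy, axis=0), dx, axis=1)
    return blend * reprojected + (1.0 - blend) * upsampled
```

    A real reconstruction pass would run per-pixel on the GPU and clamp the history against the current frame to avoid ghosting, but the blend-against-reprojected-history structure is the same.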

    • 5 months ago
      Anonymous

      you don't need to do that lmao
      tensor cores aren't special; all you'd need to do is extract the trained DLSS model weights from the NVIDIA driver and run them on compute shaders
      it might run slower on amd/intel but there's no reason why it shouldn't work

    • 5 months ago
      Anonymous

      Nobody will pay Nvidia for DLSS on a streaming service. So unless Nvidia hosts that themselves, or gives somebody a real discount to use it that won't happen.
      You can do the exact same thing with FSR3, and AMD already has a bit of experience with it since Stadia ran on their hardware, and they have the advantage that FSR will run on any capable hardware. No need for the user on the receiving end to have Nvidia hardware.
      Stadia actually worked surprisingly well, the issue was games need to be developed around the limitations of streaming, and they need to be bundled under a single GAAS model. Google wasn't willing to really commit the billions of dollars they would have needed to make it work.
      Whether Microsoft or anyone else can make it work any time soon remains to be seen.

      Ironically Nintendo might have an advantage because their games are smaller, they use relatively standard technology, and their policy of using 'withered' hardware lowers the requirements.
      Like I wouldn't be surprised to see somebody demo a web delivered Switch game, where it is delivered via the browser, but runs on the local device using standard APIs.
      It is the ultimate end-run around Apple's walled garden, unless they kill web browsers.

  28. 5 months ago
    Anonymous

    Basically AMD and Intel won.

  29. 5 months ago
    Anonymous

    Said email was sent 10 years ago

    https://old.reddit.com/r/nvidia/comments/18c7mkb/nvidia_is_no_longer_a_graphics_company/kc8szjg/

  30. 5 months ago
    Anonymous

    Guess there will be no escaping the whole AI thing in the future for games, huh?

  31. 5 months ago
    Anonymous

    >"saying the quiet part out loud"
    credulity discarded

    • 5 months ago
      Anonymous

      jej

  32. 5 months ago
    Anonymous

    So if AMD exits the market too, are video games just over?

    • 5 months ago
      Anonymous

      No, because even if AMD stops selling GPU hardware to Windows users they'll still have high-end hardware for UNIX workstations, and low-end SOCs that run with the same open-source drivers which are the best option for gaming.
      The same thing might be true for recent Nvidia hardware as well, but they won't have the low-end hardware.

      Basically as with a lot of markets the low-end becomes capable enough for the majority of people, that makes the middle collapse, and you're left with a market where everything is basically mainstream low-end, ultra expensive high-end, and then a market for people who like getting their hands dirty clawing out a more premium experience on the cheap.

  33. 5 months ago
    Anonymous

    God I hope they totally fumble it and Intel and AMD somehow come out ahead as the only viable GPU options. That would be my biggest dream. I have hated this company since my GTX 460 died 3 times in the span of 1 year.

  34. 5 months ago
    Anonymous

    >There’s also AMD, which takes a back seat to Nvidia’s high-end GPUs, but is still its fiercest competitor. Huang says that’s not the case, telling The New Yorker that “we’re not very competitive.” The article points out that Nvidia employees can pull the relative market share of Nvidia and AMD graphics cards out from memory.

    your dogshit company ruins several industries with malicious practices and you have the audacity to trash talk? i hope EU regulations eat you alive and you get raided again, jensen.

    • 5 months ago
      Anonymous

      >ruins several industries with malicious practices
      You imply that everyone plays fair, which is not the case in the real world. Every other company is always trying to ruin your business via legal means. If you want to succeed you have to be aggressive.

      • 5 months ago
        Anonymous

        i didn't really imply that, but the baseline for running a company is not breaking the law. how many lawsuits, domestic and international, has nvidia swept under the rug by now?

    • 5 months ago
      Anonymous

      >your dogshit company ruins several industries
      Seems like a lot of anons in these threads can't handle success

      • 5 months ago
        Anonymous

        i didn't really imply that, but the baseline for running a company is not breaking the law. how many lawsuits, domestic and international, has nvidia swept under the rug by now?

        You don't win by playing by the books.

        • 5 months ago
          Anonymous

          take a hike, israelite. subhumans like you are the reason consumers get raped by every company.

          • 5 months ago
            Anonymous

            I'm just stating facts, why are you so mad? Nvidia cheats and plays dirty, but they are the best at it, their 4090 is unmatched, expensive, but unmatched.
            If you wish Intel and AMD to be relevant, then create a card as powerful as a 4090 and sell it at an affordable price, or become Nvidia 2: Red/Blue edition

            • 5 months ago
              Anonymous

              >why are you so mad?
              because i'm forced to live in the wreckage you knuckle-dragging chimps leave behind
              >Nvidia cheats and plays dirty
              because you let them

              • 5 months ago
                Anonymous

                So I'm assuming your very next GPU purchase will 100% be Intel right? you won't choose AMD at all, otherwise I'll call you a hypocrite.

              • 5 months ago
                Anonymous

                i haven't been team green since the gtx780 when geforce experience required a login to use functionality i already paid for. team blue burned me by re-releasing haswell like 20 fricking times. i know it's not the best hardware but at least i have principals.

              • 5 months ago
                Anonymous

                Intel makes GPUs now; they are far cheaper and perform similarly. If you had principles, you would go for the newcomer, but I know you will stick to AMD, so you really have no reason to blame Nvidia users for doing the same.
                Also, using a 10-year-old experience as the current representation of a product doesn't make sense. It's like saying my old Sony cassette erased some data years ago so I'm never using anything made by Sony again. It's kinda dumb to be honest.

              • 5 months ago
                Anonymous

                Not him but I did buy an arc gpu to give intel a chance but internally it looks like intel isn't.

              • 5 months ago
                Anonymous

                >It's kinda dumb to be honest.
                I had an FX 6350. If I acted like that anon, I wouldn't have an R5 5600X now

              • 5 months ago
                Anonymous

                >far cheaper and perform similar
                that's just not true.
                >If you had principles, you would go for the newcomer
                why should i support a more crooked and worse company? are you serious? i picked the lesser evil and it is still the lesser evil today.
                >It's kinda dumb to be honest.
                tell me you don't need to log in to the nvidia software suite, or that tessellation was retroactively fixed in older geforce partner games. how about the shitshow that's g-sync? is the 4080 faster than the 3080 ti yet? have nvidia and intel stopped blatantly lying in product showcases yet? your whole argument is dumb, and you are a dumb and bad consumer.

              • 5 months ago
                Anonymous

                >I picked the lesser evil
                I accept your concession then.

              • 5 months ago
                Anonymous

                what concession, schizo? you assumed something that i already corrected you over, i never implied amd will take a proverbial bullet for you either. i still live in the wreckage of reckless, braindead consumerism thanks to you and people like you.

              • 5 months ago
                Anonymous

                >thanks to you and people like you.
                That's your excuse for failing, not mine. The strongest survive; you have to cheat and get the head start, always. The good guy always loses in the real world, and especially in the business world, where people don't fight or insult each other, they weasel their way around and frick you over legally.
                The only reason AMD is not trying to screw Nvidia over is that they are literally family, and they are still doing shady business in the background all the time. Your choice of the lesser evil makes zero sense because it all goes toward the same goal, just red instead of green. You are beyond delusional.

              • 5 months ago
                Anonymous

                >principals

            • 5 months ago
              Anonymous

              Expensive for the consumer market, but heavily discounted from what they would make in the data center market, where Nvidia is getting outmatched by AMD.
              The sad reality is that AMD has hardware that beats the 4090, they won't sell it to you because they make more money selling it to the high performance computing market.

              AMD is shut out of the Windows gaming market, but that doesn't mean they've given up, and the way the market is developing favors AMD.

              • 5 months ago
                Anonymous

                >in my hypothetical head canon AMD beats the frick out of Nvidia despite every report saying otherwise, just wait 2 more weeks!
                Easy on the copium anon.

              • 5 months ago
                Anonymous

                I don't know if you missed the news, AMD's compute hardware has a pretty significant lead on Nvidia, and they've been getting the big Supercomputer contracts lately.
                Like I said earlier, Nvidia is getting squeezed at both ends of the market. They can't make a chip to get into the new low end of the PC gaming market, and they can't make a chip to compete at the new high end of the compute market.

                Barring the Intel takeover of Nvidia finally happening, it seems like Nvidia may be on its way out. It isn't really even clear what Intel would want from Nvidia at this point. Maybe Trademarks? Some OEM deals they'd be basically buying Nvidia out of?

              • 5 months ago
                Anonymous

                Are you moronic? Nvidia absolutely dominates the AI-sphere. They sell every single thing they mint. rocm is a joke compared to cuda, the mi300x is outdated at launch, the h200 and b100 render it obsolete. Nobody cares about AMD trash because it's both underperforming and lacking cuda. cuda is deeply embedded into AI, you'd be actively sabotaging your startup if you went with AMD because literally everybody else already uses cuda

              • 5 months ago
                Anonymous

                >They sell every single thing they mint.
                If they were as dominant in the AI space as you say, why wouldn't they be selling those cards for $5000 to AI customers?
                Why are they selling 4090s to Windows users for a deep discount? You don't see AMD selling MI300x to PC gamers.

                >rocm is a joke compared to cuda,
                Yet, it seems to be winning lately in the data center market because even if it is harder to set up on Windows, serious business doesn't get done on Windows.

                >Y-you have to buy our hardware! Can't you see how dominant we are! Why aren't you paying the premium!
                MI300a goes whirrrrr

              • 5 months ago
                Anonymous

                That's currently very, very speculative anon. Nvidia has yet to release their new product as well, so they might go head to head in the DC world soon, but consumer-wise there is just no comparison.

              • 5 months ago
                Anonymous

                >Why are they selling 4090s to Windows users for a deep discount?
                Yet it's the highest priced consumer GPU ever and still sells for well above MSRP. 3090/4090 serves as a gateway into the cuda ecosystem for AI startups. Elevenlabs, who made that voice synth AI back in Febuary, was done with about 12 3090s. Meanwhile AMD is only barely rolling out rocm to consumer GPUs. AMD is so far behind it's laughable. I don't know what 'data center world' you're talking about but pretty much every compute cluster uses nvidia. vast ai is 4090s all the way down, same with google cloud. You're lying through your teeth if you think anyone's out there racing to pick up AMD GPUs for AI compute.

              • 5 months ago
                Anonymous

                He's kinda right though.
                https://web.archive.org/web/20170321133446/http://store.steampowered.com/hwsurvey/
                Before the chink invasion, the NVIDIA to AMD GPU ratio was 3:1. Which skyrocketed to a fricking 10:1 due to the way cybercafes work.
                https://web.archive.org/web/20171220010308/http://store.steampowered.com/hwsurvey/

        • 5 months ago
          Anonymous

          >obnoxious and awful dyke haircut
          my boner....

  35. 5 months ago
    Anonymous

    >Nvidia sabotages crypto miners with reduced hashrate cards
    >Nvidia doubles down on AI and rebrands into an AI company
    So where are you "AI is just a fad like crypto" homosexuals now? Have you slowly started to pull your heads out of the sand and come to terms with the fact that AI is here to stay?

  36. 5 months ago
    Anonymous

    So buy Nvidia stock or nah?

  37. 5 months ago
    Anonymous

    Nowhere does it say or imply they’re going to stop making GPUs
    Is everyone on this board actually this stupid? What the frick?

    • 5 months ago
      Anonymous

      it's mostly Linux and AMD cucks jerking off to their imaginary scenarios. They often make discord raids and spam anti-windows/nvidia threads on Ganker

    • 5 months ago
      Anonymous

      ofc they dont, piggie, AI has to learn on piggies somehow

    • 5 months ago
      Anonymous

      It's clickb8. Don't you know you can't start a thread on vee without clickb8?

    • 5 months ago
      Anonymous

      honestly this is one of the worst threads I've seen the entire year. Probably the worst.

  38. 5 months ago
    Anonymous

    good. i hope they never come back. homosexual israelites

  39. 5 months ago
    Anonymous

    This is an out-of-season April Fools joke, right?

  40. 5 months ago
    Anonymous

    nvidia control panel my beloved

  41. 5 months ago
    Anonymous

    AMD won.

  42. 5 months ago
    Anonymous

    >pckeks getting cucked once again
    hope you're ready for the next crypto boom and what it means for your overpaid excel machines, mustards.

    • 5 months ago
      Anonymous

      now post your CPU, moron
      also you can even mine on a toaster

  43. 5 months ago
    Anonymous

    Surely AMD will fix their cards and drivers now that they have no competition left....right?

  44. 5 months ago
    Anonymous

    does that mean the neet at /pcbg/ will finally leave?

  45. 5 months ago
    Anonymous

    >vice president of corporate marketing
    >corporate marketing
    >marketing
    so they are now marketing themselves as an AI company with nothing about what they do changing?

    • 5 months ago
      Anonymous

      correct

  46. 5 months ago
    Anonymous

    That's good for Intel I guess

  47. 5 months ago
    Anonymous

    does anyone know what the equivalent of the nvcleaninstaller for amd is and how to use it?

  48. 5 months ago
    Anonymous

    I hope someone makes quantum computing actually viable pretty soon. this should make anything nvidia can produce look like an abacus or a fisher price toy. then they can go back to making toys for us.
