What happened to PhysX?

  1. 4 months ago
    Anonymous

    died a death because it was never well optimized or well implemented. Fricking Black Flag, a decade-old game, drops the fps all the way to 30 with it on, and I have a 3080 Ti.

    • 4 months ago
      Anonymous

      Having game-logic-dependent calculations done off-CPU never made sense, so optimisation or implementation makes no difference: the results have to come back to the CPU every frame. In contrast, a separate processor for graphics works well precisely because the communication is one-way. That's the simplest way to think about it. See the sketch below.
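
      A rough CUDA sketch of that round trip (the struct and kernel names here are invented for illustration; this isn't PhysX code). A renderer could stop right after the kernel launch, but gameplay physics needs the blocking copy back:

        // Hypothetical GPU physics step; Body and step_bodies are made-up names.
        #include <cuda_runtime.h>
        #include <cstdio>
        #include <cstdlib>

        struct Body { float3 pos, vel; };

        __global__ void step_bodies(Body* bodies, int n, float dt) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            bodies[i].vel.y -= 9.81f * dt;            // integrate gravity
            bodies[i].pos.x += bodies[i].vel.x * dt;  // integrate position
            bodies[i].pos.y += bodies[i].vel.y * dt;
            bodies[i].pos.z += bodies[i].vel.z * dt;
        }

        int main() {
            const int n = 4096;
            Body* host = (Body*)calloc(n, sizeof(Body));
            Body* dev = nullptr;
            cudaMalloc(&dev, n * sizeof(Body));
            cudaMemcpy(dev, host, n * sizeof(Body), cudaMemcpyHostToDevice);

            for (int frame = 0; frame < 600; ++frame) {
                step_bodies<<<(n + 255) / 256, 256>>>(dev, n, 1.0f / 60.0f);
                // Purely visual effects (particles, debris) could end here:
                // results stay on the GPU and flow one way into the renderer.
                // Gameplay-affecting physics has to come back to the CPU every
                // frame, and this copy blocks until the kernel has finished.
                cudaMemcpy(host, dev, n * sizeof(Body), cudaMemcpyDeviceToHost);
                // ... CPU game logic (collisions, AI, scoring) reads host here ...
            }

            printf("body 0 ended at y = %f\n", host[0].pos.y);
            cudaFree(dev);
            free(host);
            return 0;
        }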

    • 4 months ago
      Anonymous

      sounds like raytracing

  2. 4 months ago
    Anonymous

    What happened to Gsync, and what is going to happen to DLSS? And CUDA, now that companies actually give a frick about GPGPU.

    • 4 months ago
      Anonymous

      >did what it was supposed to do, sell more cards

      >Gsync
      Yeah anon, what happened? It just works
      >DLSS
      Upscaling is unironically fine, but I don't believe real gamers are up for fake frames
      >CUDA
      Isn't TensorFlow written in CUDA, and isn't that why AMD went so long without supporting it? I'm 100% sure Stable Diffusion uses CUDA, because I've been playing with it

      • 4 months ago
        Anonymous

        Yeah, AI shit uses CUDA. Nvidia is doubling down on CUDA for that reason.
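
        Not framework internals, just an illustration: when PyTorch or TensorFlow runs on an Nvidia card, the dense math eventually bottoms out in CUDA libraries like cuBLAS/cuDNN. A raw call looks roughly like this tiny cuBLAS matrix multiply:

          // Minimal cuBLAS SGEMM: C = A * B for two 2x2 matrices.
          #include <cublas_v2.h>
          #include <cuda_runtime.h>
          #include <cstdio>
          #include <vector>

          int main() {
              const int n = 2;
              std::vector<float> A = {1, 3, 2, 4};  // column-major, as cuBLAS expects
              std::vector<float> B = {5, 7, 6, 8};
              std::vector<float> C(n * n, 0.0f);

              float *dA, *dB, *dC;
              cudaMalloc(&dA, sizeof(float) * n * n);
              cudaMalloc(&dB, sizeof(float) * n * n);
              cudaMalloc(&dC, sizeof(float) * n * n);
              cudaMemcpy(dA, A.data(), sizeof(float) * n * n, cudaMemcpyHostToDevice);
              cudaMemcpy(dB, B.data(), sizeof(float) * n * n, cudaMemcpyHostToDevice);

              cublasHandle_t handle;
              cublasCreate(&handle);
              const float alpha = 1.0f, beta = 0.0f;
              // SGEMM computes C = alpha * A * B + beta * C on the GPU.
              cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                          &alpha, dA, n, dB, n, &beta, dC, n);

              cudaMemcpy(C.data(), dC, sizeof(float) * n * n, cudaMemcpyDeviceToHost);
              printf("C = [%g %g; %g %g]\n", C[0], C[2], C[1], C[3]);  // 19 22; 43 50

              cublasDestroy(handle);
              cudaFree(dA); cudaFree(dB); cudaFree(dC);
              return 0;
          }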

      • 4 months ago
        Anonymous

        >Yeah anon, what happened?
        What happened is that Freesync does the same thing, but everyone can use it. So Gsync is effectively dead, Nvidia could drop the brand and open-source the code like they did with PhysX and no one would give a shit.
        >DLSS
        Yes, DLSS is the best upscaler on the market. For now. At some point a free alternative will reach the same level of quality and no one will care about DLSS.
        >CUDA
        Because Nvidia played this game alone uncontested for the last 10 years. AMD was a zombie company not that long ago, and Intel didn't care about GPUs for whatever reason. Now with the AI boom, everyone is investing into free alternatives to CUDA because no one wants to be stuck with Nvidia (or any other company for that matter).

        In the end, open standards always win. But I'm sure Nvidia will come up with something else and the rest of the industry will chase them.

        • 4 months ago
          Anonymous

          >no one would give a shit.
          Nobody ever gave a shit. Nvidia paid developers to implement it, but when Nvidia moves on to the next shiny thing, somehow, someway, these Nvidia technologies seem to die for no reason and are never brought up again.

        • 4 months ago
          Anonymous

          Real Gsync was vastly superior to open-source stuff like Freesync and HDMI VRR.
          No weird quirks, and it could actually sync across the monitor's entire refresh rate range, where VRR usually stops working properly below 48 Hz or so

          • 4 months ago
            Anonymous

            >Everyone uses gsync or freesync. You'd have to go out of your way to find a monitor that doesn't support it.

            Yeah, that's what I said. Nvidia introduced it, they leveraged their software advantage to sell more hardware, a few years later their competitors caught up. And now no one cares about Gsync, because Freesync works just as well but doesn't come with a massive premium.
            That's the fate of every single Nvidia technology, to be replaced by an open alternative. And when that happens, Nvidia introduces and hypes up some other thing.

      • 4 months ago
        Anonymous

        >but I don't believe real gamers are up for fake frames
        I am. Listen, for comp gays it's cancer, but you can make games look like 120fps and feel like 60 (a lot is based on perception, so it feels smoother as well) at 50% less power draw.
        I have a 4090 and play at 1440p, and even then I turn it on. I'm a huge AV gay and I can't really tell the difference mid-gameplay.
        30 to 60 is another thing though

    • 4 months ago
      Anonymous

      Everyone uses gsync or freesync. You'd have to go out of your way to find a monitor that doesn't support it.

  3. 4 months ago
    Anonymous

    it went open source and nvidia stopped paying developers to include it
    now the average video card supports physics calculation but games don't really use it

  4. 4 months ago
    Anonymous

    Vendor locked tech usually fades into obscurity

  5. 4 months ago
    Anonymous

    The same thing that happened to Hairworks, the same thing that happened to Gsync, the same thing that will happen to the RTX line, and the same thing that will happen to every single thing Nvidia develops.

    • 4 months ago
      Anonymous

      It's a shame about Hairworks though. Hair in Witcher 3 with Hairworks looked miles better than in any other game before or since.

      • 4 months ago
        Anonymous

        are you moronic?

        • 4 months ago
          Anonymous

          Are you? The Hairworks in Witcher 3 is decent at best.

  6. 4 months ago
    Anonymous

    It literally became the standard

    • 4 months ago
      Anonymous

      name 5 games released in the last 5 years with any noteworthy amount of physics.
      No, rolling barrels don't count.
      Crysis somehow still has one of the most impressive physics environments

      • 4 months ago
        Anonymous

        ToTK

        • 4 months ago
          Anonymous

          The impressive part of TotK's highly interactable objects diminishes with the fact that you can only act upon around 30 different dedicated objects at the same time

      • 4 months ago
        Anonymous

        Anything 3D in Unity

  7. 4 months ago
    Anonymous

    same shit as all the other gimmicks like god rays, GameWorks, Hairworks, tessellation, etc.

    These days, it's ray tracing. Give it some years and they'll abandon it for the next big-name gimmick.

  8. 4 months ago
    Anonymous

    >enable physx
    >game crashes
    nice technology

  9. 4 months ago
    Anonymous

    >What happened to PhysX?
    it's standard now, that's what happened to it
    goyvidia paved the path for it
    but don't expect this board to understand
    CPUs back in the day were way too weak to do what PhysX does, so the GPU was burdened with doing it
    now CPUs are strong enough, and PhysX is used in most game engines as the baseline physics engine

  10. 4 months ago
    Anonymous

    UE4 and Unity used PhysX as their primary physics engine for a decade. Unity still does. It didn't go anywhere. Devs just stopped using all the meme features that killed performance.

  11. 4 months ago
    Anonymous

    CPUs became powerful enough to run it; then Nvidia started artificially locking out features if you didn't have an Nvidia card, which practically killed any chance of wider adoption.

  12. 4 months ago
    Anonymous

    >What happened to PhysX?
    nothing, it's open source now and all unreal/unity games use it
    nvidia released the code for it back in the 2010s
    https://news.softpedia.com/news/NVIDIA-Releases-PhysX-Source-Code-Free-on-GitHub-475158.shtml
    https://www.guru3d.com/story/nvidia-launches-physx-sdk-4-0as-open-source-physics-engine
    i'm sure most other anons will tell you it was a scam though

    • 4 months ago
      Anonymous

      >i'm sure most other anons will tell you it was a scam though
      It wasn't a scam; the original version was just limited by the fact that it couldn't be gameplay-altering, otherwise a PhysX card would become mandatory and cut your potential market to almost nobody. So it was limited to fluff: better particles and such that didn't actually impact the player.
      nVidia bought them and, instead of making it a standard that everyone could use but they were best at, they fricked it. They made licensing terms that were deliberately bullshit, so they could CLAIM it was available to all while nobody sane would agree to them; this meant you either needed an nVidia card or you were stuck with the CPU path. But they also deliberately refused to properly thread the CPU variant, so it was unusably slow, meaning the only games that used it had to have a "PhysX off" mode so they could hope to sell more than 10 copies.
      Now that the ship has sailed and physics libraries are everywhere, nVidia sucked it up and released the code. Now we have CPU and alternate GPU versions that aren't awful, so gamers don't give a shit who provides the physics simulation because it's all the same in the end.
      *clap* *clap* nVidia.
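
      For what it's worth, the open-sourced SDK does thread the CPU path now. A minimal, untested sketch against the PhysX 4/5 C++ API (error handling omitted), dropping one box in a CPU-only scene stepped across worker threads:

        // CPU-only PhysX scene: the solver runs on a pool of worker threads.
        #include <PxPhysicsAPI.h>
        using namespace physx;

        static PxDefaultAllocator     gAllocator;
        static PxDefaultErrorCallback gErrorCallback;

        int main() {
            PxFoundation* foundation =
                PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
            PxPhysics* physics =
                PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());
            PxInitExtensions(*physics, nullptr);

            PxSceneDesc desc(physics->getTolerancesScale());
            desc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
            desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4); // 4 worker threads
            desc.filterShader  = PxDefaultSimulationFilterShader;
            PxScene* scene = physics->createScene(desc);

            // One dynamic box dropped from y = 10.
            PxMaterial* mat = physics->createMaterial(0.5f, 0.5f, 0.1f);
            PxRigidDynamic* box = PxCreateDynamic(
                *physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
                PxBoxGeometry(1.0f, 1.0f, 1.0f), *mat, 1.0f);
            scene->addActor(*box);

            for (int i = 0; i < 300; ++i) {        // five seconds at 60 Hz
                scene->simulate(1.0f / 60.0f);     // solver work goes to the dispatcher
                scene->fetchResults(true);         // block until the step completes
            }

            scene->release();
            PxCloseExtensions();
            physics->release();
            foundation->release();
            return 0;
        }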

  13. 4 months ago
    Anonymous

    It got merged into modern physics calculations. The morons who said it died are unfortunately part of this board, and why it's such complete shit.

    • 4 months ago
      Anonymous

      You know exactly what the OP means, which is: why did the era of focusing on expensive PhysX effects end?

      It's a shame it was so short-lived after the Ageia acquisition. We had some extremely cool fabric and smoke effects in games like Metro, Mirror's Edge, Arkham Asylum, etc., then it just stopped

  14. 4 months ago
    Anonymous

    It went to the same place Nvidia Hairworks went, and where Nvidia RTX is heading.

    • 4 months ago
      Anonymous

      >and where nvidia RTX is heading.
      So it's going to become the norm and simpletons a decade later will claim RTX died because it no longer has the nvidia logo over it?

      • 4 months ago
        Anonymous

        nah, it'll fall into irrelevance because nobody will use it: developers will simply make their base games even less optimized, since DLSS will save their bacon time after time.

  15. 4 months ago
    Anonymous

    >people bought cheapo Nvidia cards just to accelerate PhysX
    >Nvidia put a blocker in their driver: if it detects AMD drivers installed, PhysX defaults to CPU fallback mode only

  16. 4 months ago
    Anonymous

    PhysX is built into game engines now.

    Unreal and Unity.
