I have this question. I see people, with some frequency, sugar-coating the Nvidia GPU marriage with Linux. I get that if you already have an Nvidia GPU, or you need CUDA, or you work with AI and want to use Linux, that is possible. Nevertheless, this is still a very questionable relationship.

Shouldn’t we be raising awareness about this, in case someone plans to play titles that use DX12? I mean, a 15% to 30% performance loss on Nvidia compared to Windows, versus 5% to 15% (and sometimes equal or better performance) on AMD, isn’t that something worth alerting others to?

I know we wanna get more people on Linux, and NVIDIA’s getting better, but don’t we need some real talk about this? Or is there some secret plan to scare people away from Linux that I missed?

Am I misinformed? Is there some strong reason to buy an Nvidia GPU if your focus is gaming on Linux?

  • megopie@lemmy.blahaj.zone · 11 hours ago

    I’d say in general, the advantages of Nvidia cards are fairly niche even on Windows. Like, multi-frame generation (fake frames) and upscaling are kind of questionable in terms of value add most of the time, and most people probably aren’t going to be doing any ML stuff on their computer.

    AMD in general offers better performance for the money, and that’s doubly so with Nvidia’s lackluster Linux support. AMD has put the work in to get their hardware running well on Linux, both in terms of work from their own team and being collaborative with the open source community.

    I can see why some people would choose Nvidia cards, but I think, even on Windows, a lot of people who buy them probably would have been better off with AMD. And outside of some fringe edge cases, there is no good reason to choose them when building or buying a computer you intend to mainly run Linux on.

    • filister@lemmy.world · 7 hours ago

      Even though I hate Nvidia, they have a couple of advantages:

      • CUDA
      • Productivity
      • Their cards retain higher resale values

      So if you need the card for productivity and not only gaming, Nvidia is probably better; if you’re buying second-hand or strictly for gaming, AMD is better.

      • megopie@lemmy.blahaj.zone · 6 hours ago

        It depends on the type of productivity TBH. Like, sure some productivity use cases need CUDA, but a lot of productivity use cases are just using the cards as graphics cards. The places where you need CUDA are real, but not ubiquitous.

        And “this is my personal computer I play games on, but also the computer I do work on, and that work needs CUDA specifically” is very much an edge case.

        • filister@lemmy.world · 5 hours ago

          As far as I am aware, they are also better at video encoding, and the same goes if you want to use Blender or similar software. Yes, it is niche, but it’s a credible consideration. As always, it really depends on the use case.

          • reliv3@lemmy.world · 5 hours ago

            Blender can be CUDA-accelerated, which does give Nvidia an edge over AMD. In terms of video encoding, both Nvidia and AMD cards are AV1-capable, so they are on par there; if a program does not support AV1, though, Nvidia’s proprietary video encoders are better.
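
            For what it’s worth, a rough way to see which hardware AV1 encoders your local ffmpeg build actually exposes is to parse `ffmpeg -encoders` (a sketch only; whether names like av1_nvenc, av1_vaapi, or av1_amf show up depends on how ffmpeg was compiled and on your driver stack):

            ```python
            # Sketch: list hardware-backed AV1 encoders known to the local ffmpeg build.
            # Which names appear (av1_nvenc, av1_vaapi, av1_qsv, av1_amf, ...) depends
            # entirely on how ffmpeg was compiled and which drivers are installed.
            import subprocess

            def hw_av1_encoders() -> list[str]:
                out = subprocess.run(
                    ["ffmpeg", "-hide_banner", "-encoders"],
                    capture_output=True, text=True, check=True,
                ).stdout
                hw_suffixes = ("nvenc", "vaapi", "qsv", "amf")
                found = []
                for line in out.splitlines():
                    parts = line.split()
                    if len(parts) >= 2 and parts[1].startswith("av1_"):
                        if any(s in parts[1] for s in hw_suffixes):
                            found.append(parts[1])
                return found

            if __name__ == "__main__":
                print(hw_av1_encoders())
            ```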

    • Mihies@programming.dev · 4 hours ago

      From a pure hardware perspective, Nvidia cards are more energy efficient.

      Edit: I stand corrected, the 9070 series is much more energy efficient.

      • iopq@lemmy.world · 8 hours ago

        That’s not quite true. AMD cards just come clocked higher from the factory. So when a 9070 XT beats a 5070 by an average of 17%, you can easily cap the power limit to match the performance. That’s with more VRAM, which of course increases the power requirements.
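
        (If anyone wants to try that on Linux, the power cap on AMD cards is exposed through the amdgpu hwmon interface. Here’s a minimal sketch, assuming the GPU is card0 and your kernel exposes power1_cap, which is writable as root and expressed in microwatts; the 220 W figure is just a placeholder, not a recommendation.)

        ```python
        # Minimal sketch: lower the power cap on an amdgpu card via hwmon (needs root).
        # Assumes the GPU is card0; the hwmon index and the sensible wattage range
        # vary by card and kernel version.
        from pathlib import Path

        def set_power_cap_watts(watts: int, card: str = "card0") -> None:
            hwmon_root = Path(f"/sys/class/drm/{card}/device/hwmon")
            for hwmon in hwmon_root.iterdir():
                cap = hwmon / "power1_cap"
                if cap.exists():
                    cap.write_text(str(watts * 1_000_000))  # value is in microwatts
                    return
            raise FileNotFoundError("power1_cap not found; is this an amdgpu card?")

        if __name__ == "__main__":
            set_power_cap_watts(220)  # placeholder wattage, not a recommendation
        ```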

        The prices don’t quite match up, though, since it sits between the 5070 and the Ti (although in the US it’s often more expensive for some reason).

        The problem is that AMD is selling the chips to OEMs at a price that’s too high for them to sell at MSRP, while only giving a discount for small batches of MSRP models. It becomes a lottery where the quickest people can get the $600 models by refreshing ever-rarer restocks.

        One of the reasons is… tariffs, but I’m not sure how Nvidia got the prices down on its models.