• laenurd@lemmy.lemist.de

    “Retard” who bought Nvidia here.

    I know it’s 4chan banter and generally agree with anon’s points, but here goes:

    • ROCm wasn’t a thing when I bought. You need(ed) NVIDIA for machine learning and other GPGPU stuff
    • I have yet to hear from anyone with an 8GB card who maxes out that memory in current-gen games at 1080p
    • apart from frame generation, you DO get DLSS 3 features on 3000-series cards
    • PrivateNoob@sopuli.xyz

      “Based” who bought AMD here.

      ROCm is still in its infancy. It literally isn’t supported on my 6700 XT, so I had to go back to Google Colab to work on my AI thesis project.

      • gaiussabinus@lemmy.world

        ROCm support makes me angry, but NVIDIA fumbled their drivers too. There is no good option, so pick your poison. I run ROCm right now with a workaround on my 6900 XT to get the card detected (a sketch of the usual one follows), and I have also gone from 10 it/s to 4 or even 2 with updates. Shit sucks.
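
        That workaround isn’t named, but the commonly cited one for RDNA2 cards missing from ROCm’s official support list is the HSA_OVERRIDE_GFX_VERSION environment variable; that is an assumption on my part, not something the comment confirms. A minimal HIP device-query sketch to check whether the card shows up:

        ```cpp
        // Minimal sketch: list the GPUs ROCm can see, assuming the common
        // HSA_OVERRIDE_GFX_VERSION workaround (the comment doesn't name one).
        // Run as, e.g.:  HSA_OVERRIDE_GFX_VERSION=10.3.0 ./devquery
        // (10.3.0 spoofs the officially supported gfx1030 target for RDNA2.)
        #include <hip/hip_runtime.h>
        #include <cstdio>

        int main() {
            int count = 0;
            if (hipGetDeviceCount(&count) != hipSuccess || count == 0) {
                std::fprintf(stderr, "no ROCm-visible GPU found\n");
                return 1;
            }
            for (int i = 0; i < count; ++i) {
                hipDeviceProp_t prop;
                hipGetDeviceProperties(&prop, i);
                // gcnArchName reports the gfx target ROCm resolved for the card
                std::printf("device %d: %s (%s)\n", i, prop.name, prop.gcnArchName);
            }
            return 0;
        }
        ```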

    • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world

      ROCm wasn’t a thing when I bought. You need(ed) NVIDIA for machine learning and other GPGPU stuff

      Same for me: I had to buy an Alienware laptop with an NVIDIA GPU during my PhD for some GPGPU coding, since CUDA was pretty much the only choice back then and OpenCL was a joke in terms of performance and wasn’t getting much love from GPU manufacturers. But right now I know for sure I won’t ever buy an NVIDIA GPU again. ROCm works wonderfully well even on an APU (in my case, a Radeon 680M integrated GPU), and it’s also future-proof, since you’re almost writing CUDA code: if you ever switch back to an NVIDIA GPU, you’ll mostly just have to replace “hip” with “cuda” in your code, plus some magic constants (the warp length in particular). A sketch of what that looks like is below.
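
      To make the “hip” to “cuda” point concrete, here is a minimal HIP sketch of my own, not the commenter’s code: renaming the hip* runtime calls to their cuda* counterparts yields valid CUDA, and the main “magic constant” to watch is the warp/wavefront length (32 on NVIDIA, 64 on AMD’s GCN/CDNA parts).

      ```cpp
      // Minimal HIP sketch; the CUDA port is essentially a rename.
      #include <hip/hip_runtime.h>   // CUDA: <cuda_runtime.h>
      #include <cstdio>
      #include <vector>

      __global__ void scale(float* x, float a, int n) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;   // identical in CUDA
          if (i < n) x[i] *= a;
      }

      int main() {
          const int n = 1024;
          std::vector<float> h(n, 1.0f);
          float* d = nullptr;
          hipMalloc(&d, n * sizeof(float));                                  // CUDA: cudaMalloc
          hipMemcpy(d, h.data(), n * sizeof(float), hipMemcpyHostToDevice);  // CUDA: cudaMemcpy
          scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);                       // identical launch syntax
          hipMemcpy(h.data(), d, n * sizeof(float), hipMemcpyDeviceToHost);
          hipFree(d);                                                        // CUDA: cudaFree
          // Caveat: warpSize is 32 on NVIDIA but 64 on AMD GCN/CDNA wavefronts,
          // so query it at runtime rather than hard-coding either value.
          std::printf("h[0] = %f\n", h[0]);
          return 0;
      }
      ```

      (AMD’s hipify-perl and hipify-clang tools automate the same rename in the CUDA-to-HIP direction.)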

    • RealFknNito@lemmy.world

      Don’t put it in quotes. You know what you did.

      8GB of VRAM is fine for right now. Obviously nobody is complaining at 1080p, but not a lot of people are buying top-of-the-line GPUs for standard 1080p. I run 1440p, and I know a good chunk of people hitting the 240 FPS limit who are now upgrading resolutions. 8GB of VRAM is like 16GB of RAM: most modern titles will work flawlessly, but that isn’t going to be the case in the very near future, especially if you plan on upgrading resolution.

      DLSS isn’t something to scoff at, I’ll give green credit there, but relying on it to make up for the shortcomings seems like cope. They made a stupid call in order to slash prices and you’re going to feel it later even with DLSS.

      If you need NVIDIA for niche applications to do work and make money, get that bread, king. But if you’re just a gaming enthusiast, there’s no reason to subject yourself to a company that fucking hates you.

  • CheeseNoodle@lemmy.world

    “Not get dlss3.0 features on your 30 series”
    But DLSS 3.0 features do work on the 30 series, and on the 20 series. The only thing locked out is frame generation.

  • boletus@sh.itjust.works

    Buy whatever card you need for your use case. Both are fine.

    For me, as a gamer, DLSS is good shit, and nothing really beats it right now. I also like using RTX in single-player games; I only expect 60-90 FPS from games anyway.

    I’m a game developer, and I benefit from using an NVIDIA card: I get greater access to current standard APIs and graphics features, hardware acceleration for light baking, the option to use the tensor cores to learn how to write for them, and generally better compatibility with dev tools.

    NVIDIA cards also tend to hold their value better, at least down under.

  • RealFknNito@lemmy.world

    I get the people who buy Nvidia because of the niche use case of AI or they have to use specific software for their work. I also get the people who are unable or unwilling to learn enough about computers to give a shit who makes their GPU. Those people get a pass. For now.

    However, anyone who willingly chooses to support the PhysX, G-Sync, Hairworks, teraflop proprietary clusterfuck palooza does not. It makes me physically ill any time I see them flop their limp dick on the table and exclaim that they paid for their dogshit tools to be in yet another game, so they can claim their overpriced bullshit is worth it because it technically works 6% better on that game you want. You don’t give a shit why it does, but it does. The walled garden quickly becomes a prison.

    So yeah, I buy AMD. Viva la revolution; fuck anti-consumer bullshit.

    • Polar@lemmy.ca

      I want the best bang for my buck. I don’t want old-ass shitty FSR or AMD’s lagging ray tracing. Also, RTX Voice is crazy, and NVIDIA ShadowPlay, which has been around forever, is so handy.

      When AMD catches up, I’ll switch. I’m not brand loyal. I just want to play games and use the latest technology.

      • Dudewitbow@lemmy.ml

        Best bang for the buck and caring about exclusive features are contradictory goals. The brand with more features always costs more, and it always tries to tilt the market in its favor to keep competitors from gaining ground, whether or not it has the better product in a given price segment.

        E.g., anyone who bought a 3050 for 30% more than a 6600, when the latter was around 15% faster on average, in a performance tier where ray tracing is usually not viable anyway. Paying 30% more for roughly 13% less performance works out to about two-thirds of the performance per dollar.

      • RealFknNito@lemmy.world

        Why care about ray tracing? I get that it looks pretty for cinematic shots, but in actual games it tanks performance so hard I’d literally never turn it on. Plus, with UE5 shipping with Lumen, it doesn’t even seem like it can compare, but I digress.

        I’m not brand loyal either; I just hate anti-consumer shit. Apple does it too. But suggesting NVIDIA is giving you more bang for your buck… No sir or madam, that’s absolutely not the case. AMD is and has always been the price-to-performance option.

        NVIDIA will edge out the very top end on performance, sure, but you’re overpaying by a lot. Some people have that kind of money, but I sure as shit don’t. AMD has caught up, has made solid competitors to the proprietary technologies, and you can even use them without an AMD card.

        AMD has an in-house instant-replay capture like ShadowPlay (Radeon ReLive), but I’m unfamiliar with RTX Voice. It won’t be a perfect switch, obviously, but if you do any poking around on value metrics for cards, you’ll see I’m not bullshitting you.

        • Polar@lemmy.ca

          Why care about ray tracing? I get that it looks pretty for cinematic shots, but in actual games it tanks performance so hard I’d literally never turn it on.

          I don’t know what games you’re playing, but when I turn ray tracing on, I go from 400 FPS to 120 FPS. Not sure why I’d ever want to disable ray tracing to get 400 FPS when 120 FPS is plenty.

          But suggesting NVIDIA is giving you more bang for your buck… No sir or madam, that’s absolutely not the case.

          AMD may be slightly cheaper, but you get worse ray tracing, FSR instead of DLSS, no RTX Voice, and no good software like ShadowPlay…

          All of those features are worth a bit more money. So yes, you get the best bang for your buck. Not sure why I’d want to “save” a bit of money if I am going to lose out on a ton of software features and a generational leap in ray tracing.

          • RealFknNito@lemmy.world

            So you don’t realize how going from 400 to 120 might impact you if you start at, say, 150 instead? The same proportional hit would drop 150 to around 45. It’s still a massive performance loss. If ray tracing is an essential feature to you, have at it; I just personally never cared so much about lighting that it was the sole or even main reason for getting a card.

            My brother, you’re paying for features, yes, including ones you may not want or use. NVIDIA is charging you for ShadowPlay in the cost of the card; AMD just includes a nameless feature that does the exact same thing in the settings. Everything you’re using has an AMD counterpart or a FOSS alternative. You are not limited by the hardware; you just like the convenience of NVIDIA streamlining it, but you should at least be aware and honest about that.

            You’re willing to pay extra for them to make you a nice dinner, but most people cook their own food because they can’t afford to eat out every night. Yeah, your burger isn’t going to be as good as theirs, but you’re getting a solid alternative for less. AMD gives you the best value, NVIDIA gives you the cutting edge, and if AMD keeps making power moves like Threadripper, that might not always be the case.

            • Polar@lemmy.ca

              Shut the fuck up about FOSS. Lemmy users are insufferable.

              Enjoy your generation-behind AMD card with fewer features for a few bucks less. I’ll keep my NVIDIA card that’s generations ahead for a bit more money. Thanks.

              • RealFknNito@lemmy.world

                Lmao, keep being a slave to proprietary dogshit (it’s not generations ahead) and enjoy overpaying for less. I can only try to help with advice; can’t teach fools.

                • Polar@lemmy.ca

                  AMD ray tracing and FSR are literally last gen…

                  Enjoy lying to yourself to feel better.

                  I don’t understand what you mean by overpaying for less. I literally listed the stuff NVIDIA has that AMD doesn’t. Sorry you can’t read; can’t teach the illiterate.