Aside from FPS, is there any difference in the quality of ray tracing between Nvidia and AMD, or is it the same? (Like how they say DLSS is better than FSR.)

  • Vinny_93@lemmy.world · 8 months ago

    DLSS runs on Tensor cores, which are only available on Nvidia GPUs. FSR works on anything. This means that DLSS is more specialised and, if implemented properly in a game, will work better.
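
    To make that concrete, here’s a minimal C++ sketch of the decision a game’s settings code effectively makes. All the names here are hypothetical, not any real engine’s or SDK’s API:

    ```cpp
    // Hypothetical sketch (not any real game's code) of why "FSR works on
    // anything" matters: DLSS needs Nvidia Tensor-core hardware, while FSR
    // is plain shader code, so it serves as the universal fallback.
    #include <cstdio>
    #include <string>

    enum class Upscaler { DLSS, FSR };

    struct GpuInfo {                 // hypothetical adapter description
        std::string vendor;          // "nvidia", "amd", "intel", ...
        bool hasTensorCores = false; // DLSS requirement (RTX-class hardware)
    };

    Upscaler pickUpscaler(const GpuInfo& gpu) {
        if (gpu.vendor == "nvidia" && gpu.hasTensorCores)
            return Upscaler::DLSS;   // specialised path, Nvidia only
        return Upscaler::FSR;        // shader-based, runs on any GPU
    }

    int main() {
        GpuInfo rtx{"nvidia", true};
        GpuInfo radeon{"amd", false};
        std::printf("RTX card    -> %s\n", pickUpscaler(rtx) == Upscaler::DLSS ? "DLSS" : "FSR");
        std::printf("Radeon card -> %s\n", pickUpscaler(radeon) == Upscaler::DLSS ? "DLSS" : "FSR");
        return 0;
    }
    ```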

    GSync only works on Nvidia cards and GSync monitors, whilst FreeSync works on FreeSync and GSync monitors with any GPU.

    Now, ray tracing runs on RT cores on Nvidia, and AMD has something similar (its Ray Accelerators). The key difference from the former technologies is that ray tracing doesn’t have an Nvidia or AMD version; the tech is part of the DirectX 12 Ultimate specification, and Vulkan has equivalent ray tracing extensions. Both GPU makers support DX12, so they use the same software interface for ray tracing.
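
    For example, a game asks DirectX whether ray tracing is available rather than talking to Nvidia or AMD directly. A minimal C++ sketch of that vendor-agnostic check, assuming the Windows D3D12 SDK headers and linking against d3d12.lib:

    ```cpp
    // Query the DXR (DirectX Raytracing) tier through the standard D3D12 API.
    // The same call works on any vendor's GPU; only the driver/hardware
    // behind it differs.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        // Create a device on the default adapter, whichever vendor made it.
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No D3D12 device available");
            return 1;
        }

        // Ask the driver which ray tracing tier it implements.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &options5, sizeof(options5))) &&
            options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
            std::puts("DXR supported (tier 1.0 or higher)");
        } else {
            std::puts("DXR not supported on this adapter");
        }
        return 0;
    }
    ```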

    The fact of the matter is that Nvidia’s RT cores are more effective than the ones AMD utilises. AMD usually compensates by simply adding more of them.

    In the end, it all hangs on implementation. In some games, AMD will be better because the devs have optimised for AMD GPUs; in most games, Nvidia will be better. I suggest looking up benchmarks, with and without ray tracing, for the games you play.

    • ninjan@lemmy.mildgrim.com · 8 months ago

      To clarify, for the purpose of answering OP’s question: the quality will be the same, because it’s the same code in both cases. But the performance, as in how many FPS you get, will most often differ.

    • Justin@lemmy.dbzer0.com · 8 months ago

      GSync only works on Nvidia cards and GSync monitors

      I have a Freesync monitor (MSI) with an Nvidia RTX 3060, and Nvidia control panel gives me the option to “enable support for unverified displays” or something. Works just fine for me?

      • TheGrandNagus@lemmy.world · 8 months ago

        That’s FreeSync, although Nvidia, confusingly, calls it G-Sync, just like their other frame sync tech.

        G-Sync required an expensive module in the display; FreeSync doesn’t.

        Nvidia lost the G-Sync vs FreeSync battle, but because of their marketing chops, they managed to get away with just slapping their name on it and going with the open solution.

        DLSS has been much more successful, but what happened with G-Sync would be like them adopting FSR and simply rebranding it as DLSS.

      • Miss Brainfarts@lemmy.blahaj.zone · 8 months ago

        That just means your Nvidia card can make use of a Freesync monitor; there’s no “real” Gsync happening there.

        Actual Gsync comes with a dedicated hardware module in the monitor, and it used to be only compatible with Nvidia cards, but that’s also not the case anymore.

        • Justin@lemmy.dbzer0.com · 8 months ago

          So how does it work? Is it a software layer that fakes G-Sync behavior? Something like V-Sync?

          • Miss Brainfarts@lemmy.blahaj.zone · 8 months ago

            It’s pretty much just leveraging the open VESA Adaptive Sync standard, which AMD Freesync is, practically speaking, a rebrand of. It’s indeed purely software, which makes it vendor-agnostic.

            Well, unless the vendor locks it down/blocks it on purpose, which is what Nvidia has done up until… whenever Gsync Compatible became a thing.
            (Misleading name imo, because as said before, there’s no actual Gsync running)
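
            To illustrate the mechanism, here’s a toy simulation in C++ (not actual driver code) of what an adaptive sync display does: it refreshes when the next frame is ready, as long as the frame time stays inside the panel’s supported window. The 48–144 Hz range is just an assumed example:

            ```cpp
            // Toy model of VESA Adaptive Sync: the scanout interval tracks the
            // GPU's frame time, clamped to what the panel can physically do.
            #include <algorithm>
            #include <cstdio>

            int main() {
                // Hypothetical 48-144 Hz FreeSync panel.
                const double minIntervalMs = 1000.0 / 144.0; // fastest refresh
                const double maxIntervalMs = 1000.0 / 48.0;  // slowest refresh

                // Simulated GPU frame times in milliseconds.
                const double frameTimesMs[] = {7.0, 12.5, 16.7, 25.0};

                for (double frameMs : frameTimesMs) {
                    // Inside the window the refresh simply follows the frame;
                    // outside it the driver falls back to repeats (or tearing).
                    double refreshMs = std::clamp(frameMs, minIntervalMs, maxIntervalMs);
                    std::printf("frame %.1f ms -> refresh after %.1f ms (~%.0f Hz)\n",
                                frameMs, refreshMs, 1000.0 / refreshMs);
                }
                return 0;
            }
            ```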

    • Lojcs@lemm.ee · 8 months ago

      Couldn’t there be a difference between denoising algorithms if those are baked into the drivers?

      • Vinny_93@lemmy.world · 8 months ago

        Most likely. Bottom line, it’s a total-package kind of deal. If it were just one or two components, it’d be pretty easy to improve. It’s the synergy between the graphics API, the implementation in the game, the GPU, the GPU driver, the CPU, the motherboard chipset, etc., all working together.