• Davel23@fedia.io · 10 days ago

        It’s replacing GeForce Experience. The nVidia Control Panel is still around.

        • subignition@fedia.io · 10 days ago

          The cost of needing an account to get “easy” driver updates always seemed a bit high to begin with. I never really found its game optimization profiles useful either.

          • Blackmist@feddit.uk · 10 days ago

            Yeah, I disabled those back when I noticed World of Warcraft started performing badly. GFE had helpfully optimised it to run at a resolution 4x higher than my screen and downscaled it…

          • PM_Your_Nudes_Please@lemmy.world · 10 days ago

            The profiles can be nice for setting most things, but having it default all of your games to Fullscreen instead of Borderless Windowed (with no way anywhere in the program to change that default) should be fucking criminalized.

        • bean@lemmy.world · 8 days ago (edited)

          They removed the forced login too. Which was welcome imho. It’s why I tolerate it now. Just for driver updates. I use none of the other features. Sometimes I wish stuff would stay in its lane.

          • Psythik@lemmy.world · 8 days ago

            GFE’s only useful purpose is ShadowPlay. Use NVCleanstall instead to update your drivers; that way you can remove unnecessary features and stop the privacy-invading telemetry.

        • Blackmist@feddit.uk · 10 days ago

          GFE was terrible because it always forgot my login and fuck if I’m going to remember a password just to update drivers.

          At least they’ve done away with that bit.

          • Psythik@lemmy.world · 8 days ago (edited)

            If updating drivers is the only thing you use the nVidia app for, then why not use NVCleanstall instead?

          • Pooptimist@lemmy.world · 10 days ago

            Serious question from someone who only recently moved to PC gaming: Why can it be ignored? Isn’t that where you get the latest drivers? Or are you downloading and installing them manually?

            • You can download them manually if you want. Updated drivers are rarely that important for performance. Maybe for newer games, but not for 98% of what’s already out there.

              And they also mess things up occasionally. Like all those Minecraft performance mods that had to change how the game identified itself to the driver, because if it looked like Minecraft the driver would “tune” itself and get worse performance instead of better.

            • I Cast Fist@programming.dev · 10 days ago

              You don’t need to update your drivers every time a new version comes out; some games can actually get worse performance with a newer driver. I personally had problems with No Man’s Sky: nvidia drivers over version 424, I think, made the game effectively unplayable, while versions like 416 kept the game and the framerate smooth throughout.

            • 1Fuji2Taka3Nasubi@lemmy.zip · 10 days ago

              A driver lets games interface with the graphics hardware, enabling accelerated performance, for example. This “app” provides additional functionality on top of that (I don’t know what exactly, but GeForce Experience, which it replaces, provided things like recording gameplay videos) which is not strictly required and, it seems, hurts gaming performance.

              As for getting the latest drivers, you can download them manually from nVidia’s website, or rely on Windows Update to give you reasonably recent drivers.
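              A quick way to see which driver you’re currently on, before deciding whether a manual update is worth it, is the `nvidia-smi` tool that installs alongside the driver. A minimal sketch (the fallback branch is only there so it runs on machines without an NVIDIA GPU):

              ```shell
              # Hedged sketch: report the currently installed NVIDIA driver version.
              # `nvidia-smi` ships with the driver on Windows and Linux; the fallback
              # string only exists so the snippet runs without an NVIDIA GPU present.
              if command -v nvidia-smi >/dev/null 2>&1; then
                version=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader)
              else
                version="unknown (nvidia-smi not found)"
              fi
              echo "Installed NVIDIA driver: $version"
              ```

              You can then compare that against the latest release listed on nVidia’s download page before bothering with an installer.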

              • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 8 days ago

                or rely on Windows update to give you reasonably recent drivers.

                Windows update: I see you just installed this driver from 3 weeks ago, let me just revert to a driver from 2021 for ya.

        • caut_R@lemmy.world · 10 days ago

          IIRC their plan is to get rid of the control panel once they‘ve carried all its functionality over to the app.

            • caut_R@lemmy.world · 10 days ago

              Tbh, the control panel is a lot of things, but responsive and slick aren’t among them. As long as they carry all the functionality over and get rid of the bugs, I’m happy with the app. Unless they pull a fast one and add account requirements back in later.

              • subignition@fedia.io · 9 days ago

                I mean, my point is there’s no reason they should be overhauling it entirely (at the cost of performance) when they could just pay some competent Windows programmers to un-shit the existing Control Panel. Yeah, its UI sucks, but it’s not going to make you drop frames just for having it open.

                • caut_R@lemmy.world · 9 days ago

                  IIRC the framework it’s built on is so ancient it didn’t allow for that; they needed to re-write the whole thing to “fix” it, and this is what they came up with. DF’s Alex said as much in one of their podcast episodes. All just paraphrased by me, of course.

                  I don’t think the performance hit is by design or intentional anyway, so hopefully the current screw-up is gonna be a nothing burger by the time the app’s mandatory (if it ever will be).

        • simple@lemm.ee (OP) · 9 days ago

          Neither of them are as good, especially if you factor in raytracing. DLSS Ray Reconstruction is basically required to not have a noisy image with RTX.

          • count_dongulus@lemmy.world · 4 days ago

            When I went team red for the first time earlier this year, I really scrutinized zoomed-in screenshots to compare FSR and DLSS upscaling. With FSR 3, I couldn’t see any difference compared to DLSS. Older FSR versions, yeah, but at least for me it’s not a problem any more.

          • potustheplant@feddit.nl · 9 days ago

            Ray tracing*

            RTX is a brand.

            Regardless, given the performance impact and how few games actually have ray tracing implemented correctly, it makes more sense to just disregard ray tracing altogether.

            It’s an undercooked technology used to push more expensive products, nothing more.

            Regarding DLSS vs FSR and XeSS: yes, DLSS has better quality, but it’s also proprietary, so I honestly do not care about it. Just like G-Sync died, DLSS will eventually die as well.

            • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 9 days ago

              Just like gsync died

              (True) G-Sync isn’t dead; it’s only in the highest end of monitors, which is basically where it’s always been. It only “died” because it requires an expensive module, whereas adaptive sync is built into practically every modern display controller, so it’s essentially free.

              • potustheplant@feddit.nl · 9 days ago

                The proprietary G-Sync approach with a dedicated hardware module is indeed dead, and most “g-sync” monitors now just use VESA’s fairly common VRR (aka FreeSync).

                However, I did research a bit and found some “G-Sync Pulsar” monitors, but none have been released yet, I believe. They do sound like unnecessary, overpriced products though. That’s Nvidia for ya.

      • Zetta@mander.xyz · 10 days ago

        ROCm works mostly well as a replacement for CUDA, and it gets better and better every year.

    • Lemminary@lemmy.world · 9 days ago

      I thought I was happy I went AMD, until my card started overrunning its fans for no reason a month after the warranty ran out. I had to manually reseat the card in the PCIe slot for it to stop, because nothing else would, not even restarting the PC. And then one day it heated up so badly it stopped working. I think they gave me a defective card on purpose, because people are less likely to return items when they’re buying from outside the US.

      I’ve since gone back to Nvidia and my current card hasn’t given me any issues. What a nightmare that was.

      • OrderedChaos@lemmy.world · 8 days ago

        I swear that in my 20+ years of computer work, everyone has a story like this for every brand out there. It seems to literally be bad luck. That being said, some companies just have abysmal, evil support ethics. And these days it seems all of them are trying to dial in device failure to happen right after the warranty expires.

          • OrderedChaos@lemmy.world · 8 days ago

            I think that can be true in many situations. I have had sincere failures that on the surface sound like incompetence. It is possible for things to fail so spectacularly it sounds like fiction.

  • Viri4thus@feddit.org · 10 days ago

    Getting ready to “motivate” people to get the 5xxx series because the current cards “have issues now”. The more you buy the more you save!

  • caut_R@lemmy.world · 10 days ago (edited)

    That’s certainly something they’re gonna want to fix. I hope DF and GN pick up on this; seems like free views, and I’d love to hear what they’ve got to say on the matter.

    Edit: Also wondering if it’s the app itself, or if the performance hit disappears when you disable the overlay. I only skimmed the article to see which games are affected and how badly, so my bad if that’s mentioned.

    Edit 2:

    HUB’s Tim tested it and found that it’s the overlay, or rather the game-filter portion of the overlay, causing the performance hit. You can disable this part of the overlay in the app’s settings, or disable the overlay altogether.

    He also found that this feature wasn’t impacting performance on GeForce Experience, so it’s very likely a bug that’s gonna be fixed.

    To clarify: actively using game filters can have an impact on either, but right now they cause a performance hit even when not in use, just by the functionality being enabled; a bug.

    The only outlier, where just having the app installed hit performance, was the Harry Potter game.

    • sp3ctr4l@lemmy.zip · 8 days ago (edited)

      Here’s the quote, for people allergic to reading the update in the article:

      Update: Nvidia sent us a statement: “We are aware of a reported performance issue related to Game Filters and are actively looking into it. You can turn off Game Filters from the NVIDIA App Settings > Features > Overlay > Game Filters and Photo Mode, and then relaunch your game.”

      We have tested this and confirmed that disabling the Game Filters and Photo Mode does indeed work. The problem appears to stem from the filters causing a performance loss, even when they’re not being actively used. (With GeForce Experience, if you didn’t have any game filters enabled, it didn’t affect performance.) So, if you’re only after the video capture features or game optimizations offered by the Nvidia App, you can get ‘normal’ performance by disabling the filters and photo modes.

      So, TomsHW (or at least it claims so) did indeed test this, and found that it’s the filters and photo mode causing the performance hit.

      Still a pretty stupid problem to have, considering the old filters did not cause this problem, but at least there’s a workaround.

      … I’m curious whether this new settings app even exists on Linux, or has been tested there.

    • eramseth@lemmy.world · 10 days ago

      Yeah they didn’t test that. Nor did they test having the app installed but not running. Crummy article tbh.

      • ArbiterXero@lemmy.world · 10 days ago

        Disagree, and I don’t think that’s the point.

        As an average user, why am I paying a performance hit for nvidia’s own “recommended parameters”?

        That’s trash and a terrible experience, and they should be called out for it.

        • eramseth@lemmy.world · 10 days ago

          I don’t think you’re understanding. The testing they did was presumably fine, and the performance hit is probably unacceptable. But mentioning, yet not testing, the scenarios of

          • app installed but not running
          • app installed and running but overlay turned off

          is kinda mailing it in.

          • ArbiterXero@lemmy.world · 10 days ago

            I’ll give you that, yep, sure.

            But that doesn’t invalidate the data they did get, it’s just not a full picture.

    • Vik@lemmy.world · 10 days ago (edited)

      Yep, it uses CEF, though many popular desktop apps do so without much perf impact.

      • rdri@lemmy.world · 10 days ago

        It’s not CEF itself that causes most of the impact; it’s the content web devs make it load and process. And web devs generally not being very good at optimizing is just a sad reality.

        • merthyr1831@lemmy.ml · 8 days ago

          Web devs aren’t ignorant of optimization, but the kinds of interfaces used on the web are very different from those of the desktop. Cross-platform technologies can work, but anything built on top of web engines is going to be a little dogshit on native platforms.

          Web tech was designed around the asynchronous and comparatively slow nature of the network. Now those same layout and rendering engines are being shoehorned into an environment where the “server” is your local disk, so they’re suddenly doing a bunch of work that was intended to be done incrementally.

          Same goes the other way, of course. Software designed for “native first” experiences like Flutter aren’t as popular in web dev because they work on that same, but reversed, assumption of a local disk being your source.

          It would be like wondering why physical game discs aren’t popular on PC: it’s a fundamentally different technology for fundamentally different expectations and needs.

          • rdri@lemmy.world · 8 days ago

            but anything built on top of web engines is going to be a little dogshit on native platforms.

            Hard disagree on “little”.

            Software designed for “native first” experiences like Flutter aren’t as popular in web dev because they work on that same, but reversed, assumption of a local disk being your source.

            Popularity shouldn’t be dictated by what web devs prefer. As long as they build for desktop, I won’t pardon excessive resource usage. And I’m not talking about Flutter; better performance-oriented frameworks exist, see Sciter.

  • NoForwardslashS@sopuli.xyz · 9 days ago

    Serious question: what is the benefit of Shadowplay now?

    I used to use it for all game recording, but Windows Game Bar and Steam have both implemented that functionality now.

      • FuryMaker@lemmy.world · 8 days ago

        Just out of interest, lower your mouse polling rate to see if it still happens.

        Not an ideal solution obviously.

        I used to have hitching like this.

        • glitches_brew@lemmy.world · 8 days ago

          I had initially lowered it a bit at some point. I didn’t realize it was Steam recording for a while, and spent a day or two trying driver updates and various things. Next time I have a chance, I’ll try a significant decrease just for testing.

    • Rai@lemmy.dbzer0.com · 10 days ago

      How has it been such a pain? I haven’t even thought about my GPU once since I installed it… but I only use regular drivers.

  • merthyr1831@lemmy.ml · 8 days ago

    I know people complain about Nvidia and Linux but one of the best parts of my experience with it was never having to deal with GFE. Just a bunch of project managers trying to make themselves useful by shovelling needless slop into your GPU driver.

  • Katana314@lemmy.world · 10 days ago

    I used to use this only for game recording. But ever since I upgraded my monitor, it developed a glitch where games record with a red tint. Thankfully, every gaming helper app seems to feature recording now, so I just switched to another.

    • Appoxo@lemmy.dbzer0.com · 9 days ago

      Sounds like something adjusted a setting in the nvidia control panel and the monitor is balancing it out with a low red value.
      Might be worth taking a look.