• thatonecoder@lemmy.ca · 5 hours ago

    The problem is that it’s being used as an excuse not to optimize, when it should be used to prolong the lifespan of computers, mostly older gaming rigs. If developers focused on optimization instead of rushing things out, a GTX 1080 Ti could probably handle AAA games at 1440p, high settings, at a minimum of 60 FPS, and 140+ FPS with DLSS on quality. Keep in mind that I don’t blame most developers, but rather the big corporations, which have partnerships with companies like Nvidia, who obviously want people constantly buying new GPUs.

    • Overspark@feddit.nl · 2 hours ago

      GTX cards don’t have the hardware to do DLSS though, so unfortunately this is impossible.

      • thatonecoder@lemmy.ca · 37 minutes ago

        Nonetheless, I think it would be possible to modify these cards to add an upscaling chip. But that would take effort, which no company will ever put in.

      • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 1 hour ago

        I was gonna say my 1660 Super is still able to do that in most modern games without DLSS (or FSR). In fact, most of the time turning on the AI upscaling makes things run worse, and I don’t even understand why. But when two games release in the same month on the same engine, and one runs great maxed out while the other putters along at 30-40 FPS on low settings with the upscaling off, that tells me one of them is using DLSS/FSR as a crutch.

    • Fubarberry@sopuli.xyz · 4 hours ago

      Yeah, I really like DLSS/FSR/etc. for letting newer games run on old systems. But I don’t feel like it should ever be necessary for modern hardware to run them well.

      Ray tracing in general is a big culprit here; it carries such a high performance hit. That was fine back when ray tracing was optional, but we’re increasingly seeing games with mandatory ray tracing now. Indiana Jones and the upcoming Doom: The Dark Ages requiring it for lighting is a mistake imo; computer hardware in general isn’t really ready for it to be the default.

      • thatonecoder@lemmy.ca · 40 minutes ago

        Ray tracing is useless (unless it’s for animated movies or films that use CGI). Regular lighting is a lot better for performance, and it looks about 80% as good as ray tracing. I use a really bad laptop, yet I can still get 30 to 60 FPS in decently optimized games.