  • Sunsofold@lemmings.world · 7 points · 6 hours ago

    The NASA computers were among the most advanced computing machines of their day, built by engineers with cutting-edge technology. Chrome is a web browser: an absurd behemoth meant to display everything from a static page from twenty years ago to a dynamically assembled webapp built on frameworks even the app’s creator doesn’t know one tenth of, but still has to import. And the whole thing is built to spy on what you do while you surf for cat pics and pussy pics for the ten trillionth time, feeding Google’s monopoly.

    Not even apples to oranges. Apples to the lump formerly known as the planet Pluto.

  • Sam_Bass@lemmy.world · 11 points · 15 hours ago

    Back in ’69, more people were carrying the load of logic in their heads. It’s been a double-edged sword of progress, with more responsibilities offloaded to automation.

  • UnderpantsWeevil@lemmy.world · 86 points (3 down) · edited · 1 day ago

    STOP. DOING. UX.

    Computers were meant to do math, not make pictures.

    Trillions of pixels illuminated, but no real-life benefit has been discovered.

    GUI. Ray Tracing. Generative Adversarial Networks and Diffusion Models.

    Terms dreamed up by the deranged.

    They are playing you for fools!

  • WanderingThoughts@europe.pub · 14 points · 1 day ago

    The guys who went to the moon were engineers, highly trained to use the computer. We can only dream of having users half as competent.

    • chiliedogg@lemmy.world · 9 points · 1 day ago

      After separating from the Command Module for lunar descent, Apollo 14’s crew discovered a faulty abort switch on the lunar module, which required Alan Shepard and Edgar Mitchell to reprogram the lunar module’s computer in lunar orbit.

  • dan@upvote.au · 13 points (2 down) · 1 day ago

    Unused RAM is wasted RAM. Apps like Chrome will use RAM when it’s available, but they should release it for other apps to use when memory pressure is high.

    It’s the same with disk caching. If you have a lot of free RAM, the OS will use all of it for caching files.
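
    A minimal sketch of that bookkeeping, assuming Linux and the third-party psutil Python package: the memory the OS reports as “available” includes file cache it can reclaim at any moment, so cached RAM isn’t really “used”.

        import psutil  # third-party: pip install psutil

        mem = psutil.virtual_memory()
        gib = 2**30
        print(f"total:     {mem.total / gib:5.1f} GiB")
        print(f"free:      {mem.free / gib:5.1f} GiB")       # truly untouched RAM
        print(f"cached:    {mem.cached / gib:5.1f} GiB")     # file cache (Linux-specific field)
        print(f"available: {mem.available / gib:5.1f} GiB")  # free + reclaimable cache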

    • MonkderVierte@lemmy.ml · 3 points · edited · 5 hours ago

      No, the reason browsers use so much RAM is that every tab is its own process and sandbox. That, and lazy handling of content.

      Edit: apparently I overestimated the overhead of a process and sandbox per tab? So it’s more the lazy handling, i.e. keeping pictures in RAM instead of pushing them to cache?

      • dan@upvote.au · 2 points · 5 hours ago

        Sandboxing does use some RAM, but it was a big win for security: one site can’t crash the entire browser or use a security hole to get at data in other tabs. Still, the majority of the RAM is taken by the site itself. The processes do share some RAM; they’re not entirely isolated.
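
        To illustrate that last point, here’s a toy sketch of two isolated processes sharing one block of RAM, using Python’s multiprocessing.shared_memory (not Chrome’s actual mechanism):

            from multiprocessing import Process, shared_memory

            def child(name):
                shm = shared_memory.SharedMemory(name=name)  # attach to the parent's block
                print(bytes(shm.buf[:5]))                    # prints b'hello'
                shm.close()

            if __name__ == "__main__":
                shm = shared_memory.SharedMemory(create=True, size=1024)
                shm.buf[:5] = b"hello"
                p = Process(target=child, args=(shm.name,))
                p.start(); p.join()
                shm.close()
                shm.unlink()  # release the shared block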

    • The Quuuuuill@slrpnk.net · 20 points · 1 day ago

      my problem with certain programs, chrome included, is they tell the os “no, you can’t have this ram back. i’m using it”

      i understand the logic of your argument, but it’s never played out that way in practice

      • dan@upvote.au · 6 points · edited · 1 day ago

        In some cases the RAM really is in use by the site, especially on sites with heavy client-side logic. Then it’s not Chrome’s (or Firefox’s) fault; it’s the website’s fault. If you hover over the tab, it should show memory usage in a popover.

        Chrome has a “Memory Saver” feature where it’ll unload tabs that are offscreen/hidden which helps quite a bit. Not sure if Firefox has something similar.
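
        The idea behind unloading is roughly this (a toy LRU sketch, not Chrome’s real code): cap the number of loaded tabs and discard the render state of the least recently used ones; a discarded tab simply reloads when you focus it again.

            from collections import OrderedDict

            class TabManager:
                def __init__(self, max_loaded=8):
                    self.max_loaded = max_loaded
                    self.tabs = OrderedDict()  # tab_id -> page state, LRU first

                def focus(self, tab_id):
                    self.tabs[tab_id] = "rendered"
                    self.tabs.move_to_end(tab_id)  # mark as most recently used
                    while len(self.tabs) > self.max_loaded:
                        victim, _ = self.tabs.popitem(last=False)  # evict LRU tab
                        print(f"discarded tab {victim}; it reloads on next focus")

            mgr = TabManager(max_loaded=2)
            for t in ["mail", "docs", "cats"]:
                mgr.focus(t)  # focusing "cats" discards "mail"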

  • 9point6@lemmy.world · 11 points · edited · 1 day ago

    My desktop with 64 GB, sitting idle with a web browser open:

    “You’ve got 32 GB to play with, chump.”

    • macniel@feddit.org · 6 points · 1 day ago

      You know those paper folders? Yeah, imagine that each flap is actually a screen.

      You’re welcome.

    • JayDee@lemmy.sdf.org · 10 points · 1 day ago

      That’s not why we were able to get Apollo 11 onto the moon using only 8 kilobytes. The real reason is that we used the most batshit sorcery mankind may ever know to eke out every last ounce of usefulness we could muster from those 8 kilobytes.

      • Gladaed@feddit.org · 2 points (1 down) · 15 hours ago

        There was some sorcery involved, yes. But that doesn’t mean it wasn’t a fundamentally easy problem. Orbital mechanics is the easiest and cleanest physics around. That’s why classical mechanics was so incredibly useful: it’s a near-perfect predictor for movement in the sky. There ain’t no friction, no nothing. Just clean positions, gravity, and propulsion.
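
        That cleanliness shows up directly in code. A minimal sketch (purely illustrative, nothing like the real AGC software): the entire force model for a spacecraft is one gravity term plus whatever thrust you apply.

            MU = 3.986e14  # Earth's gravitational parameter, m^3/s^2

            def step(pos, vel, thrust, dt):
                """One semi-implicit Euler step: gravity + propulsion, nothing else."""
                x, y = pos
                r3 = (x * x + y * y) ** 1.5
                vx = vel[0] + (-MU * x / r3 + thrust[0]) * dt
                vy = vel[1] + (-MU * y / r3 + thrust[1]) * dt
                return (x + vx * dt, y + vy * dt), (vx, vy)

            # circular low orbit: r = 6.771e6 m, v = sqrt(MU / r) ≈ 7.67 km/s
            pos, vel = (6.771e6, 0.0), (0.0, (MU / 6.771e6) ** 0.5)
            for _ in range(5500):  # roughly one orbital period at dt = 1 s
                pos, vel = step(pos, vel, (0.0, 0.0), dt=1.0)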

      • dan@upvote.au · 10 points · 1 day ago

        Games were impressive in this way too. Computers and consoles didn’t have much CPU power or memory, so developers had to squeeze out every little bit.

        This was still happening even with 5th-gen consoles. Crash Bandicoot couldn’t fit in the PlayStation’s memory, so the developers ended up overwriting system memory, including memory allocated to features of Sony’s standard library they weren’t using.

        These days, game development is more “boring” in that respect. Systems are powerful, and frameworks like Unreal Engine handle all the core stuff. That’s not necessarily a bad thing, though: it lets the game developers focus on the game itself.

        • frezik@midwest.social · 1 point · 9 hours ago

          And they had bugs that were a direct result of those limitations. The Minus World in Super Mario Bros., for example, comes from a combination of uninitialized values, how data structures are packed, and imperfect collision detection.

          People don’t talk about the problems that come from doing things that way.

          • dan@upvote.au · 1 point · 5 hours ago

            Most regular players didn’t encounter these bugs, though, as they’re often edge cases that don’t occur during regular gameplay. A lot of them were found by people intentionally looking for them.

            I’d argue that games today are buggier than games in the past, just due to how complex they are now. Sure, they’re a different class of bug (and arbitrary code execution via buffer overflows isn’t really a thing any more, thanks to ASLR and the NX bit), but I don’t think there are fewer bugs at all.

            • frezik@midwest.social · 1 point · 5 hours ago

              If you’ve played SMB a fair amount, there’s at least one that you’ve almost certainly run into at random. It’s exploited by speedrunners, but you’ve probably hit it just playing the game normally.

              Piranha Plants only check the hitbox every other frame. Obviously, this is a speed optimization. At some point, you’ve probably gone right through a piranha plant that should have hit you. Speedrunners can and do exploit this as well, of course.

              An extension of this idea in other games is split-screen multiplayer. In games like the OG Mario Kart, player inputs are processed on alternating frames, which means the game has an average of 0.5 extra frames of input latency in multiplayer before anything else gets calculated. (And people say retro games don’t have input lag on CRTs; those people are wrong for a lot of different reasons.)
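
              Both tricks boil down to branching on frame parity. A toy sketch of the idea (illustrative only, not the actual SMB or Mario Kart code):

                  def check_piranha_hitbox(frame):
                      print(f"frame {frame}: piranha hitbox checked")

                  def poll_input(player, frame):
                      print(f"frame {frame}: polling player {player}")

                  for frame in range(6):  # a few iterations of the main loop
                      if frame % 2 == 0:
                          # skipped on odd frames, so a fast sprite can
                          # slip through a piranha plant unharmed
                          check_piranha_hitbox(frame)
                      # P1 on even frames, P2 on odd: each player waits an
                      # average of 0.5 extra frames for input to register
                      poll_input(player=frame % 2, frame=frame)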

          • dan@upvote.au · 2 points · 5 hours ago

            I’m not a game developer so I just used the first example I could think of.

        • brian@lemmy.ca · 11 points · 1 day ago

          “it lets the game developers focus on the game itself”

          Downside to that is there aren’t a ton of people putting effort into efficiency and performance any more. And they sort of seem to be a dying breed at this point.

        • JayDee@lemmy.sdf.org · 2 points · 22 hours ago

          Anyone who wants to know more about the exact craziness in retro game code should read “Racing the Beam: The Atari Video Computer System” by Nick Montfort and Ian Bogost.

        • Feydaikin@beehaw.org · 3 points · 1 day ago

          Yet some of the most anticipated titles released today are streamlined, soulless, and boring. Every edge has been rounded off to such a degree that it makes Disney look gory.