There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • nossaquesapao@lemmy.eco.br · 6 months ago

    First of all, 350MB is a drop in a bucket

    People don’t run just a single app on their machines. If we triple the RAM usage of several apps, the result is a massive overall increase. That’s how bloat happens: it’s a cumulative increase across everything. If we analyze single cases, we could say they’re not that bad individually, but the end result is the need for a constant and rapid increase in hardware resources.
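
    A rough, back-of-the-envelope sketch of that cumulative effect (the per-app figures below are made up purely for illustration, not measurements):

    ```python
    # Hypothetical per-app resident memory, in MB, for a typical multitasking session.
    apps_then = {"office suite": 300, "browser": 500,
                 "music player": 80, "chat app": 120}

    # If each app roughly triples its footprint over the years...
    apps_now = {name: mb * 3 for name, mb in apps_then.items()}

    print(f"then: {sum(apps_then.values())} MB total")  # then: 1000 MB total
    print(f"now:  {sum(apps_now.values())} MB total")   # now:  3000 MB total
    ```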

    • Aux@lemmy.world · 6 months ago

      People don’t run just a single app on their machines

      That’s not bloat, that’s people running more apps than ever.

      the end result is the need for a constant and rapid increase in hardware resources.

      That’s not true. Machines with 8 to 16GB of RAM became common in the early 2010s, and barely anyone is using 32 gigs today. Even if we look at the most recent Steam Hardware & Software Survey, we will see that even gamers are pretty much stuck with 16 gigs. 32 gigs is installed in less than 30% of machines, and more than that in barely 4%. Ten years ago, 8 gigs was the most common option, with 12+ gigs (Steam didn’t have a 16GB category in 2014) in third place. The switch to 16 gigs being number one happened in December 2019, so we’re five years into 16 gigs being the most common option, and more RAM is not getting anywhere close to replacing it (47.08% for 16 gigs and 28.72% for 32 gigs as of May 2024).

      Now if you look at the late 90s and 2000s, you will see that RAM was doubling pretty much every 2-3 years. We can look at Steam data once again. Back in 2008 (the earliest data available on archive.org), 2 gigs was the most common option. The next year, the 3GB option got very close and sat in 2nd place. In 2010, 2GB, 3GB and 4GB were neck and neck. The 4GB option became the most common in 2011, with the 3GB variant a very close 2nd. The 5GB option became king in 2012. And the very next year, 8 gigs became the norm.

      So: 2 gigs in 2008, 4 gigs in 2011, and 8 gigs in 2013. You can check the historical data yourself here: https://web.archive.org/web/20130915000000*/http://store.steampowered.com/hwsurvey/
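
      A quick sketch of the growth rate those data points imply (the RAM figures are the ones cited above; the calculation is just a compound-growth formula):

      ```python
      import math

      # (year, most common RAM option in GB), as cited from the Steam surveys above
      points = [(2008, 2), (2013, 8), (2019, 16), (2024, 16)]

      # Doubling time between consecutive points: years * ln(2) / ln(growth ratio)
      for (y0, r0), (y1, r1) in zip(points, points[1:]):
          if r1 == r0:
              print(f"{y0}-{y1}: no growth in the most common option")
          else:
              doubling = (y1 - y0) * math.log(2) / math.log(r1 / r0)
              print(f"{y0}-{y1}: doubling roughly every {doubling:.1f} years")

      # 2008-2013: ~2.5 years, 2013-2019: ~6.0 years, 2019-2024: no growth
      ```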

      • nossaquesapao@lemmy.eco.br · 6 months ago

        That’s not bloat, that’s people running more apps than ever.

        Not necessarily. People were already writing text documents while looking up references on the internet, listening to music, and chatting with friends at the same time in 2010, and even earlier, but the same use case (office suite + browser + music player + chat app) takes far more resources today, with only a small increase in usability and features.

        Bloat is a complicated thing to discuss, because there’s no hard definition of it, and each person thinks about it in a different way; what one person considers bloat, someone else may not, and we end up talking about different things. You’re right that hardware resources have been increasing at a slower rate, and that may force some more optimization, but a lot of software is still getting heavier without bringing new functionality.

        • Aux@lemmy.world · 6 months ago

          The software is getting heavier because of content, not code. Again, we can look at games. Take old games like GTA V or Skyrim: they fly on modern high-end machines! Now add mods with 8K textures, higher-definition models, HDR support, etc., and these old games will bring your RTX 4090 to its knees.
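
          As a rough illustration of how fast content alone grows, the uncompressed size of a single texture scales with the square of its resolution (real games use compressed formats and mipmaps, so treat these numbers as an upper-bound sketch):

          ```python
          # Uncompressed RGBA8 texture: width * height * 4 bytes per pixel
          def texture_mb(side_px, bytes_per_pixel=4):
              return side_px * side_px * bytes_per_pixel / 1024**2

          for label, side in [("1K", 1024), ("2K", 2048), ("4K", 4096), ("8K", 8192)]:
              print(f"{label} texture: {texture_mb(side):>4.0f} MB")
          # 1K: 4 MB, 2K: 16 MB, 4K: 64 MB, 8K: 256 MB
          ```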

          • nossaquesapao@lemmy.eco.br · 6 months ago

            Content is also getting heavier, but the two things aren’t mutually exclusive. It’s more objective to compare pieces of modern software with each other than to compare older and newer ones. Before Reddit created obstacles for third-party apps, those apps were famous for being much lighter than the official one, while doing the same things (some even had more features). Now, if we compare Lemmy to Reddit, it’s also much lighter, while providing a very similar experience. Telegram has a desktop app that does everything the web version does, and more, while being lighter on resources. Most Linux distros will work fine with far fewer hardware resources than Windows. If you install LineageOS on an older phone, it will perform better than the stock ROM, even while running a newer AOSP version. If you play a video on YouTube, and the same one in VLC, VLC will do the same job with fewer resources. And if you browse most sites with and without a content blocker, the blocked version will be lighter, without losing anything important.

            I could go on and on, but that’s enough examples. There is a bloat component to software getting heavier, and not everything can be explained by heavier content and more features.
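
            For anyone who wants to compare like for like, per-process resident memory is easy to sample. A minimal sketch using the third-party psutil library (the process names are only examples; substitute the apps you actually want to compare):

            ```python
            import psutil  # third-party: pip install psutil

            # Example process names; adjust for your own system and apps.
            targets = {"vlc", "firefox", "telegram-desktop"}

            for proc in psutil.process_iter(["name", "memory_info"]):
                name = (proc.info["name"] or "").lower()
                if name in targets and proc.info["memory_info"]:
                    rss_mb = proc.info["memory_info"].rss / 1024**2
                    print(f"{name}: {rss_mb:.0f} MB resident")
            ```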