• horse@lemmy.world · 5 months ago

    There is exactly one reason why they do this: So they can charge you $200 to upgrade it to 16GB and in doing so make the listed price of the device look $200 cheaper than it actually is. Or sometimes $400 if it’s a model where the base model comes with a 256GB SSD (the upgrade to 512GB, the minimum I’d ever recommend, is also $200).

    The prices Apple charges for storage and RAM are plain offensive. And I say that as someone who enjoys using their stuff.
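    The sticker-price math above can be sketched out. The base price below is an assumed figure for illustration; only the $200 upgrade costs come from the comment:

```python
# Hypothetical sticker-price math. BASE_PRICE is an assumed figure;
# the $200 RAM and $200 SSD upgrade costs are the ones quoted above.
BASE_PRICE = 1599   # advertised price with 8GB RAM / 256GB SSD (assumption)
RAM_UPGRADE = 200   # 8GB -> 16GB
SSD_UPGRADE = 200   # 256GB -> 512GB

def realistic_price(base, upgrades):
    """Advertised price plus the practically necessary upgrades."""
    return base + sum(upgrades)

total = realistic_price(BASE_PRICE, [RAM_UPGRADE, SSD_UPGRADE])
print(total - BASE_PRICE)  # 400: the listed price understates by $400
```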

    • Jesus_666@lemmy.world · 5 months ago

      That’s why I dropped them when my mid-2013 MBP got a bit long in the tooth. Mac OS X, I mean OS X, I mean macOS is a nice enough OS but it’s not worth the extortionate prices for hardware that’s locked down even by ultralight laptop standards. Not even the impressive energy efficiency can save the value proposition for me.

      Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.

      • ebc@lemmy.ca · 5 months ago

        Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.

        Typing this from an M2 Max MacBook Pro with 32GB, and honestly, this thing puts the “Pro” back in the MBP. It’s insanely powerful; I rarely have to wait for it to compile code, transcode video, or run AI stuff. And it does all of that while sipping battery, without even breaking a sweat. Yes, it’s pretty thin, but it’s by no means underpowered. Apple really is onto something with their M* lineup.

        But yeah, selling “Pro” laptops with 8GB in 2024 is very stupid.

  • BilboBargains@lemmy.world · 5 months ago

    As engineers, we should never insert proprietary interfaces into our designs. We shouldn’t obfuscate the design.

    The motivation for these toxic practices comes from the business side because it’s profitable. These people won’t share the profits with you because they are psychopaths. Ultimately we are making more waste when electronics cannot be upgraded, maintained and repaired. It’s bad for people and it’s bad for the environment.

    • TheGrandNagus@lemmy.world · 5 months ago (edited)

      So much stuff in both the hardware and software world really annoys me and makes me think our future is shit the more I think about it.

      Things could be so much better. Pretty much everything could be open and standardised, yet it isn’t.

      Software can be made in a way that isn’t user-hostile, but that’s not the way of things. Hardware could be repairable and open, without OEMs having to navigate a minefield of IP and patents, much of which shouldn’t have been granted in the first place, or users having no ability to repair or upgrade their devices.

      It’s all so tiresome.

      • rottingleaf@lemmy.zip · 5 months ago

        I think Napoleon said something similar to “the army is commanded by me and the sergeants”?

        Well, that’s no longer true today. All this connectivity and processing power, however inefficiently it may be used, allows the world to be centralized more than it ever could be before. No need to consider what the sergeants think.

        (Which also means no Napoleons, cause much more average, grey, unskilled and generally unpleasant and uninteresting people are there now.)

        It’s about power and it happened in the last 15 years.

        I think it’s a political tendency, very intentional for those making decisions, not a “market failure” and other smartassery. It comes down to elites making laws. I feel they are more similar to Goering than to Hitler all over the world today.

        This post may seem nuts, but our daily lives significantly depend on things more complex and centralized in supply chains and expertise than nukes and spaceships.

        We don’t need desktop computers which can’t be fully made in, say, Italy, or at least in a few European countries taken together. Yes, this would mean kinda going back to late 90s at best in terms of computing power per PC, but we waste so much of it on useless things that our devices do less now than then.

        We trade a lot of unseen security for comfort.

  • Blackmist@feddit.uk · 5 months ago

    8GB RAM is what my phone has.

    Having that in a laptop shows what they think of people buying their kit. They think you’re only buying it so you can type easier on Facebook.

    • macrocephalic@lemmy.world · 5 months ago

      My phone was manufactured in 2022, cost under USD 250, and has 8GB of RAM. New phones generally come with 12GB or more.

        • KillingTimeItself@lemmy.dbzer0.com · 5 months ago

          nothing that requires 8GB of RAM lol.

          I’ve played the entirety of Java Minecraft on an old ThinkPad with 4GB of RAM. It didn’t crash (I don’t use swap).

          There literally shouldn’t be anything capable of using that much memory.
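          A claim like this is easy to check on the machine in question (a Linux-only sketch, reading the kernel’s own accounting):

```python
# Linux-only sketch: read RAM figures straight from /proc/meminfo,
# the same budget a 4GB ThinkPad has to live within.
def meminfo_kb(field):
    """Return a /proc/meminfo field (e.g. 'MemTotal') in kilobytes."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1])
    raise KeyError(field)

total_mib = meminfo_kb("MemTotal") // 1024
avail_mib = meminfo_kb("MemAvailable") // 1024
print(f"{avail_mib} of {total_mib} MiB available")
```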

          • IthronMorn@sh.itjust.works · 5 months ago

            What about running a chrooted nix install and using VNC to connect to it, while web browsing and playing a background video? Just because you don’t use your RAM doesn’t mean others don’t. And no, I don’t use all my RAM, but a little overhead is nice.

            • KillingTimeItself@lemmy.dbzer0.com · 5 months ago

              on a phone? I mean, I suppose you could do that, but VNC is not a very slick tool for anything other than, well, remote access. The latency and speed over Wi-Fi would be a significant problem. I suppose you could stream from your phone to your TV, but again, most TVs that exist today are smart TVs, so that’s literally a non-issue.

              my example here was using a computer rather than a phone, to show that even desktop computing tasks don’t really use all that much RAM.

              • IthronMorn@sh.itjust.works · 5 months ago

                Well, then by that logic, since desktop computing tasks don’t really use all that ram: we shouldn’t need more than 8GB in a desktop ever. Yes, my example was a tad extreme, vnc-ing into your own VM on your phone, but my point was rather phones are becoming capable and replacing traditional computers more and more. A more realistic example is when I was using Samsung Dex the other day I had 80ish chrome tabs open, a video chat, and a terminal ssh’d into my computer fixing it. I liked the overhead of ram I had above me. Was I even close to 12GB? No. But it gave me room if I wanted another background program or had to spin something up quickly without disrupting my flow or lagging out/crashing.

                • KillingTimeItself@lemmy.dbzer0.com · 5 months ago

                  Well, then by that logic, since desktop computing tasks don’t really use all that ram: we shouldn’t need more than 8GB in a desktop ever.

                  if this is the logic we’re using, then we shouldn’t have phones at all. Since clearly they do nothing more than a computer. Or we shouldn’t have desktops/laptops at all. Because clearly they do nothing more than a phone.

                  I understand that phones are more capable, my point is that they have no reason to be more capable. 99% of what you do on a phone is going to be the same whether you spend 200 dollars on it, or 2000.

          • greedytacothief@lemmy.world · 5 months ago

            Is this bait? Because, like, you could be rendering, simulating, or running virtual machines. Lots of things that aren’t web browsers also eat RAM.

              • dustyData@lemmy.world · 5 months ago (edited)

                My man, have you been to selfhosted? People are using smartphones for all kinds of crazy stuff. They are basically mini ARM computers, particularly the flagships: they can do many things like editing video and rendering digital drawings, and after their use life ends they can host AdGuard, do torrent-to-NAS, or host Nextcloud. You name it.

                • AdrianTheFrog@lemmy.world · 5 months ago

                  It sounds a lot more cost effective to get a used mini-pc than a flagship phone for any sort of server stuff.

                • woelkchen@lemmy.world · 5 months ago

                  it’s not like most people are chronically browsing the web on their phones.

                  Yes, they do.

            • AdrianTheFrog@lemmy.world · 5 months ago

              you could be rendering, simulating, running virtual machines

              On a phone? I guess you could, although 4gb is probably enough for any video game that any amount of people use.

              • woelkchen@lemmy.world · 5 months ago

                People use phone apps for photo and video editing these days. The common TikTok kid out there doesn’t use Adobe Premiere on a desktop workstation.

                Phone apps often are desktop applications with a specialized GUI these days.

                • KillingTimeItself@lemmy.dbzer0.com · 5 months ago (edited)

                  i mean, yeah, but even then those aren’t significant filters. And what makes you think TikTok isn’t running a render farm somewhere in China to collect shit tons of data? They’re already collecting the data, so they might as well provide a rendering service to make the UI nicer, but I don’t use TikTok, so don’t quote me on it.

                  Those are also all built into TikTok, and I’m pretty sure TikTok doesn’t require 8GB of RAM to open.

  • Veraxus@lemmy.world · 5 months ago

    My basic web dev Docker suite uses about 13GB just on its own, which - assuming you were on 16GB (double Apple’s minimum) - wouldn’t leave much for things like browser tabs, which also eat memory for breakfast.

    A fast swap is not an argument to short-change on RAM, especially since SSDs have a shorter lifespan than RAM modules. 16GB remains the absolute bare minimum for modern computing, and Apple is making weak, ridiculous excuses to pocket just a few extra bucks per MacBook.
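    The headroom squeeze described above is easy to put into numbers. The 13GB Docker figure is from the comment; the browser figure is an assumption:

```python
# Back-of-the-envelope headroom on a 16GB machine. The Docker figure is
# quoted above; the browser figure is an assumed modest session.
TOTAL_GB = 16.0
docker_suite_gb = 13.0   # quoted above
browser_gb = 2.5         # assumption: a modest pile of tabs

headroom_gb = TOTAL_GB - docker_suite_gb - browser_gb
print(f"{headroom_gb:.1f} GB left")  # 0.5 GB left -- straight into swap
```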

  • sugar_in_your_tea@sh.itjust.works · 5 months ago

    Well yeah, they’re enough to meet the minimum use cases so they can upsell most people on expensive RAM upgrades.

    That’s why I don’t buy laptops with soldered RAM. That’s getting harder and harder these days, but my needs for a laptop have also gone down. If they solder the RAM, there’s nothing you can (realistically) do if you need more later, so you’ll pay extra up front and they can upcharge a lot. If it’s not soldered, you have a decent option to buy RAM afterward, so there’s less value in upselling.

    So screw you Apple, I’m not buying your products until they’re more repair friendly.

    • akilou@sh.itjust.works · 5 months ago

      I had an extra stick of RAM available the other day, so I went to open my wife’s Lenovo to see if it’d take it, and the damn thing is screwed shut with the smallest Torx screws I’ve ever seen, smaller than any bit I have. I was so annoyed.

      • tal@lemmy.today · 5 months ago

        smallest torx screws I’ve ever seen

        Torx is legitimately useful for small screws, because it’s more resistant to stripping than Phillips.

        Now, if they start using Torx security bits or some oddball shapes, then they’re just being obnoxious. But there are not-trying-to-obstruct-the-customer reasons not to use Phillips.

  • Alien Nathan Edward@lemm.ee · 5 months ago

    Tim Apple be like “We’ve tried charging more money. Have we tried charging more money and delivering less stuff in exchange?”

    • goatman360@lemmy.world · 5 months ago

      Yes, they do, constantly. Yet people still keep buying. I hate that I have to use Apple for my job because the software and interface are exclusive to it.

      • Alien Nathan Edward@lemm.ee · 5 months ago

        I really like my macbook for dev work, and I think that now that macos is essentially a linux distro it’s quite nice, but it’s not that much better than the free distros and it’s getting worse while they get better. Right now the only thing keeping me on a mac at work is that they gave it to me and the only thing keeping me on a mac at home is that it’s already paid for.

        • KillingTimeItself@lemmy.dbzer0.com · 5 months ago

          you wanna expand on why you think it’s basically a Linux distro? Last I heard, macOS was more closely based on BSD than on Linux, and that was ages ago. Unless they rewrote it without my knowledge, it really shouldn’t be anything like either of the two.

  • mightyfoolish@lemmy.world · 5 months ago

    I get that upgrades help the bottom line, but considering that 8GB of RAM chokes the silicon they’re allegedly so proud of… it seems like a slap in the face to their own engineers (and the customer as well, but that is not my point).

  • anhydrous@lemmy.world · 5 months ago

    My X220 and T520 each have 16GB. The designed max was actually “only” 8GB, but it turns out 16 GB actually works. I replaced the RAM modules myself without asking Lenovo for permission. Those models came out in 2011.

  • phoenixz@lemmy.ca · 5 months ago

    Granted, I’m a developer and my dev IDE already uses a good 10+GB, and I have probably hundreds of tabs and windows open over 6 desktops… But I’ve got 64GB, I’m considering upgrading to 128, and these clowns think 8 is okay today? My development laptop of like 10 years ago had 8GB.

    • datelmd5sum@lemmy.world · 5 months ago

      I have 16GB and I have to run the shit I dev on local k8s. Sometimes I have to close Teams and my browser just to free up enough RAM.

      • phoenixz@lemmy.ca · 5 months ago

        Buy more memory, if you have the financial means to do so. If not then I’m sorry you’re in that situation

  • mechoman444@lemmy.world · 5 months ago

    I mean, it makes sense. The vast majority of people buying Apple computers are loyalists or people who simply need an internet/word-processing machine.

    And if you want to develop on Apple, then you have to pay a massive premium for their higher-end hardware.

    • AdrianTheFrog@lemmy.world · 5 months ago

      Their CPUs are actually really good now, when the apps are actually optimized for them. Especially in single-core, they are very competitive with top Intel or AMD chips while being way more power efficient.

      For example, in Geekbench 5.1 single-core, the M2 Max gets 1967 points (85%) compared to 2311 points for the 7950X3D and 2369 for the 14900K. The M2 Max (12 cores (8P + 4E), 12 threads) can draw a maximum of around 36 watts, while the 7950X3D (16 cores, 32 threads) can draw around 250 watts and the 14900K (24 cores (8P + 16E), 32 threads) can draw around 350 watts.

      Apple’s GPUs are definitely lacking in terms of performance, though.
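      Crudely dividing the quoted scores by the quoted peak power makes the efficiency gap obvious (rough on purpose: single-core runs don’t draw peak package power):

```python
# Rough points-per-watt from the figures quoted above. Treat it as an
# illustration of the gap, not a measurement: single-core benchmark runs
# don't actually draw peak package power.
chips = {
    "M2 Max":  {"score": 1967, "watts": 36},
    "7950X3D": {"score": 2311, "watts": 250},
    "14900K":  {"score": 2369, "watts": 350},
}

for name, c in chips.items():
    print(f"{name}: {c['score'] / c['watts']:.1f} points per peak watt")
```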

  • kamen@lemmy.world · 5 months ago

    Yeah, sure. Even if what they say about the OS’s resource usage is true, it’s only a fraction of the total usage. A lot of multiplatform software will use the same resources regardless of the OS. Many apps eat RAM for breakfast, no matter whether it’s content creation or software development. Heck, even smartphones these days have this much RAM or more.

    I won’t argue, I just won’t buy an Apple product in the near future or probably ever at all.

    • KillingTimeItself@lemmy.dbzer0.com · 5 months ago

      buys [insert price] laptop, top of the line, flagship, custom silicon, built ground up to be purpose specific.

      Opens final cut pro: crashes

      ok…

      • Retrograde@lemmy.world · 5 months ago

        Especially paired with Apple’s 128GB integrated, non-replaceable storage. Whoops, you installed all of Microsoft Office? Looks like you have no room to save any documents :(

        • KillingTimeItself@lemmy.dbzer0.com · 5 months ago

          ah yes, we can’t forget the proprietary, non-controller-based drives that use the M.2 form factor but aren’t actually NVMe drives; they’re just flash.

            • KillingTimeItself@lemmy.dbzer0.com · 5 months ago (edited)

              it’s “NVMe” only in the sense that it’s non-volatile flash, probably even higher-quality flash than most existing NVMe SSDs out there today.

              The thing is that it’s literally just the flash: a card with an M.2 pinout that fits into an M.2 slot, but without an onboard storage controller or any standardized method of communication, even though one already exists. It’s literally a proprietary, non-standard SSD in a standard form factor.

              The controller is integrated onto the SoC die itself; there is no storage controller on the storage itself.