Since these keep floating up, I'm already exhausted with this specific rebuttal to Moorhead's review:
> As pointed out in John Gruber’s hard takedown of Moorhead’s piece, most of the problems he encountered were software problems, not M1 hardware issues.
Moorhead's allegiances and work are relevant biases, as are Gruber's. But Moorhead's review wasn't of the M1 processor or architecture as a technology. It was a review of the 13" M1 MacBook Pro as an end-user device.
Software problems _are_ worth raising in a review of who should buy an M1 MacBook, and why someone shouldn't.
Because of the architecture change, the M1 has to come up — to omit its role in requiring Rosetta would be lying by omission. It's not an indictment of the technology, and shouldn't be controversial, for an end-user review to say the first OS revision targeting the first generation of a piece of hardware maybe isn't ready for every task thrown at it, especially those that rely on the newest of the new software components. It's why my workplace isn't even mostly on Catalina yet, much less Big Sur.
For the love of whatever stupid internet points people are trying to win, let Moorhead's review go. It's a contributor report on Forbes, which makes it only slightly more relevant than a Reddit post. Folks can absolutely find better things to squabble over about these computers.
> Moorhead's allegiances and work are relevant biases, as are Gruber's. But Moorhead's review wasn't of the M1 processor or architecture as a technology. It was a review of the 13" M1 MacBook Pro as an end-user device.
Moorhead's review literally contains this sentence:
> The new M1 processor is impressive, but far from perfect- it has many warts, that nearly nobody is discussing.
He then proceeds to describe not warts of the M1 as a processor, but rather warts of the 3rd party software ecosystem. I think overall the review is fine and not in bad faith, but it's easy to see how people are latching on to that sentence. If he were honest about making it a review of the overall product experience, rather than the silicon, he wouldn't have needed to write that sentence.
> rather warts of the 3rd party software ecosystem.
which is the result of the change in processor. There is, of course, an undertone of bias in the review against the M1.
It's like someone reviewing a Tesla and complaining that it's difficult to charge because there aren't enough chargers on the road network and gas stations don't cater to electric vehicles. A valid complaint about the Tesla, but not caused directly by the quality/capability of the car itself.
Yeah, the question is always: who is this review for? For my mom that distinction is irrelevant and can be ignored. For myself, it's crucial. I hate seeing CPU benchmarks where they don't first build the compiler targeting that u-arch, then use it to build Firefox, and then test the resulting binary, because that tells you the actual capabilities of the chip, not just how it performs on existing binaries that will most likely be updated within 6 months anyway.
A reviewer cannot in good faith separate M1 silicon from its end-user product experience, because it is not a standalone product. M1 is not available off the shelf. It is only available in a couple of small laptops and the Mac Mini.
It's easy to rewrite that sentence to make that separation clear, and he doesn't. Gruber goes into detail about the author's history of being wrong and misleading in the past regarding Apple's processors, so in that context I think it's fair to read that part of the review as bad faith.
Moorhead isn't asking us to take his word for it. He used the device for a few days and shared his review as a counterpoint explicitly intended to balance against the pro-M1 hype. I thought it was a pretty nice assessment because Moorhead focused on product usability instead of raw benchmarks, and I used his review as part of the process of informing my recent decision to buy an Intel MBP13 rather than an M1 model.
It seems obvious that Moorhead's opinion seven years ago about 64-bit being unnecessary in mobile was incorrect, mainly because today's mainstream phones ship with more RAM than 32-bit could address, but that doesn't mean every word Moorhead types is now ritually unclean. He was wrong about something; it hurts his credibility but doesn't banish him from the conversation.
Besides, unless we think Moorhead doctored his screenshots, it's pretty hard to argue that his review is nonfactual in its entirety.
More specifically to your point, I don't see why there has to be a separation between hardware quality and software experience when the user (A) can't buy the hardware without paying for the OS too, and (B) can't use the hardware without using the OS too.
If you just bought a new Intel MBP then you got a horrible deal by comparison. I also didn't say the review was nonfactual, just that it contains an inaccurate framing that conflates the M1 as a processor with the overall product. It's worth noting, though, that many others' experiences with it, and Rosetta 2 specifically, were quite different from Moorhead's, so unless you're using the specific set of software Moorhead mentioned in his review, it's very possible you would have been better served by an M1, even today.
“But Moorhead's review wasn't of the M1 processor or architecture as a technology. It was a review of the 13" M1 MacBook Pro as an end-user device.”
Possibly, but if it was, it shouldn’t have the sentence “The new M1 processor is impressive, but far from perfect- it has many warts, that nearly nobody is discussing” in its second paragraph.
Take that entire second paragraph away, and I don’t think we would have this discussion now. I thought it was bad writing, but am not sure anymore. It could also be a very good attempt at getting lots of page views.
The pdf Jean-Louis Gassée links to in the 2nd paragraph is an excellent source for the evolution of Apple silicon up until the A12X:
> An Hungarian researcher by the name of Dezsö Sima offers an exhaustive history of Apple processors [1] that shows the transition to homegrown cores clearly started with the A6 device.
Extrapolate what a hypothetical 4+4 big.LITTLE A14X with 8 GPU cores would look like compared to the 2+4 A14 with 4 GPU cores and you don't need anything other than the progress of past Apple silicon SoCs to predict the characteristics of the M1.
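That extrapolation is just linear core-count scaling, and a toy sketch shows its shape. The core counts come from the comment above; the throughput figures are made-up normalized placeholders, not measured scores:

```python
# Core counts from the comment: A14 is 2 performance + 4 efficiency cores
# with a 4-core GPU; the hypothetical 4+4 "A14X"/M1 has 8 GPU cores.
a14_perf_cores, a14_gpu_cores = 2, 4
m1_perf_cores,  m1_gpu_cores  = 4, 8

# Made-up normalized throughput figures, NOT measured benchmark scores:
a14_multicore = 4.0
a14_gpu       = 1.0

# Naive linear scaling by core count (ignores efficiency-core
# contribution, memory bandwidth, and thermal headroom):
m1_multicore_est = a14_multicore * m1_perf_cores / a14_perf_cores
m1_gpu_est       = a14_gpu * m1_gpu_cores / a14_gpu_cores
print(m1_multicore_est, m1_gpu_est)   # → 8.0 2.0
```

Even this naive model lands in the right ballpark relative to past A-series generational gains, which is the comment's point.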
Apple Silicon is outpacing the competition on all fronts. Qualcomm, Samsung, Intel, and AMD are not keeping up. Huawei has been knocked out of the competition, mostly by geopolitics as far as I can tell. The only question was the timing of the MacOS pivot. The A12X was good enough for a MacBook Air but a compromise for a MacBook Pro. The M1 is a no-compromise SoC for MacOS and the only question is the execution of the software transition; the focus of Gruber and Moorhead's spat.
I think the reason it keeps coming up is that it is the most prominent bad review, but due to Moorhead’s allegiances, poor calls in the past, and eclectic selection of software, his negative review comes across as being in bad faith.
Yes, but he was also the only one who had the guts to stand up against Apple in the Apple vs Qualcomm Trial. If I had believed everything the media and Tim Cook portrayed I would have been completely on Apple's side.
It's part of the Forbes contributor network. Bad faith is kind of assumed.
edit: Downvote if you want, I'm not kidding. Disingenuous hot takes without much connection to reality are a feature over at Forbes when it comes to tech, not a bug. They're literally traffic mercenaries over there. I'm sure Moorhead is smarting maybe a little from the internet anger, but also not too upset to collect a nice bonus before Christmas. I'm sure another Forbes contributor will reprise the role of shit-stirrer-in-chief for the inevitable M2 as well.
> For the love of whatever stupid internet points people are trying to win, let Moorhead's review go.
Do you not think it's the least bit concerning that Moorhead has Intel, AMD and NVIDIA as his clients?
Do you really think he can write an unbiased piece about a competitor of theirs when they put food on his table?
(EDIT: For reference, I'm in the 4x4/travel world and write for a stack of magazines. Big car OEMs fly journalists all around the world, put them in expensive hotels, feed them 5 course meals and then have them review their new vehicles. I want you to guess what percentage of those reviews are absolutely glowing. )
I have only one issue with the M1 (I do have an M1 mini to test/play with): Apple has to get developers to actually make apps for it. Their M1 debut presentation was notable for how obscure the developers they had on display were: you had to use Google to find out who they were!
Catalina obsoleted a lot of apps that many companies did not bother to update, before or after, which does not bode well for Apple Silicon. The hope to hold out for is that the performance is so good that people really want the systems, and that brings out the developers.
Still, you can get acceptable Windows laptops with an SSD, etc., for half the price of an MBA, and that still means a lot. That you don't have to look twice at software to know if it runs is a big deal as well.
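One way to "look twice" programmatically: a Mach-O binary's header lists the CPU slices it ships, so a short parser can tell whether an app carries a native arm64 slice or would need Rosetta. This is a minimal sketch; the magic numbers are the published values from Apple's `<mach-o/loader.h>`/`<mach-o/fat.h>`, and the demo bytes are synthetic rather than a real binary:

```python
import struct

# Mach-O magic numbers and CPU types (values from Apple's
# <mach-o/loader.h> and <mach-o/fat.h> headers)
FAT_MAGIC   = 0xCAFEBABE   # universal ("fat") binary, big-endian header
MH_MAGIC_64 = 0xFEEDFACF   # thin 64-bit Mach-O, little-endian on Apple hardware
CPU_ARM64   = 0x0100000C
CPU_X86_64  = 0x01000007
NAMES = {CPU_ARM64: "arm64", CPU_X86_64: "x86_64"}

def macho_archs(data: bytes):
    """List the CPU slices a Mach-O binary header declares."""
    (magic,) = struct.unpack_from(">I", data, 0)
    if magic == FAT_MAGIC:
        # fat header: nfat_arch count, then 20-byte fat_arch records
        (nfat,) = struct.unpack_from(">I", data, 4)
        return [NAMES.get(struct.unpack_from(">I", data, 8 + 20 * i)[0],
                          "unknown") for i in range(nfat)]
    magic_le, cputype = struct.unpack_from("<II", data, 0)
    if magic_le == MH_MAGIC_64:
        return [NAMES.get(cputype, "unknown")]
    return []

# Synthetic universal-binary header with arm64 + x86_64 slices:
sample = (struct.pack(">II", FAT_MAGIC, 2)
          + struct.pack(">IIIII", CPU_ARM64, 0, 0, 0, 0)
          + struct.pack(">IIIII", CPU_X86_64, 0, 0, 0, 0))
print(macho_archs(sample))   # → ['arm64', 'x86_64']
```

On a real Mac the same check is what `lipo -archs /path/to/binary` reports; "x86_64 only" is the case that falls back to Rosetta 2.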
If the M1 proves anything at all, it's that ARM64 can and will be a performance chip, not just a chip strictly intended for low-power devices. I think that's what the M1 will finally signal to all the other vendors: even though many have been saying this for a long time, it's proof in the wild that you can build desktop-class ARM chips.
My other thought is I wonder if this will open up the idea of vendors supporting specific co-processors, analogous to how Apple has things like the T1: offloading some work to a dedicated chip for ML, etc.
This is exciting if more vendors do this. I predict the first one to do it in an open way will win the day long term, as it will be attractive to all upstream manufacturers like Dell, etc.
I don’t think we know that for sure yet. Firstly, Apple has access to better chip technology. That gives them extra room and extra speed for free (actually, for a lot of dollars).
Secondly, AFAIK, nobody has attempted something similar with a different CPU. It is possible (but IMHO not likely) that a similar approach to the M1's (optimize a system, not individual chips), but with x64 (or x86, or MIPS, or RISC-V), would outperform the ARM version.
“Desktop-class” is a moving target, so if that were the case, the M1 wouldn’t be called desktop class anymore.
>Later, with the A11, previously licensed PowerVR GPUs were also replaced with in-house designs.
They are, as of the A14, and likely in the A15 and A16, still licensed from IMG/PowerVR. [1] [2] Arguing the GPU is not from PowerVR because they have their own variation of the design is like arguing Apple doesn't use ARM because they design their own CPU. They are still using tile-based rendering, patented by IMG.
And for much of the M1 discussion, people want to trust what they want to trust, and believe what they want to believe. Even with science and tech.
[1] “London, UK; 2nd January 2020 – Imagination Technologies (“Imagination”) announces that it has replaced the multi-year, multi-use license agreement with Apple, first announced on February 6, 2014, with a new multi-year license agreement under which Apple has access to a wider range of Imagination’s intellectual property in exchange for license fees.”
[2] In 2019 IMG's new CEO also went on record and said Apple has never not been an IMG client. Although Google results these days are generally littered with crap, I can no longer find that piece of information.
Nvidia and AMD have both been using tile-based rasterization, which is different from tile-based deferred rendering.
Correct me if I am wrong, as I don't follow GPUs that closely any more. (Because I believe we have hit a point where the processing node matters a lot more than uArch on GPUs, as their performance scales very well with transistor count.)
I have seen so many articles on HN in front page about M1 over the past few weeks. Like a curious child I read all of them because somewhere in me I want to believe there exists a possibility of a giant leap for Desktop/Laptop. Having tried it myself, I should say this is going to be remembered just like the iPhone in 2007.
I don't think it will be remembered like the iPhone. Probably more like the original Macbook Air.
The original Air created the class of ultrabooks which made the bulky Windows laptops at the time obsolete for average users.
The new M1 Air has created a class of laptops with ultrabook bodies that remain cool to the touch under load, without sacrificing much in performance, and with 16+ hours of battery life.
Like the original Air the new one has basically made every ~$1,000 Windows laptop obsolete. Even really nice ones like the Dell XPS 13 have worse battery life and performance while reaching CPU temperatures as high as 100 °C.
> Like the original Air the new one has basically made every ~$1,000 Windows laptop obsolete.
This price qualification is important and insightful, and has been ignored by all the breathless prose about how it will replace all (or no) PCs.
The real historic lesson that applies here: Apple by and large aims to serve the highest-margin segments and is happy to ignore other segments, even higher-revenue ones.*
M1 will help them dominate a couple of other segments; everyone else will fight over the scraps.
By contrast, a Dell or LG tries to serve all but a tiny sliver of the market, which is more expensive and difficult.
* sometimes they have halo product like the Mac Pro that might better be thought of as part of the marketing budget, though I’m sure they’re pretty high margin too.
I think the M1 is more analogous to the release of the 64-bit A7 CPU. Which doesn’t diminish its importance at all. The A7 was the inflection point that has had the competition scrambling [unsuccessfully] to catch up to Apple ever since.
It's also worth remembering that while it may have shifted the smartphone market (and certainly Apple tablets), competitors still exist, improving over time, remaining relevant. Would someone argue that Android is simply not an option as a smartphone for a very large portion of the smartphone market? No.
And likewise, competitors to Apple's laptops and workstations are very much going to continue to be relevant, and to improve and compete. The M1 has shown that power, battery life, and minimal heat production are more possible than ever. That doesn't mean end users will all suddenly drop everything and stop buying products from the competition. They'll just raise their expectations for future revisions.
> It's also worth remembering that while it may have shifted the smartphone market (and certainly Apple tablets), competitors still exist, improving over time, remaining relevant.
Apple has captured 66% [1] of the profit in the cell phone industry. Other than Samsung in the US and other manufacturers in Asia, Apple has made the other companies a lot less relevant. When the iPhone shipped in 2007, Windows Phone, Blackberry, Nokia, Palm, Motorola and others were the giants in the industry; I'd argue none of what's left of these companies is relevant today.
The same thing will happen with laptops; we already caught a glimpse of this in Japan, where Apple became the #1 selling desktop when the M1 Mac mini started shipping [2].
As the price/performance and performance/watt becomes more obvious, Apple will make inroads with new customers and markets.
I certainly don't think I'd want to be a competitor to Apple. (I'm a consumer, instead.) Of course, the landscape has changed, and the iPhone absolutely disrupted the industry. These days, I believe the only big Android options are Google, Samsung, and a handful of smaller players such as (the new) Motorola, LG, OnePlus, etc. As a consumer, I don't really care (for now) who profits or doesn't - I'm still deciding between all these phones. (Others only consider Apple or not Apple.)
The discussion above was specifically about how the advance in performance of the A7[0] in 2013 changed things. Honestly I don't know the numbers for market or profit share in 2013 to compare to now, but for consumers the decisions are not wildly different, except for someone specifically comparing phones on CPU performance alone, and ignoring the other differences. Of course, some will also be affected by Apple's marketing of "the fastest smartphones" and that helps with selling phones.
To apply this to laptops is, in my opinion, much trickier. Now I can't speak for all consumers considering purchasing a laptop, but as technical as I am, performance/watt is not my number one consideration. It kind of is, indirectly, an important one. I don't want a hot, loud laptop, and I want some battery life. But things like playing demanding games, or compiling code with as many cores as I can throw at it are more important to me. That and price, being able to upgrade RAM and SSDs when the need arises, etc.
So I have a $950 laptop with a dedicated GPU, 8 cores, 16GB RAM, 1TB NVME SSD, and I can add RAM and another NVME SSD when the need arises. Of course, I only get about 4 hours of battery life, and I'd love more, but it isn't going to change my life much either way (it's freezing outside, I am almost always within range of an outlet!), and I can't give up all the other things that this laptop has that an M1 Macbook lacks, even if the performance/watt is much better.
I have an M1 Air 8-core 16GB RAM model and yes, yes it will. I said something similar when it launched and people dismissed it. Article upon article online of PC die-hards dismissing it, Apple fanboys praising it.
It’s a leap for sure. It’s a really really good chip. We’ll look back from 2027 and see.
The iPhone was a basically new form factor that over the course of the next 13 years (after trickling out to the lower priced Android phones), became the primary way that billions of people interact with technology.
The M1 is a really good new processor. We've been doing really good new processors for the last couple decades. It's great to have more competition in the CPU space, which was seeming to stagnate, but no matter how big an iteration, it's still just an iteration; it won't be remembered like coming up with a new type of device.
True, it’s not a new form factor. It doesn’t have that shiny “new thing” feeling. My Air looks like every other 2018+ air. What’s different is what you don’t see. Being able to code all day on battery and only using 18%. The CPU-core power to watt ratio is just unbelievable. I get it though, without the new form factor it lacks that defining moment.
The moment came for me when I realized I wasn’t plugged in all day and was doing some pretty heavy compilation and code (aarch64, so gotta build from source).
It’s a smashing breakthrough in pretty much all areas other than form factor. The changes are subtle but you then shake your head and remember you’re on a MacBook Air...
If the battery life is the main point, why were thinkpads with slice batteries and powerbridge ("up to 30 hours" in 2018) not more successful?
I think it is more about the fact that apple successfully makes things "mainstream". For instance, the air form factor had been done before by Sony (and those even had hires screens), but only got popular after the air.
Extreme battery life + extreme performance is the main point!
The Thinkpads with extra battery weren't in the same "small and light" class as the Air and MBP 13. These "small and light" laptops with 20 hour video playback are faster than current workstation class laptops!
Yeah, I push mine hard and it barely uses battery. I never hear the fan (I'm on the M1 MBP 13) and it remains cool to the touch. I was skeptical about it until I noticed that it stayed this way for...basically 13 hours when pushing it really hard and nearly the 20 hours they advertise when using it for browsing/email/videos.
Yup, I've never experienced that before on any machine, Apple or otherwise. It's either you're tethered and burning your crotch or you're mobile and throttled. I haven't felt that yet with the M1 and my crotch stays cool.
It isn't just the "new thing" feeling, the smartphone form factor opened up new computing tasks that weren't really possible previously.
The M1 is a quantitative, not qualitative improvement. That's why it won't be as memorable.
The battery life is impressive, but it should be noted that you could have essentially the same experience with an older device by keeping the heavy computations on another machine -- remote desktop has always been a pain of course, but everybody loves SSH, right?
> The M1 is a quantitative, not qualitative improvement. That's why it won't be as memorable.
You're missing the forest for the trees. The quantitative improvements allow for advances in the qualitative experience. GP pointed out feeling impressed by coding all day and spending just 18% battery life. That's a qualitative experience. M1 (and chips like it that we'll see come up) will allow for a qualitative transformation like that. I remember when, if I had to get a powerful computer in the $1000 range, I'd either have to give up some power to get portability and battery life, or get a bulky powerful laptop with ~2-3 hours of battery life. With the M1 macs, you can get both, and that will set expectations that the rest of the market will have to follow if they want to survive. It's not as "revolutionary" as a new form factor, but it's a step towards a broader change in portable computing.
> It's not as "revolutionary" as a new form factor, but it's a step towards a broader change in portable computing.
Sure, the claim's been walked back from "as memorable as the iPhone" to "a step towards a broader change in portable computing." At this point I guess I agree with that level of importance.
So, as some context, I use Apple, Windows, and Linux systems all in good measure for work, but mostly Apple and Linux. I'm excited to see the M1 Air for all sorts of reasons. I'm excited to see ARM providing hard competition in this space, and excited to see the Air/ultrabook form factor get attention again. I'm also frustrated because I'm tired of seeing these sorts of developments be leveraged to push all sorts of problems with DRM and privacy. Ideally I'd like to see this same thing be offered but with linux or BSD, or at least under owner control.
Having said all that, anything pertaining to battery life I'm skeptical of. In the products I've used, the battery life always declines over time, and in my experience Apple products have been the worst. I love my Macbook Air, but similar dramatic improvements in battery life were touted at the time I bought mine, and I've not been impressed over the years. Sure, at first, the battery life was amazing, but then not too long after I obtained it (a few months or so) the battery life dramatically declined. My Dells, in contrast, started out with less impressive battery use, but declined much less, so after about a year, the battery life was about the same: the Apples just started out great and declined dramatically, and the Dells started out less great and declined less.
There are reasons to think this might be different, but so far the devices haven't been around long enough to really speak to the long-term battery life patterns. I'm hopeful and open-minded but also skeptical.
The iPhone was revolutionary because of the responsive capacitive touchscreen with onscreen keyboard and, shortly after, the App Store, which further popularized the iPhone. The M1 chip makes Apple's software run smoother and the battery efficiency is impressive, but I'm not sure it's something consumers will remember as an inflection point.
The iPhone literally changed everyone's life. History books will differentiate between the pre and post smartphone eras. This is nowhere close to a lifechanging product, although it's definitely a big leap.
I thought about this for a moment and realised it is probably correct, but not in a good way.
Previously we used durable, work-orientated smartphones with productive keyboard input. Then the iPhone arrived and popularised a flashy but fragile and less efficient form factor that became dominant.
Every time I type in a touchscreen phone I regret that change. It was a triumph of a consumption mindset over usability.
> It was a triumph of a consumption mindset over usability.
Touché. At the time the iPhone launched I had a Nokia N91. I later got an N95 and finally a N900, after which I've only used non-productive touchscreen devices.
Speaking of which, I've been following and drooling a bit after the Gemini PDA and its successors (Cosmo Communicator) but now with their latest Astro Slide I'm actually considering biting the bullet and getting one, thanks to the new keyboard mechanism which doesn't require you to open the device to see the screen: https://www.indiegogo.com/projects/astro-slide-5g-transforme...
Is that really the case? Apple's design of the CPU itself seems like a bigger unique factor than the fab. Doesn't TSMC make competitors' chips that are benchmarking much poorer than the M1?
Qualcomm got rid of their mobile ARM CPU designs a while back. Can the snapdragon cx / SQ2 series hope to compete using ARM reference designs, with Apple out so far ahead?
We will only know once AMD APUs on TSMC 5 nm come out. If you look at 7 nm options, the A12Z and the AMD Ryzen 7 4800U, the former has better single-thread performance, but the latter has much higher multicore.
Unfortunately, I could not find any data on TDP of A12Z under full load, so it is impossible to guesstimate at the moment which design is superior.
If you compare Snapdragon scores of the same TSMC node to their Apple equivalent, they come up wanting.
TSMC is a fantastic and innovative company, and they've done amazing things (in no small part thanks to Apple's massive investment in them). But Apple's chip design is superior to their competitors'.
Combine Apple's superior designs with first access to TSMC's premier nodes and you get some pretty special results, as we've been seeing for years with the A series chips, and which is finally getting the recognition it deserves with this M series debut.
Apple made it possible for TSMC to be able to do this. It's a tight partnership, and you don't understand the relationship between the two if you think TSMC's accomplishments come in isolation.
Your sentiment is better expressed as "I hope TSMC gets some credit for the role they've played."
Would recommend reading "Mobile Unleashed" by Daniel Nenni if you want to know more about ARM history in general.
> Appreciating the iPhone is largely done in hindsight.
I’m sorry, but that’s definitely not true. Yes, there was a vocal minority of folks who said the iPhone was nothing special, but the mass-market response was huge and it was heralded as a huge deal by the vast majority.
And to go back to my original point: there’s a difference between “this new invention allows you to do new things, but those things aren’t important” and “this new invention does not allow you to do new things”. Reaction to the iPhone was the former. My reaction to the M1 is the latter. It allows you to do the things you already do faster and cooler and it’s a big technological shift. But it isn’t going to change the way people use computers in the way the iPhone did.
> My reaction to the M1 is the latter. It allows you to do the things you already do faster and cooler and it’s a big technological shift. But it isn’t going to change the way people use computers in the way the iPhone did.
I think the point is that they have a single unified platform now (or soon). I think MS was right to have the Surface run windows and build in touch to Windows. But that never really took off.
Now, if I’m not mistaken, the M1 is giving you full performance for a work day running off a battery. And you can run iOS apps on Big Sur.
I think Apple will pull off what MS couldn’t. Single platform, multiple form factors, all capable of running the same software because the hardware is literally the same. Without just shoving everything into a web browser, because native apps are almost always better. And Apple has all the pieces of the puzzle: desktop, laptops, tablets, phones.
To be fair, if you remember 2007, the first generation was lacking in many areas. You couldn't send MMS messages. Receiving MMS was awkward and was done through an AT&T web site. You couldn't install apps. You couldn't take videos, only photos. It was slow, etc.
My first iPhone was a 3G, but I felt the iPhone really started taking off with the iPhone 4. It felt like a huge leap.
It won't reinvent UI, but it signaled the change to ARM and, by extension, RISC.
To a normal consumer, that's not a big deal (although 17 hour battery life is already pretty huge). To future PC hardware and server/backend infrastructure, that's a huge deal.
Oh, for sure. It’s going to be a big deal to us techies and represents a huge shift in terms of who holds the power in processing. (That said, the vast majority of PCs won’t be able to use it!)
But entire industries (e.g. ride hailing) were created in the aftermath of the iPhone’s launch. I just don’t think we’ll see that kind of change reflected in most people’s lives.
The big boost here is commodity ARM instances in the cloud. These instances are already more power-efficient and cheaper than their Intel counterparts, but developer machines used to be all Intel. Now that has changed.
It’s not the same, but this is significantly accelerating the end of x86, and the end of Intel (at least the end of the current dominance of both). Such shifts don’t come along that often.
Sure, but I didn’t mean that the whole world will switch to Apple. The world will mostly switch to commodity ARM chips and Intel's margins will crumble. Might take 5-10 years.
This is Intel’s Nokia moment, when they lost a dominant position to Samsung/Android. Apple was the catalyst there as well, but Apple did not directly kill Nokia. The industry shift Apple sparked killed them.
I disagree... laptops and desktops have been “dead” forever. The release cycles are boring and the computers even more so. This chip is a big change - Apple came out of the gate offering it in 3 form factors. That’s meaningful.
For the task worker populations at my employer, we only committed serious cash to upgrades when Windows 10 started cutting off support. The funny thing is in some ways the newer devices are slower than the old Haswell stuff. Cheaper laptops are lighter but have awful thermals.
Benchmarks show that the M1 has better performance in only a few areas, and underperforms in others, and that's with Apple controlling literally every aspect of the hardware and software stack.
It's great for an ARM system from a vertically oriented company, but it isn't sufficiently performant to be a significant threat to the competition. And that's before you consider all the non-technical crap Apple has thrown at its customers recently.
And of course, the big issue is that in order to use the M1 you must switch your entire stack to Apple. Considering the broad swaths of essential business software that is simply incompatible with Apple OS, that's not going to happen.
> Benchmarks show that the M1 has better performance in only a few areas, and underperforms in others, and that's with Apple controlling literally every aspect of the hardware and software stack.
I haven't seen a review where it has underperformed any other comparable device.
I've seen a lot of talk about how the Threadripper from AMD is faster… duh. Of course a 32- or 64-core processor that costs 2x-3x than a Mac mini and draws 280 watts is faster than an M1.
The M1 peaks at 20-24 watts and doesn't even require a fan in the MacBook Air.
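A rough performance-per-watt division makes that gap concrete. Only the wattages here come from this thread (280 W for the Threadripper, up to 24 W for the M1); the benchmark scores are hypothetical placeholders purely to show the shape of the comparison:

```python
# Perf/W sketch. Wattages as stated in the discussion above; the
# scores below are made-up placeholders, not real benchmark results.
def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

threadripper = perf_per_watt(score=25000, watts=280)  # hypothetical score
m1           = perf_per_watt(score=7500, watts=24)    # hypothetical score
print(f"Threadripper: {threadripper:.1f} pts/W   M1: {m1:.1f} pts/W")
```

Even granting the bigger chip a much higher absolute score, the efficiency ratio is where the M1's numbers stand out.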
Its graphics blow away all integrated graphics and are comparable to many discrete GPUs. On CPU benchmarks, it's faster than all but the latest Intel and AMD laptop CPUs, which are in laptops that run hotter, are louder and don't have 18-20 hour battery life doing real-world things.
x86 applications generally just work and often run faster on an M1 Mac than they did on Intel Macs.
Considering the broad swaths of essential business software that is simply incompatible with Apple OS, that's not going to happen.
Depends on what business you're in. If you're in the content creation business, you're good to go [1].
Microsoft Office is already in beta for running natively on M1 and so much other business software is web-based or Electron apps (like Slack) and those work fine.
In some of the benchmarks, the 15W Ryzen 7 4800U outperforms the Apple Silicon M1.
There, now you have seen a review where the M1 underperformed a comparable device.
(This is not to take away from the fact that the M1 is impressive, and is incredibly efficient, and much better for battery life. It's just a reminder that there are actual, real, worthwhile competitors.)
Also, a just-published database of gaming performance on the M1s confirms that the M1 is not the revolutionary chip that Apple claims it to be once you run it in uncontrolled (aka real-world) use cases.
The M1 struggles to run 4-year old games that modern AMD and Intel chips breeze through. While the M1 appears to have good burst capabilities, its ability to handle extended processing loads for anything that isn't hardcoded into the chip (aka, anything other than video processing) is woefully deficient. Whether that's a function of the CPU design, or poor cooling, or both, is not yet clear.
> Overall, Apple doesn’t just deliver a viable silicon alternative to AMD and Intel, but actually something that’s well outperforms them both in absolute performance as well as power efficiency.
Just to be clear: we want to ignore the benchmarks where AMD outperforms Apple, and then go on the wording of the summary to misconstrue reality.
Yes, in some benchmarks the M1 outperforms... in absolute performance. But if you only look at that wording, you miss all the cases where the M1's competitors outperform it... in absolute performance. That's important only because there seems to be a newly common belief among the faithful that "no CPUs compete with the M1 in any sense of the word." That's an inaccurate belief.
Remember the original statement I responded to:
> I haven't seen a review where it has underperformed any other comparable device.
Depends on what business you're in. If you're in the content creation business, you're good to go [1].
I am definitely going to differ on that. My last job (until this summer) was in Hollywood for a content creation company (live broadcast + streaming), and I worked very heavily with the content creation guys and the digital production team (as a result of working with them on R&D tax credit documentation, which means I not only saw every expense, I also documented the reasons for all of those expenses based on discussions with the teams). We switched off Apple software 4 years ago. They began the transition off Apple hardware 2 years ago. Apple was great for content creation a long time ago, but these days content creators use *nix-based systems if they need performance and Windows-based systems if they care about costs. If you suggest using Apple to the content creation guys these days, they'll laugh your ass out of the room and tell you to go back to film school.
These guys didn't choose to switch away from Mac because they wanted to complicate their lives and learn a new OS; they did so because the switchover was industry-wide and they'd already made the switch at their previous jobs. The studios got burned by Apple's lack of support for a decade, combined with the Mac's excessive costs, and switched to *nix and Windows (primarily *nix) starting during the Great Recession.
Generally, Windows is in the mix simply because a lot of the software that the (non-technical) artists use is stabler or easier to support on Windows than it is on *nix (Maya, 3DS, Premiere, etc.), and running these on Windows is a quarter of the cost, or less, of running the same software on Mac.
Maya, 3DS, Premiere, etc., and running these on Windows is a quarter of the cost, or less, of running the same software on Mac.
I'd love to see a study on this; the total cost of ownership has always been far less for the Mac than Windows, especially when you factor in the amount of support Windows requires compared to the Mac.
An M1 Mac mini starts at $699, so there's no excessive cost. As usual, Adobe doesn't have its apps ready for Apple's latest and greatest… but other vendors are on the case and it'll be just a matter of time before all major video apps are running natively on the M1.
Of course, Final Cut Pro (and the rest of Apple's suite) is good to go; DaVinci Resolve (editing 4K, 6K and 8K RAW video) is also looking pretty great: https://www.youtube.com/watch?v=HxH3RabNWfE
I'd love to see a study on this; the total cost of ownership has always been far less for the Mac than Windows, especially when you factor in the amount of support Windows requires compared to the Mac.
This is simply not true, based on real-world experience. (Again, I did R&D tax credits for 4 years for a media company; I had line-item access to expenses and the reasons for every expense.) Macs cost more than Windows PCs of the same or better performance, are more expensive to replace (since usually the entire device must be replaced), and generally cannot be upgraded. Moreover, the fact that Macs may last years longer is irrelevant to a business, since the "useful life" for computer equipment for financial purposes and actual use is 5 years or less (emphasis on the "or less"; we replaced machines at least every other year with newer equipment).
An M1 Mac mini starts at $699, so there's no excessive cost.
So what? An M1 Mac mini can't run any of the software that you would be running in a video production environment, let alone one requiring real-time editing capabilities for a live broadcast.
Of course, Final Cut Pro (and the rest of Apple's suite) is good to go; DaVinci Resolve (editing 4K, 6K and 8K RAW video) is also looking pretty great:
Final Cut Pro has not been industry standard since before I started at my old job. Apple did a great job of screwing the pooch on that one. As for DaVinci Resolve, it runs even better on Windows, especially since you can put as much RAM as you want, and multiple GPUs, and whatever cooling system you want, and make the machine literally run laps around the top-of-the-line Mac.
Your link says otherwise. These would make it completely unusable... I'm not sure what they were even testing...
> First of all, there is no video or sound driver, so we were stuck with a maximum resolution of 1024 x 768. The network drivers apparently don't totally register with Windows
Yes, compared to your prior Macbook I'm sure it was a revelation. A programmer friend loves his new M1, but he's also been an Apple user for decades.
As a Windows user, testing out my friend's Macbook Air M1 was an underwhelming experience. I've had bigger performance boosts from adding more RAM or switching out HDs to SSDs, and a large part of the M1's performance "gains" can be attributed to the increased RAM on the systems currently featuring the M1.
What are you even talking about? RAM is actually one of the few disappointing areas of these machines, capacity wise (though the RAM itself is fast and unified, so there's that).
Furthermore, one of the most stunning things about these devices is their long running battery life, which you'd hardly get much experience with in a quick test drive.
How did you compare a Windows system with a Mac to conclude it was "an underwhelming experience"? Were they similarly configured and running similar software?
> a large part of the M1's performance "gains" can be attributed to the increased RAM on the systems currently featuring the M1.
Not sure I understand your point. M1 systems have 8 or 16 GB of RAM. There is no increased RAM compared to non-M1 Macs.
I'll try to find the review in the morning if I remember/care enough, but at least one of the reviews noted that the tested system had more RAM than the slightly older Mac it was being compared to, and that the performance gains between the two systems generally mirrored those found in PCs with differing amounts of RAM.
For a counter argument, I don't think the M1 is going to disrupt anything. It's a great performer for sure, but it's locked to a proprietary and closed ecosystem controlled by a single entity.
It's a new direction for Apple in their laptop architecture, and a solid first step, but the raw performance is not up to the fastest from AMD and Intel, and it lacks the ability to upgrade any of its components. I get it, it's a laptop, but if you want to talk about performance let's put it out there: it's a fast laptop.
People already in the Apple ecosystem will love M1, but it lacks a compelling reason to switch to Apple. Apple already made the best laptops, the M1 keeps that crown. People that were satisfied buying Windows laptops and Chromebooks will continue to do so because the criteria they used when making a buying decision involved other factors than "I want the best".
I expect SoC designs to fracture the CPU market in the coming years and kill AMD and Intel, but unless M1 opens up to other designers and companies it's a tangent in the great CPU race.
For the price point, I don't know of any other laptop that can compete with my M1 MBP 13 on battery & performance & temps and sleekness of form factor. M1 got me all of these things in one package, and it is just the FIRST attempt by Apple at making their own laptop chips.
When the next one comes out, probably when they release their refreshed 16" machine, it'll continue to convert people who are looking for great performance/battery/temps without having to compromise between them. The M1 MBP 13 is my first Apple daily driver for these reasons.
It will disrupt. Intel and AMD chips aren’t on their first initial CPU release. Apple is with the M1. The future of CPU offerings from Apple looks very bright. An exciting time for computing, I’m looking forward to it.
It will disrupt. Intel and AMD chips aren’t on their first initial CPU release. Apple is with the M1.
Apple has been designing their own ARM-based SoC for more than 10 years now. They've been outperforming the ARM processors in Android phones and tablets the entire time.
They were first to ship 64-bit ARM processors in phones and tablets.
This is a strange argument, because the M1 is not Apple's first release. I believe that was the A5[0]
To be sure, the chart comparing Apple CPU performance over time in comparison to Intel is damning for Intel (and a reminder that this argument makes no sense.)
That doesn't mean it won't disrupt, but for it to disrupt, any last reason someone previously would not consider an Apple computer has to be removed.
* Absolute performance for high-end multiple CPU core workstations doing massive parallel processing
* AAA demanding games being played at 4K with ray-tracing and 60+ fps
* Freedom to repair and upgrade your own computer
* Absolutely no doubt that your software of choice will work as expected without troublesome workarounds
To be sure, lots of users don't care about anything on the list above, or for their rendering, gaming and software needs, they get what they need from an Apple computer. But the M1 does not change that.
Until about 3 years ago, Intel had the fastest laptop and desktop CPUs, and they were available regardless of your preferred OS. So few were choosing Windows or Linux merely because of hardware superiority. Now AMD is a better bet than Intel for Windows, and Apple Silicon is a better bet than Intel for MacOS.
And some people may have been on the fence before but are perfectly suited to switching to MacOS, and will happily do so to take advantage of the power and battery life of the M1.
M1 is not a first. It's a scaled up evolution of their mobile CPU offering that started all the way back in 2013. Their Bionic line present on iPads was already quoted as being faster than some laptops.
And in 2016 they were already making ARM CPUs with Big and Little cores on the same package just like the M1. [1]
For a counter argument, I don't think the M1 is going to disrupt anything. It's a great performer for sure, but it's locked to a proprietary and closed ecosystem controlled by a single entity.
You could have said (and many did) the same thing about the iPhone in 2007 and it clearly disrupted the phone industry. It was only available on AT&T for its first few years but people had no problem switching carriers. The major incumbents of the time—Microsoft, Nokia, Palm, RIM, Motorola—are no longer in the smartphone business or are out of business as we knew them.
The M1 MacBook Pro outperforms virtually all other laptops in its price class, can drive multiple (up to 6) HiDPI displays, and can run x86 Mac apps, ARM-native Mac apps and iOS/iPadOS apps. You can edit 8K video as if you were on a much more expensive device and the battery can last 20 hours. For many casual users (email, web, basic productivity apps), there are fewer reasons not to make the jump.
It's a new direction for Apple in their laptop architecture, and a solid first step, but the raw performance is not up to the fastest from AMD and Intel…
It's faster than all but the absolute latest from AMD and Intel, costs less and draws much less power. Those processors are only available in high-end laptops while the M1 is in entry-level products. None of them has the performance per watt or the battery life of the M1 Macs.
The mistake many HN folks make: with Apple, it's the complete product, not just one thing. Sure, the M1 is faster than 98% of laptops that shipped the previous 9 months (according to Apple's fine print) which is amazing for a first attempt. But when you consider the battery life, fit and finish of the design and the nice little extras (studio quality mics on the MacBook Pro), there's really nothing else that's objectively better.
People already in the Apple ecosystem will love M1, but it lacks a compelling reason to switch to Apple.
I've already seen many posts here on HN that people who either never owned a Mac or abandoned the Mac years ago have bought M1 laptops. Apple usually reports each quarter that about 50% of Mac buyers are new to the Mac; I don't expect that to change. And Apple had a record Mac quarter before the M1 shipped [1].
but unless M1 opens up to other designers and companies it's a tangent in the great CPU race.
It's not a tangent; it's a shot across the bow of not only Intel and AMD, but the entire PC industry. Neither Intel nor AMD can match Apple's performance per watt, and Dell, Acer, HP, etc. can't match Apple's design prowess and bundled software, services and support.
Because Apple designs the entire widget to work together, they can deliver a better experience, and there are people who are willing to pay for that, especially when the starting point is that it's really, really fast.
> Now you got a big problem, because neither Intel, AMD, or Nvidia are going to license their intellectual property
I keep seeing this line and wonder - "did people miss the news about Nvidia buying Arm?". Sure, there are still plenty of hurdles to pass until it's official, but take a look at the initial piece written by Nvidia about the acquisition:
> Won’t be the first or the last acquisition to say everything will stay the same.
Then change it a year or so down the road, requiring you to have an account on some other non-related service in order for your hardware to work. /sarcasm
An interesting question here is: which of the big Arm customers are going to trust NVidia?
As far as I understand, Arm's license requires licensees to give Arm detailed information about implementations well in advance (IIRC long before they are fabbed). This would give NVidia unprecedented insight into the competition's processors long before they hit the market. Will Samsung or Apple be happy about that? Will AMD be happy about that?
Given the tension between the US and China, it's almost inconceivable that the Chinese Arm customers will be happy about this. China is the world's biggest processor market, with a government mandate to reduce dependency on foreign technology!
And as you allude to -- AMD wouldn't exist without a license to the x86 ISA from Intel anyway. So all of the major CPU/GPU vendors have some kind of IP licensing in place. The question is now with Apple as a player, how will this shake up the rest of the chip industry. Will we see more cross-licensing or will existing players retreat and hold their IP closer?
Plus, in some cases it is perfectly fine to integrate technology by soldering a discrete package on a PCB. It is not superior to a SoC in all cases, but neither is the inverse true.
In that sense Intel, AMD, or Nvidia are absolutely conveying their intellectual property to end user products through OEMs.
They have to say that, though, don’t they? They need to convince a bunch of regulators that their purchase of ARM won’t cause too much damage. They couldn’t really say that they would be going to choke a whole industry by dropping the licensing model, even if it were the case.
This feels cynical to say, but I don't know whether to listen to Gassee wholeheartedly or to take his perspective with a grain of salt. On the one hand, he has deep experience with Apple and great knowledge of the computing industry, but on the other, he is an elderly technologist who was pushed out of Apple decades ago [1] and is no more relevant to M1 than anyone else in the industry.
In either case, I tend to disagree with Gassee's attack on the Morehead review, and I don't believe that there is any need to connect Morehead's M1 opinion with his 64-bit iPhone opinion. M1 should indeed be judged on its instruction set limitations because that is one of its major differences versus x86. It would be careless to consider personal computing hardware while forcibly ignoring the software ecosystem that it supports.
I also think it's a cheap shot to accuse Morehead of choosing his opinions based on a financial interest in the supremacy of x86.
People on Hacker News are too easily impressed with the performance and the shininess of a new architecture, overlooking the severe issues of Apple locking down its platform and allowing the user less control over the device. What good is a fast machine with long battery life if I can't do what I want on it?
> Will Microsoft now offer an official Windows 10 on ARM that can run on M1 Macs?...This could create trouble in the x86 PC world.
Or will Microsoft, less committed to Windows than it is to its applications running everywhere, be content to see Office and more run (and generate revenue) on M1 Macs…and how will HP, Asus, Dell and others react? This is just the beginning of a new competitive layout.
Why? If everyone gives up and just builds everything on the web then it's an indirect boon for Linux, but M1 will probably never ever properly support Linux.
Unless Apple has a complete change of heart, if it ever happens it will be the result of far too much work done for free.
The reality is that some of the drivers for the M1 chip will never arrive, but that if you want to run Linux (or Windows) Arm code then you can do so soon.
The issue for me is that I'd love to have an ARM laptop but I also love playing with the hardware and measuring it, trusting the hypervisor makes that a lot harder.
If I was a web developer I'd probably buy one (not a knock on web developers, if I was any good at it I'd do it), but I like playing in the grey area of the Venn diagram where microarchitecture, compilers, and kernels meet.
I was a little confused by this part because MS is already happy for Office to run everywhere, in browsers. Given the in house expertise that’s given us VS Code I wouldn’t be at all surprised to see an Electron powered Office app suite in the not too distant future. Linux would (and does) benefit just as much as Mac.
I think it will benefit Linux but not that way. It will force the x86 vendors to adopt the microarchitectural aspects of the M1 that make it fast in practice, instead of spending so much area/power budget chasing SPEC benchmarks. We will get better chips through competition.
It will force the x86 vendors to adopt the microarchitectural aspects of the M1 that make it fast in practice, instead of spending so much area/power budget chasing SPEC benchmarks. We will get better chips through competition.
There's only so much you can do with a 40-year-old instruction set. There are certain things neither ARM nor Intel can do compared to the M1.
Example: the maximum number of instruction decoders Intel and AMD can use is 4, and that brings tons of complexity because x86 instructions range in size from 1 to 15 bytes. There's no way for the processor to know where the instruction boundaries lie except through what is essentially trial and error.
All ARM instructions are one size, so decoding them and keeping many instructions in flight at any given time is much easier. The M1 has 8 instruction decoders.
Short answer: the M1 can process twice as many instructions per clock cycle as Intel/AMD processors can.
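A toy sketch of why fixed-width instructions parallelize decode (the byte streams below are made up, not real x86/ARM encodings):

```python
# With fixed 4-byte instructions, decoder i can start at offset 4*i
# immediately, with no dependency on the other decoders. With
# variable-length instructions (1-15 bytes on x86), each boundary is
# only known after the previous instruction has been decoded.

def decode_fixed(stream, n_decoders=8):
    """All decoders start at known offsets simultaneously."""
    return [stream[i * 4:(i + 1) * 4] for i in range(n_decoders)]

def decode_variable(stream, lengths):
    """Boundaries are discovered sequentially, one instruction at a time."""
    out, offset = [], 0
    for length in lengths:          # inherently serial loop
        out.append(stream[offset:offset + length])
        offset += length
    return out

fixed = decode_fixed(bytes(range(32)))
assert len(fixed) == 8 and all(len(chunk) == 4 for chunk in fixed)

variable = decode_variable(bytes(range(10)), lengths=[1, 3, 6])
assert [len(chunk) for chunk in variable] == [1, 3, 6]
```

Real decoders use length predictors and marker bits to soften this, but the fixed-offset slicing above is the structural advantage being described.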
The M1 and the Zen3 run neck and neck, with the Zen3 being slightly faster in some benchmarks.
But the Zen3 runs at 5 GHz and the M1 runs at 3.2 GHz, meaning if the M1 ran at 5 GHz, it would blow the Zen3 away. And of course, on a performance per watt basis, the M1 is the obvious winner.
This article gets into all the reasons why Intel and AMD are between a rock and a hard place when it comes to competing with the M1:
It sounds like you are just regurgitating some blog post you found. I don't have any workloads that are decode starved on x86 so it wouldn't make any difference if Intel had 50 decoders. I do have tons of workloads that spend a lot of time on TLB misses. I have several that are cache fill limited.
It’s not about being starved for anything; it’s that Intel and AMD have legacy issues they can’t get around. They’re pretty much out of bullets: more cores, higher clock frequencies, better manufacturing, etc.
The point is they can’t process as many instructions per clock cycle and there’s nothing they can do about it. They can’t match the M1’s performance per watt.
And certainly there’s no Intel/AMD laptop processor with 8 or 16 GB of RAM via a 128-bit memory bus on the same die. Same thing with a 16-core Neural Engine and an OS with the APIs to take advantage of it.
Intel's current generation CPUs have the same dual-channel LPDDR4X memory that Apple uses. Memory is not "on the same die" for either company because you don't make DRAM on the same process where you make CPUs.
The grandparent used the wrong term. The memory isn't on the die, but it is in the package (MCM). I don't think that provides faster throughput, but it should help latency.
Regardless, the big win was giving unified access to that memory using a memory fabric. That lets the GPU (for instance) access graphics assets with no copying - it just uses them in place.
AMD can absolutely do the same thing, if it desires. Intel can too, but it has a hill to climb WRT GPU performance.
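A conceptual sketch of the zero-copy difference (toy Python classes, not any real graphics API):

```python
# Discrete-GPU model: assets live in system RAM and must be copied
# into a separate VRAM pool before the GPU can touch them.
class DiscreteGPU:
    def __init__(self):
        self.vram = {}                   # separate memory pool

    def upload(self, name, asset):
        self.vram[name] = list(asset)    # copy required

# Unified-memory model: CPU and GPU share one pool behind a memory
# fabric, so the GPU reads the asset in place.
class UnifiedGPU:
    def __init__(self, system_memory):
        self.memory = system_memory      # same pool, no copy

    def use(self, name):
        return self.memory[name]         # zero-copy access

system_memory = {"texture": [1, 2, 3]}
gpu = UnifiedGPU(system_memory)
# The GPU sees the very same object the CPU wrote; nothing was copied.
assert gpu.use("texture") is system_memory["texture"]
```

The win isn't raw bandwidth so much as eliminating the upload step and the duplicate copy, which matters most for workloads that bounce data between CPU and GPU.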
I don’t think these goals are exclusive. I am not sure about the M1, but the A13 and A14 are doing quite well in SPEC benchmarks. If they have compromised these scores, it’s not obvious to me.
I don't think they've compromised on SPEC performance at Apple, but I think they've chased it too hard at Intel. I'm sure I'd rather have more iTLB entries and more L1i cache than that third 512x512 FMA unit, if that's a trade that can be made.
Sorry I misinterpreted. Isn’t this a reflection of the primary markets for both companies, though? When Intel is trying to sell chips for HPC clusters and expensive workstations, impractical vector units can make sense. Clearly Apple is not bothered by that and has a more balanced micro architecture for more common use cases.
Second, not third, and chasing benchmark scores (instead of real workloads) would result in exactly the trade off you’re asking for; most pieces of SPEC will barely benefit from the second AVX512 FMA at all.
Office is already running on Mac OS. M1 does not change anything in that regard.
So the only meaning that remains that would make sense is: how would HP, Asus, Dell, and others react if MS sold non-OEM licenses for Windows 10 for ARM.
Given that for now I suspect they don't ship a lot of ARM laptops, that boils down to: would the M1 hardware be unfair competition?
I don't think so. MS's x86 emulation is likely inferior to what Apple is doing for now, and I'm not sure if/how quickly they could make use of the M1-specific capabilities even if they wanted to. So at least the market won't shift quickly.
Is there already an in-depth analysis of the M1 by someone with a low-level perspective? I want to develop an intuition on why/where the M1 is fast. E.g. is it branch prediction, reordering, memory latency. Unfortunately, most articles don’t go any deeper than “seems to be optimized for macOS”.
The biggest innovation of the M1 ecosystem is the unification of software ecosystem, battery life and performance for phones, tablets, laptops and desktops.
Many iOS apps can now be run on MacOS via IPA files, e.g. native Slack iOS client app instead of bloated and memory-hogging web/electron client.
For the long-suffering power users of iPads, a Macbook Air offers most of the benefits, plus a proper terminal and toolchains. With early virtualization support, Linux can be run in a VM.
Plus all-day battery life and proper docking support for external peripherals. This has the fewest compromises of any device in the last few years. The MBA has a good keyboard, smallish bezels, an ESC key, TouchID for auth, and is relatively affordable, with the same weight and less cost than an iPad Pro + Magic Keyboard. The MBA is now the perfect "thin client" that also works offline.
As an example: the usable and powerful iOS video editor LumaFusion is now available on M1 MacOS via app store. It costs a fraction of traditional video editors and works well. Looking forward to a new class of prosumer apps that can raise the bar for both iOS and MacOS.
>Many iOS apps can now be run on MacOS via IPA files, e.g. native Slack iOS client app instead of bloated and memory-hogging web/electron client.
I expected iOS apps on the Mac to be absolute trash (and they are... at least by Mac standards), so there's really something to be said about the state of the web when the iOS variants of these apps (running on a Mac) still provide a better UX than the web variants.
I know when Twitter for Mac came out (another unoptimized iOS app running on the Mac), I installed it and never once looked back at their bloated web client. And I know that if I could get the apps for Gmail, Youtube, Netflix, Reddit and other popular websites on my Intel Mac, I'd likely never go back to their web versions either (seriously, Apollo for iOS is so much better than the Reddit website).
The open web is important. But until the web provides a UX of comparable quality to native clients, users will understandably flock to closed app sandboxes when given the opportunity. Great web apps do exist, but us haphazardly bolting on framework after framework is killing the performance and UX of the web. Sticking our heads in the sand about this isn't doing the open web any favours.
Well said! And it really is all-day battery life, too! I myself got the MBP 13" and I have put it through its paces ever since I got it. The battery life, temperature, and performance are amazing. Can't wait to see what they do with the M1 performance variant they'll be putting into the 16".
It's only disruption if it disrupts. No one cares other than existing Apple users, so I guess the only disruption will be the year of really fast and efficient teething issues they will be dealing with.
We have one at work, I'd highly recommend waiting another year before getting one if you want to be productive.
If this was just a new CPU option with a performance boost over Intel (and significantly better thermals) but a compatibility mess - I would agree.
But the MBP is a $500 price drop from the Intel variant. At these price points these laptops are a far better value proposition than the competition: Intel thermal performance and power consumption is just terrible, and AMD is better but not present in anything premium. In terms of value for money, anyone looking at a new premium 13-inch laptop right now should IMO get one of these devices unless they have a really good reason not to (and admittedly there are plenty of those).
I really hate this situation because Apple is not the company I want to buy hardware from. They really suck at giving options; it's all "Apple knows what's best for you" from them and their cult (for example, I want a laptop with a touch screen, and ideally a 360 hinge, but there is no touch on MacOS because Apple). I would pay double for an X1 Yoga with an M1-class CPU. But Intel has dragged the x86 laptop market down for so long, and is still the only game in town for competition; it's just sad. AMD has good chips but no good devices. Modern ultraportable laptops throttle, overheat, and either die instantly or deliver no real working performance; the tradeoffs are just terrible for professionals.
AMD is in Surface laptop 3, and in Thinkpad T14 (supposedly, though the ship times are reputedly several months.) The T14 is premium in a certain class (high value placed on swapping parts.) I suspect there are a few more.
Surface Laptop and T14 are not premium ultraportables; Surface Pro and X1 are, and unfortunately the leaks for the upcoming models don't even show AMD CPUs (I would buy one instantly if it came with a 5xxx-series AMD chip).
> Anyone in the market for an ultrabook will care.
My current ultrabook is 12 years old and falling apart, but the new Apple designs aren't even in the picture as replacements because they're simply not general-purpose computers. They are appliances.
Yeah, but the end user looks at a general-purpose computer that runs at 60-80°C with 6-8 hours of battery and an appliance that runs at 30-40°C with 18-20 hours of battery, and one of them is clearly better for what they actually use it for.
The problems with Microsoft's failed attempt can be summed up in a few points:
- They failed to provide a decent ecosystem
- They wanted to carry on decades of legacy bits
- They failed to provide fast hardware
- All of that resulted in a timid attempt with poor performance
Apple didn't have all these issues, because it is in their culture to provide software that works, without much bloat, and with great hardware.
Their forbidding JIT for desktop applications also helped on the performance side: everything feels snappy, because nothing needs to be recompiled at each startup.
> Apple didn't have all these issues, because it is in their culture, to provide software that works, without much bloat and with great hardware
It's funny you write this when the linked article specifically calls out Apple's software for a few "warts" and even agrees with Moorhead's and Gruber's take that the software is not yet fully baked.
A credible ARM response will probably come in about 1-2 years from Intel/AMD/Qualcomm/etc.
This is the second major opportunity for Linux to make an actual push onto the desktop.
Windows 8/10's abhorrent design was left at the roadside, but arguably there was no motivation from big backers to get Linux up to par on desktop.
This time may be different.
Why would AMD/Intel/Qualcomm wait/hope/gamble that Microsoft executes the transition? Especially given the major shortcomings in UI? The architecture transition will destroy Microsoft's compatibility advantage.
Why not hedge with a really well-supported Linux desktop alternative they personally/collectively fund and push? Personally I think not pursuing it is corporate malfeasance, given the scale of revenue at stake. Or a BeOS.
The architecture switch will probably limit the hardware that needs support. A Linux-based, or at least OSS-based, x86 emulation somewhat on par with Rosetta will be a critical component as well.
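For illustration, a Rosetta-style ahead-of-time translator in miniature. The mnemonics are made up, not real x86/ARM encodings; the point is that translation happens once, up front, instead of interpreting on every run:

```python
# Toy AOT binary translation: map each source-ISA instruction to one
# or more target-ISA instructions. One-to-many expansion is common
# because a single x86 instruction can need several ARM instructions.
TRANSLATION_TABLE = {
    "x86_add": ["arm_add"],
    "x86_push": ["arm_sub_sp", "arm_store"],  # one-to-many expansion
}

def translate(program):
    """Translate a whole program once; the result then runs natively."""
    out = []
    for insn in program:
        out.extend(TRANSLATION_TABLE[insn])
    return out

native = translate(["x86_add", "x86_push"])
assert native == ["arm_add", "arm_sub_sp", "arm_store"]
```

A real translator also has to handle indirect jumps, self-modifying code, and memory-ordering differences, which is where most of the engineering effort goes.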
I am looking at two compatibility points regarding M1-macOS. First, will VMware (and Parallels and VirtualBox) run x86 Windows VMs on M1, and second, will Linux run natively on M1 (even though I don’t do much Linux)? Until these two things are taken care of, M1-macOS will just be (an important) niche system.
I do give Apple credit for a terrific design. It would be interesting to know how much of the performance is attributable to the smaller process size, on-package memory, etc. Things others will be able to replicate. It's not clear to me how much of it is true competitive advantage.
I would assume you'd have to account for what ARM lets them get away with (if you will) - e.g. ARM allows wider instruction decoding - while also factoring in the increased density that gets you more bang for the same buck (so to speak).
If you compare the A series chips Apple has been designing and compare them to the Snapdragon line, it's pretty clear that process size doesn't matter much; the Android phones have been trying to catch up for 7 generations and still fail.
> Will Microsoft now offer an official Windows 10 on ARM that can run on M1 Macs? ... and how will HP, Asus, Dell and others react?
I guess Windows will come to the M1, including natively. I confess that I see the M1 as a "sustaining innovation" rather than a "disruptive innovation" using terminology that Christensen [1] (and Dediu [2]) use.
I actually used whatever was pre-iPod - I think it was a phone you loaded your favorite 20 songs onto? Either way, for all the talking down of the iPod when it came out, for folks who wanted to listen to music it was total magic. I bought the first iPhone too, and didn’t have a clue that it wasn’t a good email device. I still use only Windows / Linux server-side, but this new “gimmick” of the M1 does have me interested.
There were more than a couple MP3 players before the iPod. I had a cool little device from Sandisk with, I think, a 512MB solid state drive? I bet it's still kicking. That thing was way cooler than an iPod, it ran off a AA battery (I think) and it seemed pretty tough. I dropped it in a slushy road puddle and it just kept trucking. The iPod had good capacity but the spinning drive in there was a huge failure point.
I remember (at about the same time CDMA cell phones began getting popular) a standalone device with 32MB of storage, enough for like 20 songs or so. I think this was the one:
I loved my Sony Ericsson phones. With a decent memory stick and headphones it felt like the future. In a way it was, but there were many limitations and I jumped to the iPhone one year after it became available without regrets.
There's a lesson for all of us not to leap to conclusions.
In the case of the Apple iPhone I didn't see it and only boarded the train by iPhone 4 (and honestly, I think it was about the right time). In the case of M1 I _did_ expect a revolution and frankly one I had been waiting for for decades (the true cost of x86 is finally exposed). I can't wait to see if Apple can keep up the momentum.
They've reinvented x86 at least twice: microcode and then x86_64. I wouldn't bury x86 yet.
Plus, a good chunk of the M1's computing prowess isn't even the CPU; it's the additional hardware such as the neural engine cores. That is something you could put next to a SPARC core; it's not ARM-specific.
There is a fundamental complexity in the semantics of x86_64 instructions that makes them expensive to implement. I worked for Transmeta and am at another microprocessor company today, so I do have first-hand experience.
There are other things in the M1 that help make M1-based systems fast. I'm not talking about those; I'm referring exclusively to pure-core performance. It is true that the improved latency and high bandwidth of the nearby DRAM help here, but they are far from the major factor.
FWIW, there's no evidence that TSMC's N5 node is the major factor, beyond letting Apple spend a lot of transistors.
Intel/AMD have so far been able to overcome that overhead with lots of transistors and power, afforded by leading-edge nodes. Apple is now on the same playing field, and the power-efficiency difference in particular is obvious.
Nah, this is it for x86. The only relevant piece of x86 binary software is the Windows ecosystem ... and that runs on arm now. Everything else is either interpreted (Python, Ruby), JIT'ed (Java, C#) or can be compiled natively to arm anyway. All Apple have done is prove that it's possible.
However, I'm doubtful that the PC market will move to a closed system similar to Apple's. So whoever comes in with ARM had better be prepared for lots of competition and low margins in a cutthroat environment. That's not a very attractive market proposition, especially since the x86 makers won't just take this lying down.
I think we'll be in a weird place soon.
Apple will be doing Apple things and increasing its market share but never taking over (maybe it will reach 20-25%? who knows).
Many server environments will switch to ARM, especially the big cloud providers that can make their own chips.
Qualcomm will push some more desktop class SOCs but I doubt they'll reach Apple's performance levels so they risk having their lunch eaten, especially by AMD.
So it's possible that the PC desktop/laptop market might be "stuck" with x86 for many more decades.
There is also a scenario where Intel or AMD come up with some brilliant designs and catch up completely or well enough to Apple in which case this will all have been a tempest in a teacup.
>and honestly, I think it was about the right time
Why so? I got the second model (the 3G), and if I could have, I'd have gotten the first. It was already a giant leap beyond the smartphones that came before (and I had the "flagship" models from other companies, which were shit compared to it).
I didn't see it because I didn't really use my phone. It was used entirely to phone taxis, order pizza, and as a pager-of-sorts for the office to let me know when it was Overtime time.
It didn't "click" until I had to replace my iPod and got the iPod touch. It didn't take very long for me to realise the one thing it was missing was connectivity.
Realistically, I'm not even sure I was wrong. Calls and SMS are among the functions I use the least on my phone. It's still an iPod with connectivity.
Where are some benchmarks comparing M1 to the best x86 processors from AMD? Since M1 is only 4 big cores maybe it would be fair to compare it to the 4 core 8 thread APUs and not crush the poor thing with an 8 core version.
The M1 is cool. Good for Apple, but it's not the computing miracle it's made out to be.
Who has the best CPU is irrelevant. What is relevant is a single driving force that can deliver a coordinated response to Apple as a single, inseparable package, not a shoebox of DIY parts. And the trouble is that the PC industry does not have one.
Back in the 80s it was IBM, the PC product company with a single product vision and the industry's trend setter (the PC was hardware and software in one package, designed to make the most of each other). IBM's last failed attempt to lead the industry was the introduction of the Micro Channel architecture, which no one wanted to license due to high royalties. After that, IBM gave up and became just another PC vendor. This is why we have had the PC world as it's been for the last 30+ years (democracy) vs Apple (benevolent dictatorship).
Different players (Intel/AMD, Microsoft, etc.) have different stakes in the same game (or hold different pieces of the same puzzle, depending on how one pleases to see it), and they may or may not have a business incentive to play nicely with each other. One potential solution could be completely opening up the x86 architecture and giving everyone equal rights to the IP pool, so that a situation like the Micro Channel royalties does not come up again (i.e. business units do not have to license technology components from or to each other). But then again, a single driving force would still be required to deliver a single product. Perhaps a new contender could then emerge.
And an interesting thought exercise is the one where the PC market decides it does not need a disruption, remains content with the status quo, and just soldiers on. Which is not entirely implausible, either.
You’re right, it doesn’t stand up to the multi core performance of the absolute top of the line Xeon chip. But that chip probably cost more than the entire laptop with an M1 in it.
Apple started at the bottom of their lineup. Given the performance of the processor compared not only to the processors that it replaces but to others higher in their lineups, it’s extremely impressive.
Is it the absolute number one best in every metric on the planet? No. But you shouldn’t expect that part in a $999 laptop.
This was their first try, at the low end. In six months (or so), when they release stronger Macs with a different chip (or an M2), we'll see how well they scale it.
I keep seeing this divide where people that are used to Apple products using Intel chips are only capable of comparing the Apple Silicon M1 to Intel products.
Here, in response to someone discussing AMD processors, the Intel Xeon is raised as an example.
Sorry, no - the Intel Xeon is not an AMD processor. To be sure, the APUs found in $999 laptops should be compared to the M1. The AMD Ryzen 9 4900HS, for example, is a 35W part, though it's more of a $1200-laptop part. The Ryzen 7 4800U and Ryzen 7 4800H are both found in $999 laptops, though. There are definitely cases where the M1 wins (many single-core benchmarks, and efficiency) and other cases where the 4900HS, 4800H, and sometimes even the 4800U win (several benchmarks that take advantage of the 8 cores/16 threads, and the higher TDP).
You don't need to go to the high-end server chips, like the 64-core EPYC, to find competition for the M1, but you do have to take a step outside the Apple ecosystem, which only has the old Intel parts and the new Apple Silicon.
That's not how things work in the Apple world. Anything Apple release is disruptive and innovative even if its been done before. Just ignore it and continue on in reality.
And by dumping x86 and switching to ARM, Apple really has a lot riding on its ARM processor. So we can understand the hint of desperation in pushing and publicising the M1 so much, especially targeting the Christmas shoppers. I have mixed feelings about this hype, for the following reasons:
1. ARM processors have been ready for the desktop market for a really long time. Hell, if the buggy and resource-heavy Windows 95/98 could run "smoothly" on a single-core 200 MHz Pentium MMX, today's faster, multi-core ARMs have the potential to do much more with the right software. And that's exactly what has kept ARM processors out of the desktop market so far: decent system software and applications. It is in this area that Apple really deserves commendation, for designing an ARM SoC well optimised for their system software (the macOS / iOS platform).
2. With a well-designed SoC and a decently optimised OS, it is not surprising that the M1 is able to compete with and match the performance of current Intel and AMD chips. The low power draw and longer battery life are just the icing on the cake that ARM is already famous for. Apple is doing what it does best: using its strength of integrating software and hardware to get the best possible performance out of a device.
3. Competition is good for us consumers. But for those of us like me who prefer "freer and open" systems, I'd place my bet on Intel and AMD in the long run. They still have a huge advantage over other processors not only in raw performance, but in portability and backward compatibility of software. Hell, this is one of the reasons many of us, hesitant of Apple's famous obsessive control over their devices, were swayed to purchase an Intel Mac: if Apple turned the screw on us, we could theoretically turn away by switching to another OS.
4. And as the Intel-AMD competition has shown us, both have bested each other in the past at some point, and continue to do so. As Apple's past switching of CPUs and architectures shows, even they recognize that marketing can only get you so far, and people won't pay a premium if the hardware isn't at least equal to the competition. In a year or two, both Intel and AMD will have even better processors. Whether Apple can sustain competing with them over the long term will be interesting to watch.
5. As I mentioned before, Intel and AMD's huge advantage over ARM processors in general, and over Apple's processor specifically, is that they are more "open and free" compared to the ARM platform's jumble of closed, customised SoCs. We all know Apple's nature here: they are extremely obsessive about controlling the device totally, and least interested in letting users customise it outside the bounds they set (hardware or software). A highly customised, closed ARM SoC gives them precisely the kind of control on the desktop that they have desired for a long time. The closed iOS platform has proved to them that they can earn RECURRING revenue from such devices - they earn billions of dollars from the App Store. Apple's desktop processors now allow them to do the same with Mac desktops.
And that's why, as an Apple user who owns a Mac and is typing this on his iPad, I don't celebrate the Apple processors or their temporary "game changing" performance. I don't want to live in a gilded cage, deprived of my computing freedom.
And I know it doesn't matter to Apple - it's still hugely profitable even if they lose hundreds of thousands of users like me, because they'd rather have millions of non-tech-savvy users who see a smartphone or a tablet just like their TV or microwave: a day-to-day appliance whose workings they don't understand but are glad to have. Apple can make more money from these consumers than from us.
It was a sad day when we developers had to start paying Apple for the privilege of developing and distributing software on their platform. It was disconcerting as a user to pay an "Apple tax" when you aren't even buying the software from them. It is depressing now that Apple's processor has tightened this stranglehold even further, with the push to convert the computer into a calculator-like appliance that will run only Apple-authorised software. I celebrate the innovations but have no reason to celebrate Apple's role, and I can only hope it fails. Its success will invite further imitation of Apple's business practices and the pushing of such closed ARM SoCs by others in this industry - a future of fragmented, incompatible, custom, closed ARM processors is not something I look forward to at all, as a developer or a user.
An easy way to understand the M1 is that Apple simply "iPhone-ized" the PC (or the Mac). Instead of the modular architecture PCs are known for (a motherboard with separate CPU, RAM, storage, GPU, and chips for ports, etc.), everything is mashed together. This gives the best performance but very little modularity and zero upgradability. It's great for Apple but kind of shuts out all other ecosystems. It's unlikely anything but macOS will ever run on the M1, since Apple won't open up the specs - and why should they? It is also another step in the direction of total throw-away culture.
As a longtime Apple user, I am shocked to find that nobody cares about the telemetry malpractice in Apple software. If the M1 is the future of Apple computers, it reminds me of an era when running a Mac was a brilliant idea but the software ecosystem was absent. The move to Intel chips actually widened Apple's adoption as a platform.
There are two milestones in this transition: hardware and software. On the hardware and SoC-optimisation side, Apple has delivered significant results. Let's see how the software-adoption part goes and how third-party, non-App Store apps perform through this transition. And let's not forget the lessons of the past: hardware is a vehicle for running software, and iPhone and iPad apps are not an alternative to desktop software.
From the published measurements it compares favourably to some mobile CPUs.
It doesn't look particularly good compared to existing high-performance CPUs (in either peak perf or perf/$).
If having more computing power was so transformative for you (certainly I consider fast computing transformative for me), it was already available; you didn't need Apple's permission to use it.
Maybe I've just been running dual 24+ core CPUs for too long (thank you, engineering samples) and I've lost perspective on how far behind typical desktop users are.
It's a massive improvement for non-gaming laptop users. Most laptops in 2020 overheat constantly and are slow and loud as a result.
The non-M1 MacBooks are next to unusable because they will burn your legs just running Safari. Now they've reached a higher level of performance without even having a fan. It's not hard to see why this is pretty huge news.
The M1 MacBook Air seems like the best laptop for the average person, by a fairly long shot, right now.
Laptops are Actual Computers and while it's nice to have ssh access to my home box or something even beefier in AWS/Azure, I do like the idea of having lots of performance with great battery life and low temperatures all in one sleek package. (It also plays nicely with my iPhone which is super cool)
Perf/W is what matters for laptops. I'm currently on a laptop with an RTX 2070 Max-Q and a 6-core/12-thread Intel i7, but I'm paying for it in weight on my travels (the power adapter alone takes up a big part of my suitcase).
I'll see in half a year how Zen 3 compares to the M1X, as this is the first time I'm considering buying an Apple laptop for myself.