I want it, but I'm not getting it until all the hardware works under Linux.
MacOS is great, but someday they will stop updating MacOS for this. I see it as a device with incredible longevity because of the fanless thermals and how it sips energy. I could see myself using it 10 years later. I want to run MacOS or Fedora Silverblue, with Silverblue being my true love. Immutable OS images <3
HAH, downvoted in seconds. Would not be surprised if there are Apple employees brigading this.
I don't think you're downvoted because of fanboys.
I think you're downvoted because:
1. Linux is coming to Apple Silicon sooner or later. You can buy now, enjoy macOS, and then install Linux later.
2. It's not reasonable to expect to use it for 10 years. Even if Apple stops supporting it, you can still use it. In addition, Macs have insane resale value. So you can sell it in 5 years, and then buy a new one. This is a more reasonable approach than yours.
3. Apple tends to support their laptops for a long time. Usually 7-8 years. And given that this is their own chip and it's extremely powerful, I can see it being supported for 8 years at least.
1) We are now 2 generations in without full Linux support for the hardware.
2) I used my PowerBook G4 for 12 years, mostly because I was a kid with no money. When I got something else, my sister used it for another 3 years. With thermals as they are in this device, instead of selling it after 4-5 years I'd rather keep it for one-off projects as a server, like a Mac Mini. I know laptops aren't designed for server work, but I love that it's a server with a built-in terminal. It's also a device I'd take hiking, because I could charge it from backpack solar.
3) The worst experience I've had was telling my dad MacOS Catalina couldn't be installed on his $3.5k iMac.
As you said in your other comment, M3 will be 3nm TSMC? Maybe Linux will look good on Apple Silicon then.
You have pretty good support except for the GPU at this point (admittedly, I'm waiting for that before I install it on my machine). They've had to write all the drivers from scratch, so it was bound to take a while. But on the plus side, the hardware interfaces seem to be stable across generations. M2 parity with M1 was implemented in 48 hours! So I'm pretty confident it will mostly be a case of Linux support continuously improving rather than resetting with each new generation.
The GPU is important... AFAIK the keyboard and trackpad aren't working yet in Asahi Linux.
I don't expect Apple to support the Linux community. It feels like this is trending in one direction. It felt like things stopped being "favorable" to us when they stopped supporting OpenGL and made no effort with Vulkan. The touchbar and some wifi chipsets were poorly supported for years before M1 debuted.
What has been implemented in Asahi is impressive, but it's not ready to be a daily driver. I hope this becomes my mobile device of choice someday. I'd use MacOS for work-work, and I would choose Linux every time for fun-work. So tired of everything having telemetry and vendor lock-in and basic pieces of software moving to subscription models.
Apple laptops became the dev machines of choice because they embraced the OSS community in pretty big ways. Right now the water feels tepid.
> Apple laptops became the dev machines of choice because they embraced the OSS community in pretty big ways. Right now the water feels tepid.
You could triple boot Windows, MacOS and Linux with Intel Macs. A Mac was a great choice because you could develop for and support all 3 of those platforms (plus Android and iOS). Aside from having bad GPUs, expensive storage, and little modularity, they were great machines for development.
Now, with no Linux or Windows support, not so much (unless you don't need anything but MacOS). Unfortunately, if you need to support MacOS or iOS, you don't have much of a choice. It's just really unfortunate that what used to be possible with one machine may soon require two. As long as Apple supports any Intel Mac, many Macs will be able to get OS updates thanks to the OpenCore Legacy Patcher [1]. I'm not sure when Apple will drop Intel Macs, though. The last Mac to use an Intel processor was released in 2020, so there will hopefully be quite a few years of support left for Intel Macs for the time being.
Windows and Linux both work in VMs on Apple Silicon Macs. I suspect support for bare metal will come in time (definitely for Linux; Windows, I guess we'll have to see, but I wouldn't be too surprised).
This is something I'd be comfortable maintaining, but I couldn't leave it with my dad. (I work on the road for months, and it's unpredictable when I'll come back. The longest I was out was 2 years...)
> 2. It's not reasonable to expect to use it for 10 years. Even if Apple stops supporting it, you can still use it. In addition, Macs have insane resale value. So you can sell it in 5 years, and then buy a new one. This is a more reasonable approach than yours.
Why is that? I still use my 2015 15" MacBook Pro, which was released 7 years ago, and I have no intention of replacing it anytime soon. This isn't the 70s, 80s, 90s, or early 2000s: the improvements in performance from one year to the next are far more modest, and further improvements are far less impactful than they once were. My 2015 MBP is still as capable of browsing the web, writing code, and editing video as it was back in 2016 when I got it. It's not in any way slow at any of those tasks.
The performance of an M2 MacBook would undoubtedly be much better, but am I actually going to be able to browse the web faster or write code any faster? Probably not.
A new 16-inch MacBook Pro with 1TB storage emits approximately 620kg of CO2e in its manufacturing process (The Carbon Footprint of Everything by Mike Berners-Lee, 2020 second edition, page 140). Why should someone emit more than half a ton of CO2e for a new laptop unless they really need it?
If you have a Mac that no longer gets OS updates by Apple, as long as it was released within the past decade you should be able to use OpenCore Legacy Patcher to update it to macOS Monterey (and later Ventura when it releases) without much issue. If it's older than that, you might run into some issues, but generally speaking everything from late 2008 and beyond is supported, with everything from late 2012 and beyond being fully supported.
> The performance of an M2 MacBook would undoubtedly be much better, but am I actually going to be able to browse the web faster or write code any faster?
I split my coding time between a 2017 retina MacBook and a 2018 Mac Mini. Code compiles very much faster on the Mini -> faster iterations. So yes, you may literally code faster on a faster computer.
I switched to a Surface Book when the Mac jumped the shark in '16.
Went through a few generations of that, it was "ok".
Bought the cheapest 13" M1 I could because I had to debug some stuff on the iPhone; I was pretty angry about it.
That day I switched over to it as my daily driver.
The performance absolutely blew my mind.
Stuff I'd wait for one to two minutes to build on the SB would build in a few seconds on the M1.
Debugging + video call + building went from crushing my laptop to being not really noticeable.
I was using it heavily for two days and realized I forgot to plug it in. Was only down to 50% battery.
I'm very far from being an Apple fanboy, but this little laptop has completely blown my mind. Simply the best hardware I've ever owned, and it makes me easily 2x as productive.
And that's all with the cheapest model, with 8 gigs of memory.
If you want to use your laptop for development work, you're going to pick the best tool for the job. Moving to OSX would not only slow things down due to how slow non-native Docker is (no, it is not negligible; yes, I have an M1 Mac and I have tested it in a side-by-side comparison), but you also have much less control over your environment compared to Linux.
The vast majority of the people I've worked with choose to use Linux laptops over Macs, and I don't think CPU efficiency is something that's going to get them to change. While working from home, I can think of zero scenarios where I need more than 8 hours of battery life, which my x86 Intel laptop already exceeds.
I think we're getting faster at iterating on bringing up dev environments.
Silverblue is my favorite, but it's becoming common for me to develop everything inside a Docker image. Are we supposed to update the env and rebuild that image just as quickly as we commit to a project? I'm new to this.
I have a friend who's really big into k8s and ansible. Right now it feels like I'm toying around in 1 pod. He can bring up a set of services around the thing he's developing in a few minutes. I want his power. :x
It's called docker-compose, and it's really simple compared to even a 1 pod k8s. Or maybe I'm misunderstanding what you mean by bringing up a set of services.
Yeah seconding the recommendation for docker-compose, it's a great way to start up multiple services and once you know the syntax, pretty straightforward to use in my limited experience. You can essentially declaratively define which other Docker containers you want to start up (in which order), which volumes they can use, and which ports should be exposed.
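For illustration, a minimal docker-compose.yml might look like the following (a sketch only; the "web" and "db" service names and the app's port are made up):

```yaml
# Hypothetical compose file: "web" and "db" are made-up service names.
version: "3.8"
services:
  web:
    build: .            # build the app image from the local Dockerfile
    ports:
      - "8080:8080"     # expose the app on the host
    depends_on:
      - db              # start the database before the app
  db:
    image: postgres:14
    environment:
      POSTGRES_PASSWORD: example
    volumes:
      - db-data:/var/lib/postgresql/data   # persist data across restarts
volumes:
  db-data:
```

Then `docker compose up` brings the whole set of services up and `docker compose down` tears it down, which is the "bring up a set of services in a few minutes" workflow described above.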
All of my work goes onto a (local/alpha/beta) production Linux box; that's why the Mac is technically just a display.
And when I say local, I mean a Linux server the Mac connects to. The macOS terminal does a nice job of that; I don't think there would be any difference in that regard if the laptop were running Linux.
The new Mac checks all of the hardware boxes that I wanted. Even occasional gaming if I felt like it.
> It's not reasonable to expect to use it for 10 years.
Why ever not? I have 14 year old laptops that run fully up to date Linux just fine. Why can unpaid volunteers do better than one of the most profitable companies on earth?
> So you can sell it in 5 years
That just makes the looming end of support someone else's problem, it doesn't actually solve anything.
I just checked several online auction sites and my 10 year old 2012 quad core Mac Mini can be sold for about $300 while my 5 year old 2017 retina MacBook can be sold for around $500.
That is insane.
* values converted to USD. Prices may differ in countries actually using USD.
> Please don't post insinuations about astroturfing, shilling, bots, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data.
I'm a current Linux desktop (Plasma) user who is really tempted by this machine, given my 6-year-old ThinkPad is showing its age. I'm not sure I can get used to MacOS's window management, even though the rest of the OS looks great. I'm definitely not a power user on Linux (mostly sticking to defaults), but things like not being able to adjust the green button to maximize a window? Ugh.
There are a number of utilities to improve window management on MacOS. I find Rectangle[0] works reliably, without trying to do too much. Easy keyboard shortcuts to maximize, snap to 50% left/right, or quadrants.
I use Rectangle (app) for snapping windows around. I think Ctrl+Cmd+F will full-screen just about every app; if not, it's another keyboard shortcut. You can also assign your own shortcuts in the keyboard preferences.
That's an important distinction, but I think it's getting more common to post for positions you have filled so there are resumes available if someone takes flight. Resumes don't mean someone is available either.
Sure. Would you agree that this is something a JS-centric company is more likely to do (than a java one) because JS engineers are on average younger, and their life is more volatile, and also because the company itself is younger and more volatile?
I'm not saying it's a good idea to give everyone a gun, but I do like the argument that the disadvantaged have the same opportunity to pull a trigger. I hope humanity learns wisdom as quickly as we innovate. We've made it this far..
Where anyone can get a gun? If you live in the middle of nowhere, lots of people around you will have guns (though they could be dozens of miles away in the best case), and obviously where there are more people the police can get guns, or call up the military if they can't for some reason (what country is that though?)
An authorised firearms officer (AFO) is a British police officer who is authorised, and has been trained, to carry and use firearms. The designation is significant because in the United Kingdom most police officers do not routinely carry firearms, although they can be equipped with tasers.
In 2019/20 fiscal year, there were 19,372 police operations throughout England and Wales in which the deployment of firearms was authorised and 6,518 firearms officers, 4.9% of the 132,467 active FTE officers. Following the November 2015 Paris attacks it was decided to significantly increase the numbers of armed officers, particularly in London.
I'd like to think these ideas have been around and well known for decades, and we're starting to see the hardware rise up to support them. If you told someone in 2001 that Linux would be rewritten in, say, Java, with a garbage collector, it would have been dismissed as starkly inefficient. Nowadays we have hardware that makes certain safeties negligible on performance (memory tagging, bounds checking, etc.). Nowadays we can make an OS out of Java and it will stand just fine. Or Rust. Faster hardware means we run slower (and safer) software. Not everything gets isolated into an ASIC, of course.
It would still be dismissed as starkly inefficient, because the problem isn't technology but mentalities.
Midori powered Bing in Asia as proof of its capabilities, and yet the Windows team was dismissive of it and instead went on reinventing .NET with COM, a.k.a. .NET Native and C++/CX, alongside WinRT. Kind of ironic, given that this was .NET's original design (codename Ext-VOS), before they decided to reboot COM with a managed runtime, alongside the J++ issues that caused COOL to become C# instead.
There's quite interesting feedback from Joe Duffy on how everything went down:
> OS and tools for building dependable systems. The Singularity research codebase and design evolved to become the Midori advanced-development OS project. While never reaching commercial release, at one time Midori powered all of Microsoft’s natural language search service for the West Coast and Asia.
Ironically, despite still having the Linux kernel underneath and enough C++, Android and ChromeOS are probably the mainstream OSes that are closer to the overall idea, at least in what concerns userspace applications.
As for the Microsoft side, WinDev seems to always sabotage those ideas, as you can infer from Joe Duffy's talks; and on the Apple side, although Swift is supposed to play a major role going forward, C, C++, and Objective-C still represent the majority of the stack.
>> The Singularity research codebase and design evolved to become the Midori advanced-development OS project. While never reaching commercial release, at one time Midori powered all of Microsoft’s natural language search service for the West Coast and Asia.
> Ironically, despite still having the Linux kernel underneath and enough C++, Android and ChromeOS are probably the mainstream OSes that are closer to the overall idea, at least in what concerns userspace applications.
Singularity uses a proprietary language-based microkernel. Midori was allegedly an attempt at a commercial version of Singularity, and I'm not sure what is different about the two, but Midori also uses a microkernel. Linux famously uses a monolithic kernel. Also, consider that any operating system that uses the Linux kernel is Linux by definition, though technically Linux is the kernel and GNU/Linux is the OS.
There is a piece of container management software, I think that's what it is, called Singularity, and it runs on the Linux kernel. Maybe you were thinking of the wrong Singularity.
Yes, Singularity uses a C++-based microkernel and everything else is written in Sing#; what is your point?
Midori has nothing to do with Singularity in architecture other than being a second attempt, from the same group of researchers, at a memory-safe OS.
Android and Chrome OS aren't Linux; they use a highly customized Linux kernel. In fact, after Project Treble it is so customized that it could almost be considered a pseudo-microkernel: all Treble drivers exist as user processes and use Android IPC to talk to the kernel, and since Android 8 all drivers must be Treble-based (Android considers Linux kernel drivers "legacy drivers").
Finally, I don't see how Singularity and the Linux kernel have anything to do with each other; my examples with Android and ChromeOS are what I mentioned in relation to the Linux kernel.
> Ironically, despite still having the Linux kernel underneath
I misread what you were referring to is all. I thought you were saying Singularity ran on the kernel linux, but you were not saying that at all. You were talking about the GNUless Android and ChromeOS. Comprehension is underrated.
That + the Temporary Containers extension can make every new tab a new container by default. I also use Container Proxy so I can route traffic from each tab through a different proxy if needed (mitmproxy). I've wanted to move to Chrome, but Chrome has nothing like per-tab sessions/isolation. I looked at first-party isolation, but it's vague and doesn't seem like what Firefox provides.
Firefox is the only browser that can do this. Also Google's UI decisions are just unilateral and awful. At least with Firefox we can still tweak some of it.
Still coming to my own conclusion here, but I wouldn't dismiss "easy button" as marketing. We keep hoping for easy buttons and reasonable default settings in things like openssl or pgp. I do like organizations that understand an easy button is the safest default. Is that what we have here?
I'm commenting only on the rhetoric, calling it an "easy button" stinks of marketing BS. People desiring simple straightforward tools is a separate subject.
Of course it’s marketing. My mom doesn’t want to set up uBlock, a script blocker, and a Pi-hole. She’d love to click a button and be safer. What’s the issue here?
That I am on HN and someone trying to convince me their company isn't being shady is using evasive marketing speak instead of candor to an audience that clearly knows better than to believe the weasel words.
This is my own inexperience with OOP showing: I keep wondering if trait/protocol/interface programming has made inheritance entirely worthless. Love Rust. I've been working in Java for other reasons, but I've been noticing more and more Rust-like things in Java over the past year.
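To make the comparison concrete, here's a small sketch (my own toy example, not from the thread; assumes a recent JDK, 21+ for pattern matching in switch) of trait-style programming in modern Java using a sealed interface and records instead of an inheritance hierarchy:

```java
// Toy example: behavior via a closed set of types, not inheritance.
public class Shapes {
    // A sealed interface acts much like a Rust trait over a closed enum:
    // the compiler knows every implementation.
    sealed interface Shape permits Circle, Square {}

    record Circle(double radius) implements Shape {}
    record Square(double side) implements Shape {}

    // Behavior lives in a function that pattern-matches over the closed set,
    // much like a Rust `match`; the compiler checks exhaustiveness, so no
    // default branch is needed.
    static double area(Shape s) {
        return switch (s) {
            case Circle c -> Math.PI * c.radius() * c.radius();
            case Square sq -> sq.side() * sq.side();
        };
    }

    public static void main(String[] args) {
        System.out.println(area(new Square(3))); // prints 9.0
    }
}
```

Records, sealed interfaces, and exhaustive switches are exactly the kind of Rust-like additions that have landed in Java over the last few releases; you get polymorphism without extending a base class.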