The key enabler is the camera. Manage a flagship-level result on a Motorola and you've covered the main reason people pay for high-end devices nowadays.
I’m seeing enthusiasts go out of their way to get Vivos and Xiaomis now that they’re surpassing their Western counterparts, based solely on that.
I think it’s doable; Pixels did it with meh hardware for years. But I’m not sure if there’s enough overlap between people who care about selfie quality and open source enthusiasts.
Motorola Signature and Motorola Razr Fold are ranked above the Pixel 10 Pro on https://www.dxomark.com/smartphones/. Pixels have fantastic camera hardware and software, which is fully functional on GrapheneOS, and that isn't something we'd need to lose on a Motorola flagship. There will be much better CPU and GPU performance via Snapdragon too. The compromises are mostly in terms of getting some security improvements while losing others, but we'll still be able to meet all of our official security requirements.
I haven’t been able to see actual results that match those tests on the Motorolas, sadly. Maybe it’s more accurate in technical terms, but I haven’t found good results in practice.
>Pixels have fantastic camera hardware and software which is fully functional on GrapheneOS which isn't something we need to lose on a Motorola flagship.
This is very interesting to me! Does GrapheneOS manage to keep Google’s processing? How does that work?
The Pixel Camera app is fully supported by GrapheneOS; you just install it from the Play Store (or from other sources). If you don't have Google Photos installed, the last-photo preview won't work, but you can install a 'shim' app that fixes it without needing the Photos app: https://github.com/lukaspieper/Gcam-Services-Provider
I think this is an even clearer case than usual. With software engineers and office work you don’t have legal limitations on who can perform the work, but they exist for lawyers and doctors for example.
So if this is a tool, the fault lies fully with the user, and if it's treated as “another person’s work”, then the user knowingly passed the work to someone not authorized to do it. Either way, the user ends up guilty.
> With software engineers […] you don’t have legal limitations on who can perform the work
While in practice that is true, in theory this is why professional engineering accreditations (I mean like P.Eng., not little certificates) exist. Perhaps we will see a broader professionalization of the profession one day.
> With software engineers and office work you don’t have legal limitations on who can perform the work
Technically true, but if you want the IP to be covered by copyright, you'd better make sure they're not using AI, or you'll find out that there are some serious legal limitations in your future when you aim to either pick up investment or sell your IP.
That's an overly broad statement; the user has copyright over the elements they provide. In an image, if they make manual edits, for example, those are protected. In a modern agentic codebase the code itself is the least valuable part; what counts more are the specs and tests.
> So if this is a tool, the fault lies fully in the user, and if this is treated as “another persons work” then the user knowingly passed the work onto someone not authorized to do it. Both end up in the user being guilty.
I am particularly against this point of view, because we as a community have long touted how computers can do the job better and faster, and how computers don’t make mistakes. When there are bugs, they’re seen as flaws in the system and rectified by programmers.
When there are gaps between user expectations and how the software works, it’s our job to manage those gaps and reduce them.
In the case of AI, we are somehow, probably because we know it’s non-deterministic, turning that social contract we had developed with users on its head.
Now the line is: that’s just the way it is, and it’s up to users to know if the computer is lying to them. We have absolved ourselves of both the technical and the non-technical responsibilities to ensure the computer doesn’t lie to the user, subvert their expectations, or act in a way contrary to human logic.
AI may be different to us in that it’s non-deterministic, but that’s all the more reason we’re responsible for ensuring AI adoption aligns with the social contract we created with users. If we can’t do that with AI, then it’s up to us to stop chasing endless dollars and be forthright with users that facts are optional when it comes to AI.
You used to trust that Excel was reliable. You also used to choose Excel, rather than “Bobby’s super duper calculator”.
The choice of tool is assumed to require expertise. I am not opposed to AI companies being sued for setting invalid expectations, but I don’t think they have at any point expressed that their current product can be used unsupervised as a substitute for a professional.
In other words, companies are to blame for offering professional solutions that aren’t up to the task, but it’s up to every person not to blindly use whatever the conman of the day offers.
Everybody in sales in every software company in the world would be part of that community, I think. Some of the devs, too. Software was always marketed (and discussed with normal people) as something that could automate error-prone tasks, thereby eliminating the inevitable mistakes humans make when performing those tasks. Would Excel be the cornerstone of so many businesses if it sometimes gave the wrong value as a sum of a column?
That marketing (and the fact that, indeed, Excel can sum anything users throw at it without making mistakes) worked; now we have three generations of users who believe that once a computer "gets it" (i.e. the correct software is installed and properly configured), it will perform a task given to it correctly forever. The fact that it's almost true (true in the absence of bugs, changes to the setup, updates, hardware degradation, cosmic rays flipping important bits, etc.) doesn't help - that parenthetical is hard to understand and often omitted when a developer talks to a non-developer.
We've always had software that wasn't as reliable as Excel - speech recognition and OCR come to mind. But in those cases, the errors are plainly visible - they cannot be "confidently wrong". Now we have LLMs that can be confidently wrong, and a vast number of users trained to think that software is either always right or, when it's wrong, it's immediately noticeable.
I don't think developers should bear the whole responsibility here - I think marketing had a much larger role in shaping users' minds. However, devs not clearly communicating the risks of bugs to users (for fear of scaring potential customers or out of laziness) over decades makes us partly responsible as well.
> Software was always marketed (and discussed with normal people) as something that could automate error-prone tasks, thereby eliminating the inevitable mistakes humans make when performing those tasks.
That's far from a community touting that computers don’t make mistakes.
> Would Excel be the cornerstone of so many businesses if it sometimes gave the wrong value as a sum of a column?
You mean like if it was running on a Pentium with the FDIV bug? :)
I agree there's a perception computer output is generally reliable, and that leaves users at the mercy of snake oil parrots that are generally unreliable and are sold without a warning. But I don't agree the cause is that touting.
>If you dig through Windows enthusiast communities
TIL those exist (genuinely).
I’ve never met anyone who likes Windows, just people who put up with it for work/gaming and people who don’t care about the whole thing enough to move from the default (which is totally understandable).
There are people like this, although they're a very small minority. I've met one at university - he was probably the first person to have a Windows 8 laptop with a touchscreen, showing off to everyone how cool it was (at the time).
He was also really good at Microsoft Word, unironically - he made extensive use of custom styling and could format an assignment paper in like 30 seconds. He was super useful in group projects.
Wow it sounds like you're describing exactly me. All the way until the touchscreen laptop with Windows 8. Scary shit!
I used to laugh at the LaTeX masochists in college spending 15 minutes just to put a picture where they wanted the picture to be. They had to add like four 1-character modifiers to the "insert image" command, each of which meant "yes, really here", "no, don't move it to the next page" and "nono, really really here".
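(For the curious, the incantation in question looks roughly like this; just a sketch, using the example-image placeholder graphic that ships with standard TeX distributions:)

    \documentclass{article}
    \usepackage{graphicx}
    \usepackage{float} % provides the [H] specifier

    \begin{document}
    % h = here, t = top, b = bottom, ! = "override LaTeX's better judgment"
    \begin{figure}[!htb]
      \centering
      \includegraphics[width=0.5\textwidth]{example-image} % placeholder graphic
      \caption{Goes roughly where you asked.}
    \end{figure}

    % [H] from the float package: exactly here, even if it leaves ugly gaps
    \begin{figure}[H]
      \centering
      \includegraphics[width=0.5\textwidth]{example-image}
      \caption{Goes exactly here, no really.}
    \end{figure}
    \end{document}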
MS Word is properly great if you only use the custom style rules (basically CSS classes) at the paragraph level, and never directly apply styling (basically inline styles) except for super basic stuff like making a word italic. It has great referencing tools, a fantastic formula editor, and so on. And, well, you can use ultra-modern human-machine interaction technology such as a mouse to choose where a picture goes and how big it is.
(They might've enshittified it since; the last paper I wrote was in 2010 and Word was pretty damn decent back then)
> MS Word is properly great if you only use the custom style rules (basically CSS classes) at the paragraph level, and never directly apply styling (basically inline styles) except for super basic stuff like making a word italic
MS Word also has character styles (like a CSS style on a <span>). IMO you should use those instead of direct bold or italic.
I'm aware, but Word's notorious "I clicked one button and it ruined the formatting of my entire document" stuff doesn't happen if you mark a word as italic or bold here and there in the middle of a sentence. The whole point of only using the style rules is to prevent it doing that.
But yeah, for layout, i.e. headings and the like, only ever use the styles, never "bold, bigger bigger bigger". Don't touch the line-spacing button, etc.
IMO Word could do with a mode where those buttons are simply hidden. Want a bigger, fatter heading? Edit the heading style. There's no other way.
You can turn almost all of those buttons off in the settings and save it as a template. The only complaint I ever got was from someone who wanted to use the highlighter instead of the built-in comment management system.
Yep! Sorry, I just edited that in. Win8 is thoroughly underrated to me. The file open/close dialogs were shit but the start menu was very good. I quite liked the fullscreen apps and am sad they got discontinued. Fullscreen IE browsing with full touch support (e.g. swipe for back/forward, no window chrome in the way to mis-click on, etc.) was very cool. It made every website feel like a fullscreen app. It almost made the terrible browser engine (it was still IE after all) bearable. Almost.
I'm pretty much still on the same setup now, Win11 plus touchscreen. You'll pry my touchscreen out of my cold dead hands. How will I rage-close a "try Chrome" popup without a touch screen? You ever try to rage-click something with a touchpad? Total non-starter.
LaTeX is probably annoying as a Word replacement however RMarkdown with embedded LaTeX saved me sooooo much time on my economics homework in university. Being able to put code, equations, graphs generated by said code, etc... all in one file then simply generate a PDF...
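For anyone who hasn't seen it, a minimal sketch of such a file (the title and toy regression are made up; rendering to PDF assumes the rmarkdown package and a LaTeX install):

    ---
    title: "Econ Problem Set"
    output: pdf_document
    ---

    The OLS estimator is $\hat{\beta} = (X^\top X)^{-1} X^\top y$.

    ```{r}
    fit <- lm(mpg ~ wt, data = mtcars)  # toy regression on a built-in dataset
    summary(fit)$coefficients
    plot(mtcars$wt, mtcars$mpg); abline(fit)  # plot lands straight in the PDF
    ```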
I'm in the camp of liking Windows and having had to put up with Linux and macOS for work. Inertia and familiarity do play a role, but as a dev there are things I really like (ETW + WinDbg immediately come to mind) and really miss on other OSes. I'm not there yet to join an enthusiast group though. ;)
I'm not sure these people like Windows as much as they like what it does for them, but they are willing to put in significant effort to remove the normal Windows roadblocks and annoyances, and thus are willing to hack and chop it to bits to get them closer to their end goals more quickly.
They're not like a car enthusiast who loves their MX5 for its sheer beauty and feel; rather, they love their SUV because of its big boot and because it gets them where they need to be, and thus are perfectly happy to tear out the old radio and uncomfortable seats.
The only difference is that car enthusiasts have many more options to choose from, while in OSes, if you're stuck with Windows, you're usually really stuck with it. Linux is certainly an option, but not one that is universally practical to apply.
I would genuinely enjoy Windows (with WSL) if Microsoft didn’t go to special efforts to make the experience horrible by shoving useless AI functionality or advertising down my throat.
That's honestly more narrow-minded of you than of those "not moving from the default". Maybe you're the one who never went deep into the rabbit hole of what's possible, or never properly learned to use the OS?
As of Windows 11, window management is one of the few things Windows does better than macOS or Linux desktop environments out of the box, thanks to the Windows+Z tiling feature.
And then I use my touchpad to switch between virtual desktops and the jerky animation reminds me why I prefer to run non-game Windows applications remotely from a Mac.
I can confess discovering XP back then made me actively like Windows; that was a long time ago though, and with each new version my liking has sunk to new depths.
I was thrilled for new Windows releases between 3.11 and 8.1. I'm still reasonably fond of Windows for personal use. For now I can still de-enshittify it enough to get back the experience I'm used to, and it's comfortable and convenient. But I'm not sure if that will last for long, given the current trend.
That said, for work I switched to Linux full-time years ago. Native containers are a killer feature for me, and the UX differences and driver/dependency/repository issues aren't significant enough to make me want to go back to virtualization on Windows.
I mean not everyone cheers for the currently best soccer team either, it's partly about what you're invested in. If I had spent many years in Windows dev land I'm sure I would be arguing that side too.
I loved hacking Windows back in the 2000s! It was super-hackable and had PLENTY of very thorough documentation (Petzold's "Programming Windows"!).
You could even do a lot of kernel-level shenanigans with relative impunity thanks to its layered design. You could do some amazing stuff.
As an example, SWSoft released container ("lightweight virtualization") support for Windows in 2005, before containers were even a thing in mainline Linux. They did that by adding a layer of redirection on top of the kernel objects, without having access to the Windows source code.
How is it bigotry? I've never met anyone who likes, say, "Baby Shark" (well, anyone with an age in double-digits). I'd be surprised if many -- possibly any -- exist. But if they do, well, de gustibus non est disputandum. None of my business, and I bear no ill-will towards them.
Not sure how OP is acting with bigotry against Windows users just because they were surprised that there are people who are enthusiastic about Windows.
I share their sentiment; it's like discovering that there's a group of people who are Internet Explorer fans, or avid listeners of the generic no-name pop songs made specifically to be unremarkable background music, the kind they play in my gym to avoid paying royalties. It's just surprising, since I'd never before met anyone who doesn't treat it as something to either put up with or replace with alternatives.
“Seem important and accurate” is correct. It doesn’t imply actual accuracy; the LLM will just use figures that resemble an actual calculation, hiding that they are wild guesses.
I’ve run into the issue trying to use Claude to instrument and analyze some code for performance. It would make claims like “around 500 MB of RAM are being used in this allocation” without evidence.
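If you want actual evidence for a claim like that, real instrumentation is cheap. A minimal sketch in Python with tracemalloc (the list comprehension is a hypothetical stand-in for whatever allocation is in question):

    import tracemalloc

    tracemalloc.start()
    data = [bytes(1024) for _ in range(100_000)]  # stand-in for the suspect allocation
    current, peak = tracemalloc.get_traced_memory()  # both values are in bytes
    print(f"current: {current / 1e6:.1f} MB, peak: {peak / 1e6:.1f} MB")
    tracemalloc.stop()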
I use my MacBook for a mix of dev work and music production and between docker, music libraries, update caches and the like it’s not weird for me to have to go for a fresh install once every year or two.
Once that gets filled up, it’s pretty much impossible to understand where the giant block of storage went.
Yep, it is an awful situation. I'm increasingly becoming frustrated with how Apple keeps disrespecting users.
I downloaded several macOS installers, not for the MacBook I use, but intending to use them to create a partitioned USB installer (they were for macOS versions that clearly wouldn't even run on my current MacBook). Then, after creating the USB, since I was short of space, I deleted the installers, including from the trash.
Weirdly, I did not reclaim any space; I wondered why. After scratching my head for a while, I asked an LLM, which directed me to check the system snapshots. I had previously disabled Time Machine backups and snapshots, and yet I saw these huge system snapshots containing the files I had deleted, and the kicker was, there was no way to delete them!
Again I scratched my head for a while, looking for a solution other than wiping the MacBook and re-installing macOS, and then I had the idea to just restart. Lo and behold, the snapshots were gone after restarting. I was relieved, but also pretty pissed off at Apple.
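For what it's worth, assuming those were APFS local snapshots, they can usually also be listed and deleted from the terminal without a reboot:

    # list local APFS snapshots on the root volume
    tmutil listlocalsnapshots /
    # delete one by the date stamp shown in the listing, e.g.:
    sudo tmutil deletelocalsnapshots 2024-01-15-123456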
It's just as bad on Windows. Operating systems and applications have been using the user's hard drive as a trash dumping ground for decades. Temporary files, logs, caches, caches of caches, settings files, metadata files (desktop.ini, .fseventsd, .Trashes, .Spotlight-V100, .DS_Store). Developers just dump their shit all over your disk as if it belongs to them. I really think apps should have to ask permission before they can write to files outside of a direct user-initiated command.
Because Apple differentiates their products by storage size, and they also sell iCloud subscriptions, there is zero (in fact negative) incentive to respect your storage space.
Been a while since I needed to use it there but it always amazed me that the Windows implementation of iCloud was more flexible in terms of location and ability to decide what files got synced.
Ho ho, except for where it puts the photos. Those go into a subfolder of the system photos folder, and there's no configuration (yet you can configure the "shared photos" location)
And then, should you try to set up OneDrive (despite Microsoft's shenanigans, it does simplify taking care of non-tech-savvy relatives), it will refuse to sync the photos folder because 'it contains another cloud storage' and you'll genuinely wonder how or why anyone uses computers anymore
I had the same problem and had some luck cleaning things up by enabling "calculate all sizes" in Finder, which will show you the total directory size, and makes it a bit easier to look for where the big stuff is hiding. You'll also want to make sure to look through hidden directories like ~/Library; I found a bunch of Docker-related stuff in there which turned out to be where a lot of my disk space went.
You can enable "calculate all sizes" in Finder with Cmd+J. I think it only works in list view however.
I’d recommend GrandPerspective:[1] it’s really good at displaying this sort of thing, has been around for over two decades, and the developer has managed to keep it to <5MB which is perfect when you’re running very low on space.
I use GP, would recommend it as well; it generates great color-coded tree maps of your storage. Once you get used to navigating it that way, you won’t go back.
Something like ncdu (https://dev.yorhel.nl/ncdu, installable with "brew install ncdu") is great if you are okay with the command line. It's very annoying to drill down in Finder, especially into hidden directories.
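A sketch of an invocation that behaves well on macOS (the exclude matters; see the note about the /Volumes loop further down):

    brew install ncdu
    # -x stays on one filesystem; --exclude /Volumes skips mounted disks
    # and the "/Volumes/Macintosh HD/Volumes/..." loop
    sudo ncdu -x --exclude /Volumes /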
A ton of thanks. This "hack" allowed me to finally see some stuff that was eating up a lot of my space and showing up as "System Data". It turned out the Podman virtual machine on my MacBook had eaten up more than 100 GB!
The trick is to reboot into the recovery partition, disable SIP, then run OmniDiskSweeper as root (as in `sudo /Applications/OmniDiskSweeper.app/Contents/MacOS/OmniDiskSweeper`). Then you can find all kinds of caches that are otherwise hidden by SIP.
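Roughly, for anyone who hasn't done it before (note that disabling SIP weakens security until you re-enable it):

    # 1. Boot into Recovery (Cmd-R at boot, or hold the power button on
    #    Apple Silicon), open Utilities > Terminal, and run:
    csrutil disable
    # 2. Reboot into macOS, then run the sweeper as root:
    sudo /Applications/OmniDiskSweeper.app/Contents/MacOS/OmniDiskSweeper
    # 3. When done, boot back into Recovery and re-enable SIP:
    csrutil enable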
My immediate reaction to this is that the OS has a hard time establishing intent. In some cases it probably should be this hard to delete data that's required for the system to boot, on the grounds that you'd probably want it if you understood what it was; ideally it should also be hard for malware to delete data it doesn't want on your computer (forensically useful logs, backup copies of files encrypted by ransomware, etc.).
But none of this applies to caches and temporary files, which could be reasonably managed for 99% of users by adding a "clear all caches" checkbox in the reboot dialog with a warning that doing this is likely to slow down the system and increase battery usage for the next few hours, or to system-managed snapshots that mostly just need better UI and documentation.
UI transparency is my only real complaint. A reasonable amount of data the system wants to make difficult to delete is fine, so long as it clearly explains what it is and why. "System Data" is only acceptable as a description for the root of what should be a well-documented hierarchy.
Full Disk Access just gives an application the same filesystem powers that your user account has. For most users that means it has administrator-level access, which is the third-highest tier.
There are two levels above an administrator-level account: 1) the root user can access files that an administrator can't (e.g. the files of other users and certain system configuration files), and 2) the kernel and system processes can access "system" files that even root cannot - this is enforced by SIP.
Apple is quite liberal in what they hide away with SIP. It's possible for disk space to leak whereby the OS has decided to store some file that it doesn't need and there is no way to even list such files without following the above instructions - the only indication will be a mysteriously large amount of space taken up by the system.
It goes without saying that if you're going to delete system files you should make sure you know what you're doing.
I should not have to hack through /Library files to regain space on a TB drive because macOS wanted to put 200 GB of crap there in an opaque manner, without giving the user ANY direct way to get their space back.
The exclude for Volumes is necessary because otherwise ncdu ends up in an infinite loop - "/Volumes/Macintosh\ HD/Volumes/" can be repeated ad nauseam and ncdu's -x flag doesn't catch that for whatever reason.
I’m seriously contemplating blocking the news completely. It’s not like me being informed has any influence anyway, now that belief can be manufactured to such an extent. Might as well lower my blood pressure.
Skip the technical blocking step and internalize the fact that it is pretty much all irrelevant to your (the average person’s) day-to-day life. Ideally, then, even knowing it won’t affect your blood pressure. And in any case, do you even have reason to believe the claims are true?
I am commenting in this thread to pass time, but I assume there is such a high chance the linked article is not true, or at least has a hidden agenda, that it should be ignored or treated as entertainment. Could be product placement, could be completely manufactured rage-bait, who knows; more importantly, I have no reason to care.
Yup. I’m super social and extroverted, in the sense that I love meeting new people and if I’m introduced to anyone I make connections easily. But I can’t in a million years be the one breaking the ice.
This is in big part due to being born and raised in a large European capital. There are unwritten barriers you respect as a social rule, and if someone breaks the rule you assume they’re trying to sell something or scam you. To me, talking to a stranger unprompted feels as out of place as pulling my pants down in public.
It’s natural for these barriers to exist to make dense spaces liveable, but they do constrain you.
It's been very good for me. I don't even open claude.ai or use Kagi Assistant, even though I'm paying for it and have access to basically all the models. I interact pretty much exclusively via Claude Code. My recipe question turned into a recipe-tracking project and recommendation engine designed to force me to try making new things that expand my skills. I've also had good luck getting gluten/dairy alternatives for recipes, since that's now a fact of life I have to deal with via my wife.
For product reviews, you've definitely got to make sure it's searching for sources and not just relying on outdated data. Some brands used to be very good and are today just coasting on their reputation. This is where phrases like "research this deeply" help it break out of the baked-in biases.
I’ve just about got this working on iOS using the a-shell app as a terminal to run the git commands, with Shortcuts set to run them when Obsidian opens and closes.
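Roughly what those commands look like, in case it helps anyone (the vault path is hypothetical, and a-shell provides git via its libgit2-based lg2 command, so the exact spelling may differ):

    # hypothetical vault location inside a-shell's sandbox
    cd ~/Documents/vault
    # shortcut on Obsidian open: fetch the latest notes
    git pull --rebase
    # shortcut on Obsidian close: commit and push local edits
    git add -A
    git commit -m "mobile sync"
    git push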