While I get the butterfly keyboard hate (though mine is so far still perfectly fine), the USB-C ports were amazing. I have a 2016 MBP and that thing still cooks. As somebody who worked in video production, those ports were a godsend. No more waiting around for footage to transfer all the damn time. Complete game changer. Plus, with one or two quality docks I could plug in literally anything I ever needed. With the AMD GPU I could also edit pretty beefy 4K with no proxies most of the time. In 2016/2017 that was pretty awesome. Plus it was the last good Intel machine they made IMO, so good compatibility with lots of software, Target Display Mode for old iMacs, Windows if I wanted it, etc.
Probably my favorite laptop I’ve ever owned. Powerful machine, still sees work, runs great.
It introduced USB-C before it was ubiquitous even on smartphones, at least in my area. All the peripherals still needed a dongle; it was the dongle era. The keyboard was okay to type on once I got used to the short travel, but the keycaps easily broke off, and dust would get in and the keys wouldn't register. Also, the whole laptop would get very hot, at least the 13" Pro without the Touch Bar. I prefer the older 2015 model, before the butterfly keyboard; that's the one I had at work but had to give up, and I regret waiting for the new models instead of buying the same one.
Like I said, totally get the keyboard hate. Mine just turned out perfectly fine.
People hated the dongles, but again, I could hook up everything: dozens of connections with throughput I could never get before. It was fantastic for my needs and still is!
Or you could maybe learn how to use the OS; in Linux lingo, RTFM. I don't want to be rude, but the critique was very flippant, the arguments vague, all about expectations based on years using a different OS; it doesn't seem like you want to give it a fair chance.
I gave both generalized and highly specific cases where I felt the UX failed. I referenced principles of UX as well as literal "here is what my experience was in a concrete story".
> all about expectations based on years using a different OS
No? I mean, again, funny. I explained how I've been using macOS for years. Actually a decade, now that I count it out.
Plenty of people use an OS for years without really learning it. And you admitted to spending time in the terminal, which indicates a lack of will to try to learn macOS shortcuts, gestures, the windowing model, Spaces, and so on. And the comment used sweeping generalizations without referring to any specific principle broken that isn't just a personal dislike or unfamiliarity with a different way of doing things.
> I gave both generalized and highly specific cases where I felt the UX failed.
No guidelines named, no principles defined. No comparison standard is established.
The earlier fullscreen story is a specific case, maybe a discoverability argument, but not evidence that the UX violates every principle. macOS Spaces and fullscreen apps follow a workspace concept; it's not a window-resize mode.
> Asymmetric user experiences
What's asymmetric is not the command but the spatial context; the claim that this violates a principle is arguable.
> Heavily reliant on gestures
Not sure which guidelines this breaks, but every gesture has a keyboard-shortcut alternative, and there's a Mission Control key, the menu bar, and the Dock.
> And you admitted to spending time in the terminal, which indicates a lack of will to try to learn macOS shortcuts, gestures, the windowing model, Spaces, and so on.
It indicates no such thing, other than that my preferred UX on a Mac has landed on the terminal. It doesn't indicate at all that I never tried to learn, or that I haven't learned, unless you presuppose that learning necessitates using the computer in a specific way.
Indeed, I have learned quite a lot of the various gestures, Spaces, etc., unsurprisingly. I avoid them because they suck, and the learning experience was shit.
> And the comment used sweeping generalizations without referring to any specific principle broken that isn't just a personal dislike or unfamiliarity with a different way of doing things.
All design principles are going to boil down to personal dislikes, lol. But no, nothing was "unfamiliarity"; you can stop saying that, thanks.
> No guidelines named, no principles defined. No comparison standard is established.
I could cite guidelines if you think it would help. Microsoft released a UX guideline years ago justifying why magic corners, etc., are a bad idea. Of course, they obviously don't follow that guide these days. What would you like?
I'm not interested in debating this. I'm perfectly fine with how I've expressed myself; I'm just not motivated enough this late on a Friday to get more detailed. So you'll have to try to decipher what I've said and see if there's value in it for you, or reject it, which I think is your prerogative.
Right now you can just tell Claude to generate an ASCII diagram, or even SVG. I did that a few days ago when I wanted to share a flow diagram of one particular flow in our app.
Can this be applied to camera shutter/motion blur? At slow shutter speeds, the slight shake of the camera produces this type of blur. This is usually addressed with IBIS, which stabilizes the sensor.
The ability to reverse it is very dependent on the transformation being well known; in this case it is deterministic and known with certainty.
Any algorithm to reverse motion blur will depend on the translation and rotation of the camera in physical space, and the best the algorithm could do will be limited by the uncertainty in estimating those values.
If you apply a fake motion blur like in Photoshop or After Effects, then that could probably be reversed pretty well.
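To illustrate, here's a minimal Python sketch (just the textbook approach, not anything from the article) of undoing a known, synthetic linear motion blur with Wiener deconvolution. The kernel shape, image size, and regularization constant are made-up illustration values:

```python
import numpy as np

def motion_psf(length, angle_deg, size=32):
    """Toy linear motion-blur kernel (point spread function)."""
    psf = np.zeros((size, size))
    center = size // 2
    theta = np.deg2rad(angle_deg)
    for t in np.linspace(-length / 2, length / 2, num=4 * length):
        y = int(round(center + t * np.sin(theta)))
        x = int(round(center + t * np.cos(theta)))
        if 0 <= y < size and 0 <= x < size:
            psf[y, x] += 1.0
    return psf / psf.sum()

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Invert a known blur in the frequency domain; k keeps frequencies
    the blur nearly destroyed from blowing up the noise."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = G * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F_hat))

# Demo: blur a stand-in "photo" with a known kernel, then reverse it.
rng = np.random.default_rng(0)
image = rng.random((256, 256))                 # pretend grayscale image
psf = motion_psf(length=15, angle_deg=30)
H = np.fft.fft2(psf, s=image.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * H))
restored = wiener_deconvolve(blurred, psf)
# restored is close to image, except at frequencies the blur zeroed out
```

With a real handheld shot the kernel isn't handed to you, which is exactly the estimation problem the other replies describe.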
> and the best the algorithm could do will be limited by the uncertainty in estimating those values
That's relatively easy if you're assuming simple translation and rotation (simple camera movement), as opposed to a squiggle movement or something (e.g. from vibration or being knocked), because you can simply detect how much sharper the image gets and home in on the right values, as in the sketch below.
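A crude sketch of that "home in on the right values" idea, reusing the hypothetical motion_psf and wiener_deconvolve helpers from the sketch above: grid-search candidate blur lengths and angles, deconvolve with each, and keep whichever result scores sharpest. The variance-of-Laplacian score and the search ranges are just illustrative; real blind-deblurring methods use more careful quality metrics, since naive sharpness scores can be fooled by ringing artifacts.

```python
import numpy as np

def sharpness(img):
    """Variance of a discrete Laplacian: a rough proxy for perceived sharpness."""
    lap = (-4 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()

def estimate_blur_params(blurred,
                         lengths=range(5, 31, 5),
                         angles=range(0, 180, 15)):
    """Brute-force search over candidate linear-blur parameters,
    keeping whichever deconvolution looks sharpest."""
    best_params, best_score = None, -np.inf
    for length in lengths:
        for angle in angles:
            candidate = wiener_deconvolve(blurred, motion_psf(length, angle))
            score = sharpness(candidate)
            if score > best_score:
                best_params, best_score = (length, angle), score
    return best_params
```

That's only a few dozen deconvolutions for a small image; for an arbitrary squiggly camera path the search space explodes, which is where IMU data or learned priors come in.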
I recall a paper from many years ago (early 2010s) describing methods to estimate the camera motion and remove motion blur from blurry image contents only. I think they used a quality metric on the resulting “unblurred” image as a loss function for learning the effective motion estimate. This was before deep learning took off; certainly today’s image models could do much better at assessing the quality of the unblurred image than a hand-crafted metric.
I wonder if the "night mode" on newer phone cameras is doing something similar. Take a long exposure, use the IMU to produce a kernel that tidies up the image post facto. The night mode on my S24 actually produces some fuzzy, noisy artifacts that aren't terribly different from the artifacts in the OP's deblurs.
The missing piece of the puzzle is how to determine the blur kernel from the blurry image. There's a whole body of literature on that; it's called blind deblurring.
Oof, I hope not. I wonder if the architecture for GPU filters migrated, and this feature didn't get enough usage to warrant being rewritten from scratch?
It can never be user-friendly enough if how Windows does things is the yardstick. Windows users bemoan how terrible Macs are all the time just because things are done differently, and they don't even try to figure it out. If it doesn't work like Windows, it's not good enough.
If you install a distro that uses KDE Plasma, you're already most of the way there. Not just because of its design similarities to Windows, but because it's the desktop environment that's been getting the most financial support lately and has seen the most rapid improvement.
I personally prefer it at this point. Dolphin blows away Explorer, window management is slicker and more flexible out of the box, and it also happens to be deeply customizable.
The ASUS laptop I bought has a litany of issues: blue screens, audio dying, not waking from sleep. MSFT, Nvidia, and ASUS all blame each other.
I have a feeling modern Linux on this machine wouldn't be worse than what it shipped with. The days of fighting for 3 days with audio or printer drivers after an install are mostly behind us.
If Linux isn't uploading their FDE keys to Microsoft servers by default, Windows users will get scared and start crying. Needless to say, their tastes and desires should never be entertained.