I get that some people prefer that, but I've been on macOS daily for 10+ years and still can't find the top-level OS paradigms (window switching, over-reliance on the mouse/extremely poor keyboard support, lack of proper "maximized" windows) anything but painful.
It might be that growing up with Windows ingrained certain gestures in my mind that I can't let go of, but I feel like I've decidedly given it a shot. And I still hate it.
Yes, there are workarounds, but they're workarounds that require third party applications. I expected more from the OS.
Unless you live in Windows ONLY, start with wgpu (a Rust library that implements WebGPU). It'll work with Metal/Vulkan/Web.
WebGPU is very similar to Vulkan, nearly 1:1 in its concepts. The newer native APIs (Metal, DirectX 12) share much of the same design, in fact, so learning WebGPU will leave you extremely well prepared for any of them.
WebGPU is better in many dimensions, but also worse in others that people might care about. More to the point, in this case, WebGPU is in many ways less approachable for novices.
One of the expected advantages of WebGPU is that it exposes the inner workings of the hardware's GPU support in a more explicit manner. This leads to really verbose code. It follows the same API direction already taken by DirectX 12, Vulkan, and Metal.
WebGL was never exactly easy to read either, but you could get started more easily. It tried to be something in-between, though, and ended up never being the best at either exposing explicit behavior or being beginner-friendly.
The idea of WebGPU going forward is that it will make it easier for other libraries to build on it in a more deterministic fashion. If you know how the metal works, you'll be able to optimize the sh*t out of it. But as a novice, you'll end up using some library or engine that abstracts away a lot of the hardcore functionality like "let's get an adapter that supports this specific texture extension" or "let's reuse a bind group layout to save 2ns when binding uniforms".

In the same way most people on the web use Three.js (instead of raw WebGL), you'll see libraries like Babylon or Three.js making use of WebGPU without you really having to know about all that stuff. And that's the right solution.
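For a taste of that verbosity: here's roughly the shape of a bind group layout descriptor, the plain object you'd hand to device.createBindGroupLayout() in the browser API. This is a hedged sketch, not a complete pipeline; the GPUShaderStage bit flags are shimmed so the object can be built and inspected without a browser or GPU.

```javascript
// WebGPU makes you declare, up front, every resource a shader will touch:
// which binding slot it occupies, which shader stages can see it, and what
// kind of resource it is. The descriptors themselves are plain JS objects.

// Shim for the browser-global GPUShaderStage bit flags.
const GPUShaderStage = { VERTEX: 0x1, FRAGMENT: 0x2, COMPUTE: 0x4 };

// One uniform buffer and one sampled texture, both fragment-stage only.
const bindGroupLayoutDescriptor = {
  entries: [
    {
      binding: 0,                       // matches @group(0) @binding(0) in WGSL
      visibility: GPUShaderStage.FRAGMENT,
      buffer: { type: "uniform" },
    },
    {
      binding: 1,                       // matches @group(0) @binding(1) in WGSL
      visibility: GPUShaderStage.FRAGMENT,
      texture: { sampleType: "float" },
    },
  ],
};

// In a browser you'd then do (roughly):
//   const layout = device.createBindGroupLayout(bindGroupLayoutDescriptor);
// and reuse `layout` across pipelines and bind groups instead of recreating
// it -- the kind of micro-optimization the explicit API makes possible.
console.log(bindGroupLayoutDescriptor.entries.length); // 2 declared bindings
```

And that's before you've written the pipeline descriptor, the shader module, or a single draw call, which is exactly why most people will want an engine on top.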
Anecdotally, I had the opposite experience. I've wanted to dabble in parallel/GPU programming for a while, but the fact that all the material forced me to care about triangles and matrix transforms before getting to any nontrivial examples turned me off.
I've recently been playing with WebGPU, and while it's still a bit of a boilerplate nightmare (not that I'd presume to know how to do it better), it was far more approachable.
Wrapping my head around buffers and layouts took a while, and the fact that WGSL is a huge pain in the neck to debug didn't help.
I've built myself a really rudimentary Perlin noise generator using compute shaders, and managed to pipe that into a rendering shader that uses two triangles to render part of the noise field onto a canvas really smoothly.
Trying to do some fancier compute stuff now, and overall the primitives are relatively straightforward. It's just that the documentation around binding types and layouts, resource limits, and command-encoder/pipeline operational semantics is poor right now.
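For what it's worth, the noise math itself is only a few lines. Here's a hedged CPU sketch of 2D value noise (a simpler cousin of Perlin noise with the same lattice-hash-and-interpolate structure, not the actual WGSL shader): each lattice corner gets a deterministic pseudo-random value, and you blend the four corners around a point with a smoothstep fade. The hash constants are arbitrary large odd numbers, not anything canonical.

```javascript
// Deterministic hash of integer lattice coordinates to a value in [0, 1).
// Math.imul keeps the multiplications in 32-bit integer land.
function hash2(x, y) {
  let h = (Math.imul(x, 374761393) + Math.imul(y, 668265263)) | 0;
  h = Math.imul(h ^ (h >>> 13), 1274126177);
  return ((h ^ (h >>> 16)) >>> 0) / 2 ** 32;
}

// Smoothstep fade curve, as used in classic noise implementations.
const fade = (t) => t * t * (3 - 2 * t);

// Bilinear interpolation of the four lattice corners around (x, y).
function valueNoise(x, y) {
  const xi = Math.floor(x), yi = Math.floor(y);
  const xf = x - xi, yf = y - yi;
  const top =
    hash2(xi, yi) + fade(xf) * (hash2(xi + 1, yi) - hash2(xi, yi));
  const bottom =
    hash2(xi, yi + 1) + fade(xf) * (hash2(xi + 1, yi + 1) - hash2(xi, yi + 1));
  return top + fade(yf) * (bottom - top); // stays in [0, 1)
}
```

In the compute-shader version, each invocation in the dispatch evaluates something like valueNoise() at its own global_invocation_id and writes the result into a storage buffer; the two-triangle render pass then just samples that buffer.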
The number of people on this site who can't accept that half of the transactions were fraudulent is astounding.
If you're in the space, going "well actually" in every article critical of blockchain-based finance doesn't help your mission at all. Ultimately you're just making up excuses for the toxic parts of the system.
No, people making millions out of misinformation is the problem. They have learned to appeal to primitive instincts and are making their fortunes out of that.
Attempts at curtailing the spread of dangerous misinformation wouldn't happen without the dangerous misinformation first.
Rather than being mad at "people calling for censorship", maybe you should be mad at the people making a profit off deaths, or at the general anti-science, anti-education culture that consumes and amplifies it.
Going by that definition, these guys are the ones that were attempting to reuse a name that already belonged to someone else. They can't have their cake and eat it too.