This is nice. Sounds like it wouldn't solve the slow animation when entering or leaving full screen mode though. I'm fed up enough with macOS's poor window management (among many other things) that I'm looking for MacBook alternatives.
The M5 chip is way ahead of Intel's latest, even Panther Lake. But the Snapdragon X2 Elite looks like a viable alternative. It's the only competitor with comparable single core performance, and it comes with 48 GB of extremely fast RAM for a reasonable price with great battery life. Unfortunately Linux support isn't really there yet, but hey M5 MacBooks don't support Linux well either.
If you're using a Firefox-based browser, the slow fullscreen animation for media can be fixed by setting the full-screen-api.macos-native-full-screen flag to false in about:config
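If you'd rather make that change persistent across profiles resets, the same pref can be set in a user.js file in your Firefox profile directory (a minimal sketch; the comment describes the usual effect of this pref):

```javascript
// user.js — place in your Firefox profile directory.
// Disables macOS's native (slow) fullscreen transition for the
// Fullscreen API; Firefox uses its own instant fullscreen instead.
user_pref("full-screen-api.macos-native-full-screen", false);
```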
> your options are basically having a GLES renderer that you can restrict to WebGL2 (so no compute shaders, etc. and other things that make desktop OpenGL acceptable for writing a modern renderer) or having to abstract over Vulkan/WebGPU yourself
I don't understand this complaint. What's worse about using WebGPU over using GLES? Seems like a strict improvement. You can use WebGPU anywhere, you're not required to "abstract" over Vulkan. If you're talking about using it outside of the web, you just choose wgpu or Dawn as your implementation, it's the same API and even the same implementation as you'd get in a browser.
I don't believe anyone claiming that Satoshi is still alive. There is zero chance any human who put so much effort into creating something would remain silent while it became a $2 trillion phenomenon that succeeded beyond their wildest dreams. Satoshi is certainly dead.
I once became so famous that a community of several hundred people knew and recognized my name for a few years. At the time, it was very ego-flattering, and I was delighted to have done something that had such a big and positive impact. However, as an experience it really did not agree with me, and even this very minor level of fame has left me resolved to never, ever, ever become that famous again if I can help it.
I don't think I am unique in that. In fact, I perceive that it is very normal for public figures, not merely to fade from public attention, but to actively seek out seclusion.
While I'm not Satoshi, I would put the odds of someone in such a position maintaining radio silence far from "zero chance". I would put it more around 70 or 80 percent. And at any rate, it is certainly what I would do.
This "zero chance" argument is ridiculous: it completely discounts the possibility (or in all likelihood, probability) that the creator may be compelled to stay silent, be in jail, etc.
Adam Back is the well-off CEO of a company in the blockchain space. From that position, he gets to continue to use his expertise in the field with plenty of connections while having more than enough money without needing to risk revealing himself as Satoshi or risk de-stabilizing Bitcoin's value by using Satoshi's known wallets. It seems like the best possible outcome for someone in Satoshi's position.
I'll at least agree that I don't think any other living candidates for Satoshi make any sense. I can't believe someone who started a brand new influential field of study could fully exit from it while fully avoiding the proceeds from it, as would be necessary to believe in any other living candidate.
If I were to invent something like bitcoin, I would use your exact logic to decide to burn the keys. I couldn't trust myself, so I would remove the possibility of agonizing over it. Obviously, I still might feel regret, but I'd choose the potential regret over the potential agony.
Hell, even if I didn't burn the keys initially, I might do it as I observed it starting to take off. I'd be more attached to the idea and its success than to the idea of being filthy rich (and at risk of jail, extortion, and murder). It would feel like a giant middle finger to the parts of the system I disliked.
My theory is that Satoshi is a persona created by Adam Back and Hal Finney.
They probably devised a scheme where both needed to agree and sign for Satoshi to act. This also allowed each of them to say "I'm not Satoshi Nakamoto".
They also probably ensured that anything belonging to Satoshi required both of them. The death of Hal Finney ensured that Satoshi died definitively.
But they may have "killed" him earlier by burning the keys, because when Bitcoin started to become a success, they probably anticipated the need to "kill" Satoshi (few remember, but Bitcoin passing $1 was considered a crazy bubble at the time! Some became millionaires and exited when BTC hit the $30 bubble. Satoshi's stack was already closely watched; a bright mind of that era would have anticipated the need to kill it). Or maybe "Satoshi" was simply no longer needed, or they accidentally deleted some keys.
I'd like to think that if I'd come up with something like this, I'd have quickly gone "oh shit" and realised it'd be hard to access the earliest coins without raising unwanted attention, and started mining with multiple different keys, and actively moved those coins around. If Satoshi is still around, I'd expect he has more than enough money without the need to risk the upheaval touching those earliest keys would cause.
This may be the most convincing theory I've heard.
I don't believe any live human being has the wherewithal to not use any of the $100B+ in the Satoshi wallets, which has led me to believe it was Hal Finney. Back and Finney both being in on it would explain some of the email timing as well.
If Satoshi has at least half a brain (which we know he has), he knows that moving even one coin from this historical stack could crash the whole Bitcoin economy. So, no, he doesn’t have trillions of dollars.
(It is so basic, I’m still confused by people asking how could he not sell those coins. Because the answer is simple: to whom?)
> would remain silent while it became a $2 trillion phenomenon
I can see how it might be preferable. Satoshi has an incredible amount of wealth in a form that’s very easy to transfer anonymously. Anyone that admits to being him will be a huge target.
It's a decent model if the benchmarks are to be believed, but it won't be close to Opus in usefulness for programming. None of these benchmarks completely capture what makes a model useful for day-to-day coding tasks, unfortunately. It will take time for them to catch up, and Opus will keep improving in the meantime. But it's good to have more competition.
Benchmarks miss the thing that actually matters for agentic use: how does behavior change over a multi-day horizon? A model that scores well on one-shot coding tasks can still make terrible decisions when it has persistent state and resource constraints. That's where you see the real gaps between models.
The price is 5x Opus: "Claude Mythos Preview will be available to [Project Glasswing] participants at $25/$125 per million input/output tokens", however "We do not plan to make Claude Mythos Preview generally available".
It's so ridiculous that Google made a custom SoC for their phones, touting its AI performance, even calling it Tensor, and Apple is still faster at running Google's own model.
Google really ought to shut down their phone chip team. Literally every chip from them has been a disappointment. As much as I hate to say it, sticking with Qualcomm would have been the right choice.
If this Gemma tokenizer I found online is accurate, my Pixel 10 Pro XL is getting ~22 tok/s on Gemma 4 E2B using the NPU, vs. the ~40 tok/s people are reporting for the MLX version on iPhone.
Actually I found official performance numbers from Google saying iPhone gets 56 tok/s and Qualcomm gets 52. They don't even bother listing Tensor in their table. Maybe because it would be too embarrassing. Ouch! https://ai.google.dev/edge/litert-lm/overview
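For anyone checking these figures themselves: a tok/s number is just generated tokens divided by wall-clock decode time. A minimal sketch in Python (the numbers and the generate callable are illustrative, not a real benchmark):

```python
import time

def tokens_per_second(generate, prompt):
    """Rough decode-throughput measurement: number of generated
    tokens divided by wall-clock time for the generate() call."""
    start = time.perf_counter()
    tokens = generate(prompt)  # any callable returning a token list
    return len(tokens) / (time.perf_counter() - start)

# Illustrative arithmetic only: a phone emitting 220 tokens over
# 10 s of decode would score 220 / 10 = 22 tok/s.
print(220 / 10)  # 22.0
```

Note that prompt-processing (prefill) speed is usually reported separately from decode speed, so make sure you're comparing the same metric across devices.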
My problem isn't that these people exist in the world. My problem is they're increasingly drowning out other voices in a community I'm part of. I would prefer significantly more active moderation against politics and general non-technical negativity on this site.