> Inside Swift is a slim language waiting to get out... and that slim language is just a safer Objective C.
Rust? Rust is basically a simpler Swift. The Objective-C bindings are really nice too, and when you're working with Obj-C you don't have to worry about lifetimes too much, because you can lean on the Objective-C runtime's reference counting.
I think the way to think about it is that with Rust, it's as if all the goodness in Swift was implemented at the "C" level, and the Objective-C bit is still just a library-level runtime layer on top. Whereas Swift brings its own runtime, which greatly complicates things.
I would absolutely not call Rust a simpler Swift. Swift doesn't have an ownership/borrowing system, explicit lifetimes for objects, or Rust's much more expressive (and therefore complex) macro support...
I get that there's a tradeoff. Rust requires you to be way more explicit about what you're intending upfront and that can, in the long term, lead to simpler code -- but there's no dimension (depth-wise or breadth-wise) that I'd call Rust simpler.
While Swift now has the `borrowing` and `consuming` keywords, support for storing references is nonexistent, and returning or storing `Span`s, etc., is only possible through experimental `@lifetime` annotations.
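For contrast, what those experimental `@lifetime` annotations gesture at is routine in Rust: returning a borrowed view of some data is ordinary code, and the compiler tracks how long the borrow is valid. A minimal sketch (the function here is illustrative, not from the comment above):

```rust
// Returning a borrowed slice is the Rust analogue of returning a Span:
// the (elided) lifetime ties the returned view to `data`, and the
// borrow checker rejects any use of the view after `data` is gone.
fn first_half(data: &[u8]) -> &[u8] {
    &data[..data.len() / 2]
}

fn main() {
    let buf = vec![1u8, 2, 3, 4];
    let half = first_half(&buf);
    assert_eq!(half, &[1, 2]);
    // drop(buf) here would be a compile error while `half` is still live.
    println!("{:?}", half);
}
```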
Swift is a nice language, and its new support for the bare necessity of affine types is a good step forward, but it's not at all comparable with Rust.
Rust is still more complicated than Swift, but you needn't worry: the Swift team is flexing their muscles hard to ensure that Swift becomes the biggest, most complicated language on Earth and wins the crown for complexity, cognitive burden, and snail performance once and for all. Their compiler already times out on the language; soon even an M7 will give up.
One of my recurring language design hot takes is that it's easier to design for speed and then make it easy to use than it is to make it easy to use and then try to speed it up.
It kind of is, when the goal was to be TypeScript for C, before this was even a concept.
Now ideally we would all be using Modula-2, Ada, Delphi, VB, C#, ... and co, but given that even C compilers are nowadays written in C++, we make do with what we have, while avoiding C's flaws as much as possible.
Except the entire design of swift is meant to make everything more automated.
* automated exclusivity with value types and value witness tables, classes as ARC types (i.e. `Arc<Mutex<T>>`)
* automated interop with C/C++/Obj-C through the clang ast importer
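The analogy in the first bullet can be made concrete: what a Swift `class` instance gives you implicitly (reference-counted shared ownership plus runtime-enforced exclusive access), Rust makes you spell out by hand. A rough sketch of the Rust side of that analogy, not a claim about how Swift is actually implemented:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Roughly what a shared, mutable Swift class instance amounts to:
    // Arc for reference counting, Mutex standing in for Swift's
    // runtime exclusivity enforcement.
    let counter = Arc::new(Mutex::new(0u32));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter); // a "retain", in ARC terms
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
    assert_eq!(*counter.lock().unwrap(), 4);
}
```

In Swift all of this is automated away behind the `class` keyword; in Rust every piece of it is visible in the types.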
Maybe they could have started with rust and added on what they needed, but why not build a new language at that point where things so fundamental are involved?
Source: I worked in Lattner's org at the time of Swift's inception (on an unrelated backend), but that was the motivation. I also worked on the Swift compiler for a little bit some years later in my career.
> Maybe they could have started with rust and added on what they needed
Unlikely, I think, because of timelines. Swift's first public release was in June 2014. Rust is a few years older (first public release in January 2012), but that wasn't the Rust we have today. It still had garbage collection, for example (https://en.wikipedia.org/wiki/Rust_(programming_language)#20...)
> Small codebases were always a good thing. With coding agents, there's now a huge advantage to having a codebase small enough that an agent can hold the full thing in context.
It is somewhat ironic that coding agents are notorious for generating much more code than necessary!
I find that these kinds of optimizations are usually more about technical architecture than leetcode. Last time I got speedups this crazy, the biggest win was reducing the number of network/database calls. There were also optimizations around reducing allocations and pulling expensive work out of hot loops. But leetcode interview questions don't tend to cover any of that.
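One of the patterns mentioned above (pulling expensive work out of a hot loop) is easy to show in miniature. The functions and the "expensive" step here are illustrative stand-ins, not code from the comment:

```rust
// Stand-in for an expensive per-call step (imagine a regex compile,
// a parse, or a network round trip instead).
fn normalize(s: &str) -> String {
    s.trim().to_lowercase()
}

// Re-does the expensive work once per record.
fn count_matches_slow(records: &[&str], query: &str) -> usize {
    records
        .iter()
        .filter(|r| r.to_lowercase() == normalize(query))
        .count()
}

// Hoists the expensive work out of the hot loop: same result,
// the cost is paid once instead of N times.
fn count_matches_fast(records: &[&str], query: &str) -> usize {
    let q = normalize(query);
    records.iter().filter(|r| r.to_lowercase() == q).count()
}

fn main() {
    let records = ["Foo", "bar", "FOO"];
    assert_eq!(count_matches_slow(&records, " foo "), 2);
    assert_eq!(count_matches_fast(&records, " foo "), 2);
}
```

The same shape applies to the network/database case: batching N per-row queries into one query is just this hoist applied across a wire.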
They tend to be about the implementation details of specific algorithms and data structures. Whereas the important skill in most real-world scenarios would be to understand the trade-offs between different algorithms and data structures so that you pick an appropriate off-the-shelf implementation to use.
I agree. The "advanced" leetcode is about those last % of optimization. But when network latency is involved in a flow, it is usually the most obvious low hanging fruit.
There's a long-term economic problem looming around the loss of jobs: which is that most people's ability to command a share of our economic output (i.e. earn money) is tied to their value as a labourer. If that labour is no longer needed by those who control capital and thus allocation of labour resources (which is increasingly the case across many segments of our economy), then we end up with an economy where people increasingly struggle to earn a decent living.
Of course there are areas where that labour would be useful: healthcare, teaching, childcare, and elderly care all come to mind (and there are many other examples). But our economy is not set up to enable this. The problem isn't supply side (difficulty retraining people to do the jobs), it's demand side: the people who need these services often don't have the money to pay for them. So the jobs are badly paid.
And it's a downward spiral: as wealth becomes more concentrated, demand for labour drops because those controlling the wealth already have their needs met and often don't care about the needs of others.
If history is anything to go by, then this will eventually lead to war and/or revolution.