There is no "Bluesky App Store" thing. The Atmosphere integration is an optional feature that adds a social layer on top of the registry, using the AT Protocol.
It is fairly common, yes. Sometimes those compilers (or interpreters) aren't the primary implementation, but it's certainly a thing that happens often.
Most of the Rust compiler is in Rust, that's correct, but it does by default use LLVM to do code generation, which is in C++.
Some pattern matching occurs in the function match-case-to-casequal. This is why it is preceded by a dummy implementation of non-triv-pat-p, a function the pattern-matching logic needs for classifying whether a pattern is trivial or not; it has to be defined so that if-match and the other macros in the following function can expand. The stub just says every pattern is non-trivial, a conservative guess.
non-triv-pat-p is later redefined, and it uses match-case! So the pattern matcher has bootstrapped this function: a fundamental pattern-classification function in the pattern matcher is written using pattern matching. Because the file is staged with the stub initial implementation of that function, this all bootstraps in a single pass.
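The staging trick is easier to see in miniature. Here's a hypothetical sketch in JavaScript rather than Lisp (none of these names come from the actual code, and a real match-case is a macro, simulated here by a function that compiles a pattern into a matcher at "expansion time"):

```javascript
// Stage 1: the stub classifier — conservatively call every pattern non-trivial.
let nonTrivPatP = (pat) => true;

// Stage 2: a tiny stand-in for the match-case macro. It consults the
// classifier once, at "expansion time", and bakes the answer into the
// returned matcher — just as a macro bakes it into the expansion.
function compileMatch(pat) {
  if (!nonTrivPatP(pat)) {
    // trivial pattern (a variable): bind unconditionally
    return (value) => ({ [pat]: value });
  }
  // non-trivial pattern: require an exact literal match
  return (value) => (Object.is(pat, value) ? {} : null);
}

// Stage 3: the real classifier, itself built with the matcher — the
// bootstrapped step. Under the stub, "_" compiles to a *literal* matcher,
// which is still correct: "_" matched against itself succeeds.
const isWildcard = compileMatch("_");
nonTrivPatP = (pat) => isWildcard(pat) === null;

// From here on, compileMatch sees the real classifier: "_" is trivial
// (binds anything), everything else is matched as a literal.
```

The conservative stub is what makes the single pass work: an expansion produced under "everything is non-trivial" is still correct, just less clever, so by the time the real classifier is installed, nothing has gone wrong.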
seems neat but the fact that "almost all" of their tools are built "with the assistance of LLMs" gives me a visceral reaction. i do not want to use your ai generated slop! how is it enjoyable to program in such a way where you do none of the work? where you learn nothing?
I find it incredibly enjoyable to program this way.
Working with LLMs is the difference between typing out this code myself:
async function fetchGitHubApiData() {
  try {
    // First API call to get repository contents
    const contentsResponse = await fetch('https://api.github.com/repos/mdn/browser-compat-data/contents');
    if (!contentsResponse.ok) {
      throw new Error(`HTTP error! status: ${contentsResponse.status}`);
    }
    const contentsData = await contentsResponse.json();

    // Find the object with name: "api"
    const apiObject = contentsData.find(item => item.name === "api");
    if (!apiObject) {
      throw new Error('Could not find object with name "api"');
    }

    // Extract the git URL from _links
    const gitUrl = apiObject._links?.git;
    if (!gitUrl) {
      throw new Error('Git URL not found in _links');
    }

    // Second API call to fetch the git data
    const gitResponse = await fetch(gitUrl);
    if (!gitResponse.ok) {
      throw new Error(`HTTP error! status: ${gitResponse.status}`);
    }
    const gitData = await gitResponse.json();

    // Return the tree array from the git data
    if (!gitData.tree) {
      throw new Error('Tree data not found in git response');
    }
    return gitData.tree;
  } catch (error) {
    console.error('Error fetching GitHub data:', error);
    throw error;
  }
}
Compared to typing this prompt (which probably took less than 30 seconds, I should have timed it):
Write me an async JavaScript function
which starts by doing a fetch() to
https://api.github.com/repos/mdn/browser-compat-data/contents
and then looping through the returned
array of objects looking for the one with
a "name" key set to "api" - on that
object its should look for the _links.git
property which is a URL
It should fetch that URL, which return
JSON with a "tree" key - it should then
return that array to the caller
I'm still doing work: I had to research how the GitHub trees API works, and I have to both review and then actively test the resulting code to confirm it actually works as intended.
I'm still learning things - in this case a quite neat way of structuring an async JavaScript function that makes multiple API calls, plus some error handling patterns I may not have considered.
The thing that's really fun here is that I get to build things I simply wouldn't have built otherwise. I did not want an MDN timeline visualizer enough to spend more than 15 minutes experimenting with the idea (which turned into about an hour when you add the iterations and write-up).
No he is not. Simon has been blogging extensively about developing quick tools with a handful of LLMs, like the ones from OpenAI and Anthropic. This new tool is no exception.
(Of course there is more to it; using LLMs productively requires some skill of its own.)