I understand that the project is in its early stages and the documentation (README) is very much WIP. So, a couple of questions:
1. "SSD backed" - on-the-fly, 100% of the time (like KVRocks), or does it require an explicit "save" like Redis does?
2. (if it is like KVRocks) Do you have any perf numbers for DB sizes that are, say, 10x the RAM size?
A simple benchmark table would be great. May I suggest Ollama on the base machine, Ollama with T1, Ollama with T1+T2, etc., on midsize and big models, comparing tokens/sec?
Niklaus Wirth died in 2024, yet I hope he is having a major I-told-you-so moment about all the people who dismissed Pascal's bounds checking as unneeded and a drag on performance.
My CS college used Turbo Pascal as a teaching language. I had a professor who told us, "don't turn the range and overflow checking off, even when compiling for production." That turned out to be very wise advice, IMHO. Too bad C and C++ compiler/language designers never got that message. So much was lost to save less than a 1% performance gain.
To this day, FPC uses less RAM than any C compiler, a good thing in today's increasingly RAM-starved world, and they've managed this with far fewer developers than any equivalent C compiler has. I can't even imagine what it would look like if they had the same number of people working on it. C optimization tricks are hacks; the fact that Godbolt exists is proof that C was never meant to be optimizable at all. It is brute-force witchcraft.
At a certain point, though, something's gotta give. The compiler can do guesswork, but it should do no more. If you have to add more metadata, then so be it; it's certainly less tedious than putting pragmas and _____ everywhere. Some C code just looks like the writings of an insane person.
> […] C optimization tricks are hacks; the fact that Godbolt exists is proof that C was never meant to be optimizable at all. It is brute-force witchcraft.
> At a certain point, though, something's gotta give. The compiler can do guesswork, but it should do no more. If you have to add more metadata, then so be it; it's certainly less tedious than putting pragmas and _____ everywhere. Some C code just looks like the writings of an insane person.
There is not a single correct or factual statement in the quoted text.
C optimisation is not «hacks» or «witchcraft»; it is built on decades of academic work and formal program analysis: optimisers use data-flow analysis over lattices and fixed points (abstract interpretation) and disciplined intermediate representations such as SSA, and there is academic work on proving that these transformations preserve semantics.
Modern C is also deliberately designed to permit optimisation under the as-if rule, with UB (undefined behaviour) and aliasing rules providing semantic latitude for aggressive transformations. The flip side is non-negotiable: compilers can't «guess» facts they can't prove, and many of the most valuable optimisations require guarantees about aliasing, alignment, loop independence, value ranges, and absence of UB that are often not derivable from arbitrary pointer-heavy C, especially under separate compilation.
That is why constructs such as «restrict», attributes and pragmas exist: they are not insanity, they are explicit semantic promises or cost-model steering that supply information the compiler otherwise must conservatively assume away.
«Metadata instead» is the same trade-off in a different wrapper: you either trust it (changing the contract) or verify it (reintroducing the hard analysis problem).
Godbolt exists because these optimisations are systematic and comparable, not because optimisation is impossible.
Also, directives are not a new, C-specific embarrassment: ALGOL-68 had «pragmats» (the direct ancestor of today's «pragma» terminology), and PL/I had longstanding in-source compiler control directives, so the mechanism predates modern C tooling by decades.
There's a blog post from Google about this topic as well, where they found that inserting bounds checking into standard-library functions (in this case C++) had a mere 0.3% negative performance impact on their services: https://security.googleblog.com/2024/11/retrofitting-spatial...
It works, but try as I might, I cannot fully explain the first 3 symbols.
/*?sr/bin/env finds /usr by expanding *? to the first matching directory. But why not just /*usr/ instead?
Last decades? *wipes a tear* You surely forgot the /s at the end, I hope. The evil incarnation known as a "Samsung fridge" that I have in my kitchen required a repairman's attention just 3 months after purchase, and then every 3 months after that. And child sacrifices (sorry, steam baths) for the ice maker every month or so.
Samsung appliances - never again.
PS. The repairman told me that Samsung had already fixed one of the problems my fridge has by the time he looked at it, a kind of hidden recall and fix. The fridge's version (yes, they have versions) has advanced some 7 iterations since I bought it. That means there were at least 7 serious design/manufacturing problems they had to fix.
I mean, that's based on the assumption that they actually care about delivering a working appliance. As long as the spyware works, they don't really care about the "cooling food" part.
Good point: it's within their audible range, which is roughly 40 Hz to 60 kHz (65 kHz in some dogs). The knife operates at 40 kHz, so it will drive them completely crazy.