Hacker News | the_benno's comments

The best source is usually authors' websites. You can find a free copy of just about any modern CS paper by googling its title in quotes with "filetype:pdf".


You can argue that it's growing and/or promising, but there is absolutely no way that the actual research funding coming out of blockchain-land can hold a candle to the big formal methods labs at major tech firms or traditional academic research funding.


I'm not sure, to be honest, because I don't know how much comes from major tech firms. But I do know someone who financed their own research center, and I have seen probably billions in the last year or two, so it feels like a sizeable amount compared to standard academic funding for fairly niche stuff.

And if you look at it from a technical perspective where your options are TLA+, Coq, Agda, Isabelle, etc, I think you’ll find most of the core devs are involved in blockchain somehow nowadays. At least that’s what it feels like.


Which of the Isabelle core devs are involved in blockchain?


None of them -- Isabelle is developed at University of Cambridge and TU Munich by academics. The parent commenter seems to have some mistaken notions and assumptions about formal methods and PL research.


Yes, but a lot of academics moonlight in blockchain as consultants on their tools. Some are more public than others, such as PL researchers like Philip Wadler, but just because they're a professor at a university doesn't mean they're not also working on external startups.


That's why I was asking; I thought maybe you knew more about it than those vague pointers. Lean taking in 20 million from Cardano was pretty public, after all.


Anders Moeller and Michael Schwartzbach's book [1] on static program analysis is a fantastic resource, with (I think) a great balance of theory and practice. If you want to get really deep into the theory of program analysis, Patrick Cousot just published an incredibly thorough book on abstract interpretation (I just got my copy this week, so I haven't explored it enough to have much of an opinion on it as a pedagogical resource).

[1] cs.au.dk/~amoeller/spa


Thanks, yes I've read the Moeller book and I agree with your evaluation. I wasn't aware of the Cousot book though, I'll be sure to check it out.


No, I wouldn't say that category theory is a prerequisite for domain theory in general.

I'm most familiar with domain theory from Winskel's "Formal Semantics of Programming Languages" (great book, cited as Win93 in the linked PDF), which has a whole chapter on the subject with no reference to category theory.

If you get deeper into domain theory (e.g. by working through this book) surely some category theory will be useful, but it's not a foundational prereq.
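For a flavor of the kind of result domain theory gives you without any categorical machinery: Kleene's fixpoint theorem says that iterating a suitable function from the bottom element reaches its least fixed point. A toy sketch (in Python, with a finite lattice standing in for the domain; the names here are illustrative, not from Winskel):

```python
# Toy illustration of Kleene fixpoint iteration: on a finite lattice,
# iterating a monotone function from the bottom element reaches its
# least fixed point. Here elements are frozensets of (input, output)
# pairs, ordered by inclusion, with bottom = the empty set.
def lfp(f, bottom=frozenset()):
    """Least fixed point of a monotone f, by Kleene iteration."""
    x = bottom
    while f(x) != x:
        x = f(x)
    return x

# Example: the recursive definition
#   fact(n) = 1 if n == 0 else n * fact(n - 1), restricted to n in 0..3,
# viewed as the least fixed point of one "unfolding" step on graphs
# (sets of (input, output) pairs) of partial functions.
def step(g):
    d = dict(g)
    out = {(0, 1)}
    for n in range(1, 4):
        if n - 1 in d:
            out.add((n, n * d[n - 1]))
    return frozenset(out)

print(sorted(lfp(step)))  # [(0, 1), (1, 1), (2, 2), (3, 6)]
```

This is essentially the least-fixed-point construction used to give meaning to recursive definitions in denotational semantics, and none of it mentions a category.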


What is this gobbledygook? These technical terms have meanings that you clearly either (a) do not understand or (b) are abusing to trick less-educated folks into thinking your political opinions are authoritative and intellectual or whatever.

To an audience that _does_ speak theoretical CS, your sprinkling in of its terminology weakens whatever point you're trying to make and comes off instead as silly and pretentious.


Hey... I'm not tricking anyone. I used a term that was related to work I was currently doing. It's not the best term, you're right. But my meaning was quite clear.

If you base UBI on cost of living, and cost of living depends on UBI, then you find yourself in a feedback loop. My use of "Turing complete" was not really accurate, but it was the context my mind was in when writing this comment.

I make mistakes and sometimes speak unclearly. No need to be pedantic.


Your meaning was indeed clear but I find your reasoning flawed and the cargo-cult intellectualism... silly and pretentious.

The main flaw, to me, is the 1:1 correspondence you draw between the magnitude of UBI-style benefits and the cost of living. That might hold (at least under some simplified/idealized econ-101 assumptions) if UBI were the sole source of money in the economy, but that's obviously not the case. The crux of the "UBI doesn't cause spiraling inflation" argument is that UBI accounts for a sufficiently small fraction of total incomes that its benefit (greater spending power for lower incomes, and the downstream economic benefits of that spending) outweighs its nonzero but noncrippling inflationary effect.

That is, though the $10 cash benefit you describe may of course increase COL, it does so by some amount between $0 and $10 that is influenced by all sorts of factors you're glossing over.
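To make that feedback loop concrete under a deliberately crude assumption: suppose each extra dollar of benefit raises COL by a fixed pass-through factor p, and the benefit is then fully topped up to match the COL rise. The adjustments shrink geometrically whenever p < 1, and the loop only spirals at p = 1 (the 1:1 correspondence above). A toy simulation, with every number hypothetical:

```python
# Toy model of a COL-indexed cash benefit with a fixed pass-through
# factor p: each extra dollar paid out raises cost of living by p
# dollars, and the benefit is then topped up by that COL rise. With
# p < 1 the adjustments shrink geometrically, converging to
#   total benefit  = 10 / (1 - p)
#   total COL rise = 10 * p / (1 - p)
def simulate(p, initial_benefit=10.0, rounds=10_000, tol=1e-12):
    benefit = initial_benefit   # total benefit paid out so far
    col_increase = 0.0          # total rise in cost of living
    delta = initial_benefit     # latest top-up to the benefit
    for _ in range(rounds):
        if delta < tol:
            break
        rise = p * delta        # COL response to the latest top-up
        col_increase += rise
        delta = rise            # indexing: benefit topped up by the rise
        benefit += delta
    return benefit, col_increase

benefit, col = simulate(p=0.5)
print(round(benefit, 6), round(col, 6))  # 20.0 10.0
```

With p = 0.5, a $10 benefit ends up costing $20 in payouts but only $10 in COL increase, so net spending power still rises; the loop diverges only at full 1:1 pass-through.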


I dunno what cargo-cult intellectualism is. Contrary to what it may seem, I rarely read economics blogs / political philosophy, etc.

It just seems obvious to me that if you increase everyone's income by $10, the price of goods will go up by $10.

> The crux of the "UBI doesn't cause spiraling inflation" argument is that UBI accounts for a sufficiently small fraction of total incomes that its benefit (greater spending power for lower incomes, and the downstream economic benefits of that spending) outweighs its nonzero but noncrippling inflationary effect.

How could you have greater spending power if we agree that UBI's inflationary effect is proportional to its amount? The moment you give the money out, the spending power goes down.

In my view, this will just lead to the rich getting richer, because companies end up providing the services people will pay money for, and these profits go into the pockets of the rich, who don't really need the money for life necessities, so instead use it as capital.

The poor are no better off and are incentivized to spend the money because of inflationary effects (saving the money results in it having less spending power when they want to spend it), while the rich have no incentive to spend the money since they're already wealthy.

> That is, though the $10 cash benefit you describe may of course increase COL, it does so by some amount between $0 and $10 that is influenced by all sorts of factors you're glossing over.

Can you identify, name, and explain those factors? You accuse me of 'glossing over' some factors in my 'simple' explanation, while you yourself do not identify any of the factors that would supposedly keep COL from increasing.


Sure: literally everything past the first chapter of any microeconomics textbook. Come on, your position is "it just seems obvious", so I'll just say that it _is_ obvious that economics -- a social science full of interconnected and confounding factors -- is not so simple as "if everyone has 10 more dollars then everyone's cost of living goes up by exactly 10 dollars."


Again, why don't you share your wisdom?


My "wisdom" is that microeconomics is complicated... Hardly a controversial statement. I was being serious about a microecon text, but wikipedia would work just fine too.

I'm not on here to argue with people, and it seems you have some sort of ideological bone to pick, so this is going to be my last reply. All the best.


Though it is internally-built, Hack is already open source at https://github.com/facebook/hhvm/.

FB uses a pretty wide array of languages internally -- I don't know if they release statistics publicly, but you can filter/search their open-source projects by language at https://opensource.fb.com/projects/#filter.


I believe Hack is also used by a number of other companies, the most prominent being Slack [1]. FB probably uses almost every language in at least some context: Hack, C++, Python, Haskell, OCaml, Rust, JS, Obj-C, Swift, Java, Kotlin are all ones that I’ve heard referenced by friends who work there, though that’s likely nowhere near exhaustive.

[1] https://slack.engineering/hacklang-at-slack-a-better-php/


And Erlang (mainly in WhatsApp)


Because it's not a zero-sum game? Those union-busting talking points keep you focused on your less-fortunate colleagues and not the obscenely wealthy guy screwing the both of you over.


But if I turn over to a union, there is little assurance I will not take some pay cut. I doubt people making so much are at the top of the bargaining priority list; rank-and-file workers are probably more their focus. And then I have turned over my rights to the majority, so it's put up with what my coworkers want or leave. It is similar with an employer (I must put up with what it wants or leave), but I already know those terms are okay with my employer. Since I am not in the majority bracket of workers, this probably is not good for me.


This is not how unions work. (source: am union member)

Your claims and arguments are against some imagined shitty union-like thing that does not much resemble an actual union.


For what it's worth, here are also [1] the Framework community forum thread about Arch and [2] the Arch wiki page about Framework in case you're interested.

I'm still waiting for mine to arrive (in the next batch) but I plan to install Manjaro when it does, and am cautiously optimistic that it'll be mostly painless.

[1] https://community.frame.work/t/arch-linux-on-the-framework-l... [2] https://wiki.archlinux.org/title/Framework_Laptop


Wow, amazed by how short that Arch/Framework wiki page is! I went there expecting a long scrollable entry, but there's actually very few issues.


F# is an ML-family language, and ML stands for meta-language -- it was designed for language tools and compilers! Andrew Appel's Modern Compiler Implementation in ML is a great resource for code examples that aren't exactly F#, but will look damn close. The book has also been published in C and Java versions, if you want side-by-side comparisons.


ML was designed to be the metalanguage for the LCF proof system, used to write proof tactics. It was then recognized to be interesting in its own right as a programming language.


Oh, thanks for the correction and apologies for the inadvertent misinfo.

Your comment sent me reading the HOPL paper about SML -- I think I (based on folklore knowledge/informal conversations) conflated the original development of ML with the subsequent development and standardization around SML. All of this was well before my time, so it's quite interesting to read about some of the details.


They actually dropped support for diffing and incremental parsing a few months back, when they changed up some internal structures to what they call "precise AST datatypes" (which made diffing quite tricky in the type system).

My impression (from the outside looking in) is that there weren't many people making use of those capabilities, so it wasn't a high priority feature. A bit of a shame, though -- it's a really cool tool and I enjoyed building some things on top of it. I switched to a different front-end, though, when I saw that they had removed those features and all mention of incrementality/differencing from the documentation.

I'd be happy to be wrong about this, if anyone has more up-to-date information about semantic!

