no conditions! what, you didn't know your content would be free for LITERALLY EVERYONE when you made your account? and that you can't delete your comments? and that it's a free-for-all? well, that's on you buddy, not HN, definitely not HN in any way
>It's less proof of work and just annoying to users, and feel good to whoever added it to their site,
this is disproved by the article posted:
>And so Anubis was enabled in the tar pit at difficulty 1 (lowest setting) when requests were pouring in 24/7. Before it was enabled, it was getting several hundred-thousand requests each day. As soon as Anubis became active in there, it decreased to about 11 requests after 24 hours, most just from curious humans.
apparently it does more than annoy users and make the site owner feel good (well, i suppose effective bot blocking would make the site owner feel quite good)
> After 2 minutes at 150 kHashes on mobile, I finally see the first pixel of the progress bar filling up. Seems like it will take hours or a day to finish. Some estimate would have been nice.
Literally the grandparent of the comment chain you're responding to.
Yes, Anubis is just non-standard and obscure; the proof-of-work bit is completely irrelevant (except for getting people on their phones to not visit your website).
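For context on what "difficulty 1" means here: Anubis-style challenges are hash-based proof of work, where the browser brute-forces a nonce until a hash of the server's challenge meets a difficulty target. Here is a minimal sketch in Python, assuming the common scheme where difficulty counts leading zero hex digits of a SHA-256 digest (Anubis's exact parameters may differ):

    import hashlib
    import secrets

    def solve(challenge: str, difficulty: int) -> int:
        # Client side: brute-force a nonce until the digest starts with
        # `difficulty` zero hex digits. Expected cost: ~16**difficulty hashes.
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce
            nonce += 1

    def verify(challenge: str, nonce: int, difficulty: int) -> bool:
        # Server side: a single hash, cheap regardless of difficulty.
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        return digest.startswith("0" * difficulty)

    challenge = secrets.token_hex(16)       # issued per visitor by the server
    nonce = solve(challenge, difficulty=1)  # ~16 hashes on average
    assert verify(challenge, nonce, difficulty=1)

Under this assumed scheme, difficulty 1 is roughly 16 hashes on average, which is why it filters out scrapers that never execute the JavaScript challenge while costing a phone almost nothing. Even difficulty 5 is only about a million hashes on average, so the multi-hour estimate above points at a much higher difficulty setting or a very slow hashing loop, not the scheme itself.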
how about this analogy: I created a most tasty cookie recipe. I give it out for free, and all copies have my name because I am a vain person who likes to be known far and wide as the best baking chef ever. Is it ok to get the recipe, remove my name, and write in LLM-Codex as the creator? again, i'm ok with giving the recipe for free, i just want my name out there.
>Is it ok to get the recipe, remove my name, and write in LLM-Codex as the creator? again, i'm ok with giving the recipe for free, i just want my name out there.
From a legal perspective, there's pretty clearly nothing stopping it: the instructions in recipes aren't copyrightable. The moral question is more ambiguous, but it's still pretty weak. Most recipes are uncredited, and it's unclear why someone can force everyone to attribute the recipe to them when all they realistically did was tweak the dish a bit. In the example above, I doubt you invented cookies.
i'm curious, do you honestly think the argument was about recipes and cookies? maybe it was an analogy? looking back up the comment tree, it does seem to be an analogy, not a discussion about ACTUAL cookies and ACTUAL recipes.
In that case it's a terrible analogy, because if you can't get people to agree on the cookies case, what hope do you have of extending it to the case you're actually trying to apply it to? It's like arguing "You wouldn't pirate a movie, so why would you pirate a blog post?" when most people would, in fact, pirate movies.
my comment was about the very human need to be recognized for something created, made, or thought by a person. People are ok with writing blog posts, they're ok with writing software, and they're ok with giving it all away for free, but they want their name attached and their contribution recognized.
>my comment was about the very human need to be recognized for something created, made, or thought by a person.
And I specifically addressed that aspect:
>The moral question is more ambiguous, but it's still pretty weak. Most recipes are uncredited, and it's unclear why someone can force everyone to attribute the recipe to them when all they realistically did was tweak the dish a bit. In the example above, I doubt you invented cookies.
The cookies analogy was terrible because recipes are rarely credited, but even setting the analogy aside, the "recognition" argument still fails. If you wrote a blog post on how to set up kubernetes (or whatever), then it's fair enough that you get recognized for that specific blog post. If my friend asked me how to set up kubernetes, it wouldn't be cool for me to copy-paste your blog post and send it over.
However, similar to copyright, the recognition you deserve quickly drops off once it moves beyond that specific work. If I absorbed the knowledge from your blog post and then wrote my own guide to setting up kubernetes, perhaps updated for my use case, it's unreasonable to require that you be credited. It might be nice, and often people do credit their sources, but it would also be unreasonable to write an angry letter demanding credit. You weren't the inventor of kubernetes, and you probably got your knowledge of kubernetes from elsewhere (e.g. the docs the creators made), so why should everyone have to credit you in perpetuity?
your ability to not address the main point of my argument is something to behold. can't tell if you're doing it on purpose or not.
if humans read my blog posts and then wrote things without credit, that would be fine. i like human eyeballs and i like them on my content. that's exactly the purpose of the blog post (_in this particular example_): to get human eyeballs on the content.
but I don't want an amazing $600 laptop, i want a powerful x86 desktop with loads of RAM and disk space, as cheap as it was a couple of years ago.
Not sure about the memory, but Xeon Scalable/Max ES/QS chips and their boards are still not horribly expensive.
Prior to the crunch, you could have anything from 48-64 cores and a good chunk of RAM (128GB+). If you were inordinately lucky, 56 cores and 64GB of onboard HBM2e was doable for 900-1500 USD.
They’re not Threadrippers or EPYCs, but sort of an in-between: a server chip that can also make a stout workstation.
You can have both. You just have to undo the forced bail-in of Millennial and Gen-Z/Alpha/Beta productivity to cover the debts and lifestyles of Silent Gen/Boomer/Gen-X asset holders. The insanity of contemporary markets doesn't reflect anything natural about the world's economic priorities, but instead the privileging of the priorities of that cohort. They've cornered control until enough people call bullshit. So, call bullshit.
ah yes, capitalism is over because the chinese are benevolent people that just give away the goose that lays golden eggs out of the goodness of their hearts
this will continue forever and no rugs, chinese or otherwise, will ever be pulled
we know that because the label on the rug says "open source"
>Why is the reaction of so many people, once their menial work gets automated, "oh no, my menial work is automated." Why is it not "sweet, now I can do bigger/better/more ambitious things?"
because i have rent to pay? old age to prepare for?
why is it so hard to understand that most people are not rich, that the cost of living is high, and that most people are VERY afraid their jobs will be automated away? why is it so hard to understand that most people haven't worked at FAANG, they don't have stocks or savings, and are squeezed harder with every new day and every new war?
Because there is always work to do. It is true that demand will drop for those who don't take initiative and aren't sure what to do now that AI can do their repetitive tasks. However, demand will surge for those who can think critically about how to utilize AI to empower businesses.
"Software engineer" as a profession is rapidly getting automated at my company, and yet our SWEs are delivering more value than ever before. The layer of abstraction has changed, that is all.
> what world, what reality are you guys living in?!
One that has seen immense benefits from the Industrial Revolution and previous waves of automation.
you might want to brush up on the short- and medium-term consequences of the industrial revolution and the dark satanic mills where children were maimed and people worked 12 hours a day in horrendous conditions.
Do you think that because 2 devs are now super productive with AI, the company will keep the other 30 average devs? no, of course not, they will fire them and pocket the difference. Same for other industries, where AI will slowly diffuse like a poisonous gas and displace jobs and people, leaving behind a crippled white-collar class. The profits will not trickle down, and the increased productivity will be a hatchet, not a plough.
> Do you think that because 2 devs are now super productive with AI, the company will keep the other 30 average devs? no, of course not, they will fire them and pocket the difference
Yes, they will keep the other devs who can figure out how to use AI well. Businesses want to grow.
That hasn't been my experience or the experience of anyone I know or have talked to about how LLMs have affected their work. The parent comment explains what happened.
The businesses fired the staff and pocketed the difference. The result? Growth, at least on paper, as you're saying. Previously they were paying for 10 people and now they're paying for 2, so more profit, yay! Of course this is a short-term gain which might result in long-term pain. That last part remains to be seen.
Working conditions did decline as a result of industrialization. It wasn't until around the 20th century that we could say working conditions were better for most people than in pre-industrial society.
> The rapid urbanisation that accompanied the Industrial Revolution in Britain is often argued to have been accompanied by a dramatic worsening of urban conditions [...] However, demographic evidence suggests that death rates were much higher in towns in the seventeenth and eighteenth centuries than in the nineteenth century, and that the Industrial Revolution was accompanied by profound improvements in the survival of urban residents, especially infants and rural migrants.
> early industrialisation coincided with significant improvements in survival, especially in towns (Buer, 2013; Davenport, 2020a; Landers, 1993; Wrigley et al., 1997)
> population growth rates in excess of 1% per year would have resulted in falling real wages and hunger in any previous period [...] the fact that wages kept pace at all with increasing population should be viewed as a major achievement (Crafts and Mills, 2020; Wrigley, 2011).
Davenport, Romola J. (2021). "Mortality, migration and epidemiological change in English cities, 1600–1870." International Journal of Paleopathology, 34, 37–49. PMC7611108.
Nobody was disputing how many people died during the Industrial Revolution or before; quality of life, on the other hand...
That being said: you cite a study implying (you, not the study) that the Industrial Revolution was what led to lower death rates, so it's all good.
But that's not what the study says:
> These patterns are better explained by changes in breastfeeding practices and the prevalence or virulence of particular pathogens than by changes in sanitary conditions or poverty. Mortality patterns amongst young adult migrants were affected by a shift from acute to chronic infectious diseases over the period.
"than by changes in sanitary conditions or poverty" [my emphasis]
But wait, there's more! From the same study:
> The available evidence indicates a decline in urban mortality in the period c.1750-1820, especially amongst infants and (probably) rural-urban migrants.
"especially amongst infants and (probably) rural-urban migrants" ...where is the industrial revolution here?
And if that was not enough:
>Mortality at ages 1-4 years demonstrated a more complex pattern, falling between 1750 and 1830 before rising abruptly in the mid-nineteenth century.
"rising abruptly in the mid-nineteenth century"
turns out the industrial revolution did in fact raise death rates
It seems that peak native Windows dev tools were Delphi 7 and VB6. It's a tragedy that something at least as good as VB6 is not still developed and supported by Microsoft.
There's nothing as good as VB6 that's developed and supported by *anyone*. It's not a Microsoft-only phenomenon.
I think programmers started wanting "real" languages (notice the quotes), and in exchange got more complexity and longer development times. Although with GenAI, we may be headed back to the "draw a screen and do this" style we had with VB6. Just now the generated source should be considered the object code, and the prompt is the new source (at least for those types of apps).
I'm not sure how you define "native" here. If you mean native widgets then WinForms does what you want, is still fully supported, works on modern .NET versions, and Visual Studio still has all the GUI designers etc. WinForms is very obviously a calque of VCL, as well, so it can do everything Delphi did, but better.
If you mean native code then VB6 doesn't belong in this category (even if you compiled it to a standalone .exe it was still effectively bytecode).