Hacker News | tmoertel's comments

For exactly this reason, when I write software, I go out of my way to avoid using external packages. For example, I recently wrote a tool in Python to synchronize weather-station data to a local database. [1] It took only a little more effort to use the Python standard library to manage the downloads, as opposed to using an external package such as Requests [2], but the result is that I have no dependencies beyond what already comes with Python. I like the peace of mind that comes from not having to worry about a hidden tree of dependencies that could easily some day harbor a Trojan horse.

[1] https://github.com/tmoertel/tempest-personal-weather

[2] https://pypi.org/project/requests/
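
For anyone curious what the stdlib-only approach looks like, here's a minimal sketch of an HTTP GET with JSON decoding using only the standard library (the URL is a placeholder for illustration, not the actual API endpoint):

```python
# Minimal stdlib-only HTTP download with JSON decoding -- no
# third-party packages required.
import json
import urllib.request


def fetch_json(url, timeout=30):
    """Download a URL and decode the response body as JSON."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)


# Example call (placeholder endpoint):
# observations = fetch_json("https://example.com/api/observations?station_id=123")
```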


I always force myself to do this too. The only 3rd-party Python library I regularly use is "requests", basically —a dependency that comes with its own baggage; see the recent controversy about "chardet"— but I go out of my way to grab it from pip's vendored copy instead of installing it via pip. :-)

Something like this:

    try:
        import requests  # use the installed package if available
    except ImportError:
        from pip._vendor import requests  # fall back to pip's vendored copy

This is good wisdom, and I think this is a strong reason why language and runtime developers should ensure their standard library is (over)complete.

Go does this well, to the point where a lot of people in the community say "you don't need a library" for most use cases, only for e.g. database drivers. This is contrary to what a lot of developers believe, that they need e.g. a REST API library or enterprise application framework as soon as possible.


Is this a win for .NET, where the mothership provides almost all of what you need?

Definitely!

The amount of third-party (non-testing related) dependencies needed for most .NET applications is very manageable and the dependencies themselves (generally) don't come with further third-party dependencies (especially now that JSON serialisation is native).

This means that for most applications, the developers know exactly which dependencies are needed (and they are not hidden away in large folder structures either, the dlls are right next to the assembly).


.NET is great because you use a FOSS library and then a month later the developer changes the licence and forces you to either pay a subscription for future upgrades or swap it out.

Yeah why is this so common in .NET?

Enterprise usage. Devs know companies will just pay out. Easier than trying to get sponsored.

C#/.NET is a good example showing that no matter how many programmers you have or how much capital you hold, it's still impossible to make a 'batteries-included' ecosystem, because the real world is simply too vast.

Say what you want but I can write a production backend without any non-Microsoft dependencies. Everything from db and ORM to HTTP pipeline/middleware to json serialization to auth to advanced logging (OTel). Yes, sometimes we opt for 3rd party packages for advanced scenarios but those are few and far between, as opposed to npm/js where the standard library is small and there is little OOTB tooling and your choices are to reinvent a complex wheel or depend on a package that can be exploited. I argue the .NET model is winning the new development ecosystem.

I agree with you; almost all .NET code I write stays within the .NET framework, yet when I look at C# repos, it's disheartening to see so many (new) projects just NuGet this, NuGet that.

We have System.Text.Json now, but I still see people using Newtonsoft JSON.

There are old repos out there that should be archived and deprecated, yet I see new games using one whose repo is very questionable, with automated whitespace or comment commits to keep up the appearance that it is still being maintained[0].

Right now I'm working on a Golang project, and the dependency tree is a nightmare, to say the least. I hope to be able to rip out the parts I don't need and compile it without the massive bulk.

It's very frustrating to be asked to trust the author, who trusts another author, who trusts yet another author, when I doubt they even audited or looked at what they imported.

[0] https://github.com/sta/websocket-sharp


I'm not a fan of that ecosystem, but you make a good point. I wish JS had more basic utilities built in.

and the pendulum swings again the other way...

Does it? Or is it simply different people

I generally limit myself to what's available in my distribution, if the standard library doesn't provide it. But normally I never use requests, because I don't think it's worth an extra dependency.

This might hold true for easy deps, but (let's be honest, who would audit is-promise?) if you need complex or domain-specific stuff and you don't have the time to write it yourself, and the std lib doesn't have anything, then yeah, you might still fall into the pit, or you have to trust that the library doesn't have a supply-chain issue itself.

But then you rely on Python, C, your editor with all its extensions etc.

I develop as a pure amateur and there are areas I would never get into without libraries.

First are dates: a world of pain. Arrow is the answer (in Python).

Then HTML, another world of pain perfectly described in a Stack Overflow answer. Beautifulsoup.

HTTP is arguably easier but requests! :)

At some point there is a risk assessment to do and one should make decisions based on that. Kudos for having gone that way yourself!


> I go out of my way to avoid using external packages.

I go out of my way to avoid Javascript. Because in all my years of writing software, it has 100% of the time been the root cause for vulnerabilities. These days I just use LiveView.


HTMX > Live View

Sure, if that works for you, then great.

> You’re making exactly the same error as the other guy, just in the opposite direction: you’re judging the profession of software engineering based on code output rather than value generation.

But the true metric isn't either one, it's value created net of costs. And those costs include the cost to create the software, the cost to understand and maintain it, the cost of securing it and deploying it and running it, and consequential costs, such as the cost of exploited security holes and the cost of unexpected legal liabilities, say from accidental copyright or patent infringement or from accidental violation of laws such as the Digital Markets Act and Digital Services Act. The use of AI dramatically decreases some of these costs and dramatically increases other costs (in expectation). But the AI hypesters only shine the spotlight on the decreased costs.


The question serious patch trappers ask is whether you patched the traps used to patch traps to make sure that, when you patched traps, no other INIT could patch those traps after you to get in line before you when the traps were handled.

When laying people off, better companies will often accelerate vesting so that the departing employees get additional stock. For example, Google does this:

> We’ll also offer a severance package starting at 16 weeks salary plus two weeks for every additional year at Google, and accelerate at least 16 weeks of GSU vesting.

https://blog.google/company-news/inside-google/message-ceo/j...


Google did this in 2023, and this blog was a good PR move to make people think they are still doing this today.


OK but for most people a 16-week acceleration is still forfeiting 92% of unvested shares.
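
Back-of-the-envelope, assuming a fresh four-year grant (an illustrative assumption, not a figure stated in the comment): accelerating 16 weeks of a 208-week vest leaves about 92% unvested.

```python
# Rough arithmetic behind the "92%" figure, assuming a standard
# four-year (208-week) vesting schedule with nothing yet vested --
# an illustrative assumption.
total_weeks = 4 * 52        # 208 weeks in a 4-year vest
accelerated = 16            # weeks of vesting accelerated at layoff
forfeited = 1 - accelerated / total_weeks
print(f"{forfeited:.1%}")   # -> 92.3%
```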


By the same logic, wouldn't 4 months of severance pay be equivalent to forfeiting 92% of salary?

For something paid at regular intervals like RSUs, you really should never be looking at the total value of the grant, and instead think of it in terms of how many shares per paycheck/month/quarter/year you vest.

If you've got a cliff coming up, that's different. I'd be pissed if a company laid me off 11.5 months into a 12 month cliff or a few weeks before an annual bonus and didn't accelerate the vesting / bonus.


That's exactly my point. "Losing" your RSUs is the same as losing all your other income that you did not earn because you don't work there any more.


Would you rather forfeit 100% of unvested shares? Because that's what you signed up for.

The point is, you're getting shares for 16 weeks + ${years_employment * 2} additional weeks, and you don't have to work those weeks. It's comp that, according to the terms you agreed to when you accepted the job, you are not entitled to. You're getting it as an unearned parting gift.

Yes, it sucks to be laid off. But it sucks a little less if you get a parting gift that for many of the more senior folks will run into six figures.

Would you rather have that parting gift or not? Because some companies don't give you one when they lay you off.


> Dumb question, why do “sensitive” spots on the body need more nerves? Couldn’t you just have the normal touch-sensing nerves and map signals from specific spots on the body to stronger/pleasurable qualia in the brain?

Think of a television. What gives you a better picture, quadrupling the number of pixels or making the existing pixels 4x as intense?


> Historically, what makes people with capital able to turn things into more capital is its ability to buy someone's time and labor.

You forgot to include resources:

What makes people with capital able to turn things into more capital is their ability to buy labor and resources. If people with more capital can generate capital faster than people with less capital, then (unless they are constrained, for example, by law or conscience) the people with the most capital will eventually own effectively all scarce resources, such as land. And that's likely to be a problem for everyone else.


Fair, though I don’t see how AI is really changing the equation here


AI doesn't change the equation; it makes the equation more brutal for people who don't have capital.

If you don't have capital, the only way to get it is by trading resources or labor for it. Most poor people don't have resources, but they do have the ability to do labor that's valued. But AI is a substitute for labor. And as AI gets better, the value of many kinds of labor will go towards zero.

If it was hard for poor people to escape poverty in the past, it's going to be even harder with AI. Unless we change something about the structure of society to ensure that the benefits of AI are shared with poor people.


Ok, I'm following you. You're saying because labor gets cheaper it will be harder to make a living providing labor. Not disagreeing, but I wonder how much weight to give this argument. History shows a precedent of productivity revolutions changing the workforce, but not eliminating it, and lifting the quality of life of the population overall (though it does also create problems). Mixed bag with the arc bending towards betterment for all. You could argue that this moment is unprecedented in history, but unless the human spirit changes, for better or worse, we will adapt as we always have, rich and poor alike.

If the value of many kinds of labor go towards zero, those benefits also go to the poor. ChatGPT has a free tier. The method of escaping poverty will still be the same. Grow yourself. Provide value to your community.


Entire classes of workers have been put in the poorhouse on a near-permanent basis due to technological changes, many times during the past two centuries of industrial civilization. Without systemic structural changes to support the workforce, this will happen/is already happening with AI.


> LG makes an induction range with knobs. I have one. It's wonderful.

Is it encrapified with a bunch of intrusive "smart" features that nobody asked for?


Just don't connect it to the internet.


This is exactly it. If you don't connect it, it's a dumb stove like any other.

I was extremely dubious about connecting it, but I decided to do it anyway and see whether it's worth it. So far I've noticed two things:

* It sets the clock with NTP and follows daylight savings time. This actually might be worth it, I'm one of those people who otherwise just lives with clocks set an hour wrong for half the year. The odd thing though is that this isn't default behavior, I had to install an add-on in the mobile app.

* It gives me a mobile notification when the oven gets to temp. Not really compelling.

So depending on how you feel about clocks, feel free to skip the wifi setup.


The actual study (1) is observational and makes no causal claim, only that there exists a statistical association between caffeine consumption and dementia. Nevertheless, people are apt to misinterpret the finding as “caffeine consumption prevents dementia”:

Caffeine -> Dementia

However, the two variables would be correlated if the causal arrow were reversed and dementia influenced the propensity to consume caffeine:

Caffeine <- Dementia

And we would also observe the correlation if a person's general health influenced both the propensity to consume caffeine and dementia risk:

Caffeine <- General Health -> Dementia

Since caffeine is a stressor, we would expect to see reduced consumption among people with reduced general health. But we would also expect increased dementia among that same group. So the relationships in the diagram immediately above are plausible and would give rise to a spurious correlation between caffeine consumption and dementia risk.

While studies can try to “control for confounding factors,” it’s easy to overlook or misunderstand the true causal relationships in play, causing spurious correlations. In other words, you can create false “causal” relationships through imperfect identification and control of confounding variables.

In short, take this article’s claims with a suitable dose of suspicion.

(1) https://jamanetwork.com/journals/jama/article-abstract/28447...
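
The confounding story is easy to demonstrate with a toy simulation: generate a latent "general health" variable that drives both caffeine consumption and dementia risk, and the two outcomes end up strongly correlated even though neither causes the other. (All the numbers below are made up purely for illustration.)

```python
# Toy simulation: a hidden confounder (general health) drives both
# caffeine consumption and dementia risk, producing a correlation
# between caffeine and dementia with no causal link between them.
import random

random.seed(0)
n = 10_000
caffeine, dementia = [], []
for _ in range(n):
    health = random.gauss(0, 1)                    # latent general health
    caffeine.append(health + random.gauss(0, 1))   # healthier -> more caffeine
    dementia.append(-health + random.gauss(0, 1))  # healthier -> less dementia

def corr(xs, ys):
    """Pearson correlation, stdlib only."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(corr(caffeine, dementia))  # strongly negative, despite no causation
```

Regressing dementia on caffeine in this data would "show" a protective effect that vanishes the moment you condition on health, which is exactly the trap observational studies must guard against.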


This is an amazing explanation and I am going to keep it on hand for future use. In the first sentence causal is typoed as casual


And "However, the two variables would be correlated if the causal arrow were reversed" is missing "also", almost suggesting that the article gets it wrong and the two variables are not correlated because of the placement of the causal arrow...


Thanks for your kind words! And thanks for reporting the typo (now fixed).


>we would expect to see reduced consumption among people with reduced general health.

I would not expect this at all, as it goes against my real-world observations of people with poor general health consuming caffeine in just as high doses. Some of the same causal factors behind poor general health, like long work hours and long commutes, can lead to increased caffeine consumption.


> Hell, with commercial printers from the likes of Konica Minolta, the print quality for text is better than offset print.

Can you link to some high-resolution comparison scans to support this claim? I find it hard to believe that any toner-based process is going to result in a cleaner page and crisper text than a properly made-ready offset press run.


Quick question: When the interviewer arrived late, did he start by apologizing?


He actually did not! He just proceeded like it was nothing.


Because it was. Was probably this person's 10th interview of the day. They probably only need the simplest of infractions to weed someone out given the absurd volume of applications they receive.


> Because [arriving 15 minutes late to a 30-minute interview] was [nothing].

I'd expect over 95% of both interviewers and interviewees to say that arriving 15 minutes late to a 30-minute interview is very much not nothing; it's a serious breach of what is expected – on both sides of the interview.

If you show up late for an interview, no matter which side of the table you're on, you ought to apologize and, if you're more than a few minutes late, have a good explanation. To do anything less signals that you are an unreliable person. And, when you are representing a company, it makes the company look like it's run by people who don't even understand how to do something as simple as show up on time. It suggests that one of the company's unspoken core values is Dysfunction.


If I had shown up fifteen minutes late for the interview then they likely wouldn’t make an offer and they likely would have called it out during the interview. No one seems to call out companies when they do this shit.

They wouldn’t care if I had a really bad day beforehand, and they certainly wouldn’t assume that I had a good excuse for it.

