Hacker News | laserbeam's comments

I would say the complexity of implementing defer yourself is a bit annoying for C. However defer itself, as a language feature in a C standard is pretty reasonable. It’s a very straightforward concept and fits well within the scope of C, just as it fit within the scope of zig. As long as it’s the zig defer, not the golang one…

I would not introduce zig's errdefer though. That one would need additional semantic changes in C to express errors.


>pretty reasonable

It starts out small. Then before you know it, the language is total shit. Python is a good example.

I am observing a very distinct phenomenon where the internet makes very shallow ideas mainstream and ruins many good things that stood the test of time.

I am not saying this is one of those instances, but the parent comment makes sense to me. You can see another commenter who now wants to go further and add destructors to C. Because of the internet, such voices can now reach each other, gather, and cause a change. Before, such voices would have had to go through a lot of sensible heads before they could reach each other. In other words, bad ideas got snuffed out early before the internet, but now they go mainstream easily.

So you see, it starts out slow, but then more and more stuff gets added which diverges more and more from the point.


I get your point, though in the specific case of defer, looks like we both agree it's really a good move. No more spaghetti of goto err_*; in complex initialization functions.
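For readers unfamiliar with the idiom: a hedged sketch of the `goto err_*` pattern mentioned above, with the function name and resource choices invented for illustration.

```c
#include <stdio.h>
#include <stdlib.h>

/* The goto err_* idiom: every acquisition adds a label, and the
 * labels must be kept in reverse acquisition order by hand. */
int init_thing(void)
{
    int ret = -1;

    char *buf = malloc(64);
    if (!buf)
        goto err_buf;

    FILE *f = fopen("/dev/null", "w");   /* stand-in for a real file */
    if (!f)
        goto err_file;

    /* ... actual work ... */
    ret = 0;

    fclose(f);
err_file:
    free(buf);
err_buf:
    return ret;
}
```

With a defer statement, each cleanup could instead sit on the line right after its acquisition, and the labels disappear.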

>we both agree it's really a good move

Actually I am not sure I do. It seems to me that even though `defer` is more explicit than destructors, it still falls under "spooky action at a distance" category.


I don't understand why destructors enter the discussion. This is C; there are no destructors. Are you comparing "adding destructors to C" vs "adding defer to C"?

The former would bring so much into C that it wouldn't be C anymore.

And if your point is "you should switch to C++ to get destructors", then it seems off topic. By definition, if we're talking about language X and your answer is "switch to Y", that is an entirely different subject, of very little interest to people programming in X.


Sorry, I had some other thread that involved destructors in my head.

But the point is `defer` is still in "spooky action at a distance" category that I generally don't want in programming languages, especially in c.


Defer is not spooky action at a distance. It is an explicit statement that gets executed as written. Unlike, for example, operator overloading (a familiar feature which C doesn't have), which causes code that looks like one thing (addition, for example) to behave like another (a function call). Defer does exactly what it says on the tin ("move this line to the end of the scope"), just like goto does exactly what it claims to do.

Macros (in general) are way spookier than a defer statement.


> `defer` is still in "spooky action at a distance" category

Agree, this is also why I'm a bit wary of it.

What brings me to the "pro" side is that, defer or no defer, there will need to be some kind of cleanup anyway. It's just a matter of where it is declared, and close to the acquisition is arguably better.

The caveat IMHO is that if a codebase is not consistent in its use, it could be worse.


But the real-world alternatives that people use are:

1. goto, which is "spooky action at a distance" to the nth degree. It's not even safe, you can goto anywhere, even out of scope.

2. cleanup attributes, which are not standard.
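A hedged sketch of alternative 2, the cleanup attribute (a GCC/Clang extension, not standard C, which is exactly the complaint above). The `free_ptr`/`use_buffer` names are invented for illustration.

```c
#include <stdlib.h>

/* GCC/Clang `cleanup` attribute: free_ptr() runs automatically on
 * every path out of the enclosing scope, like a poor man's defer. */
static void free_ptr(char **p)
{
    free(*p);   /* free(NULL) is safe, so the failure path is fine too */
}

int use_buffer(void)
{
    __attribute__((cleanup(free_ptr))) char *buf = malloc(64);
    if (!buf)
        return -1;
    buf[0] = 'x';       /* ... work with buf ... */
    return 0;           /* free_ptr(&buf) also runs here */
}
```

This gets most of defer's benefit, but only on compilers that support the extension.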


>goto, which is "spooky action at a distance" to the nth degree.

it is not.


It is: the mere existence of goto makes control flow significantly harder to understand. People complain about exceptions in C++ obfuscating control flow, but then they recreate exceptions using goto. The funny thing is that exceptions are just fancy goto, the assembly is almost the same.

The bigger picture of C as a language is not that it's simple, because it's not simple at all. It's inept. It doesn't give developers the tools to write simple code. So easy things become hard, and we sort of jank together solutions that kind of work but usually don't.

I like to compare it to building a shed with power tools versus only a screwdriver. Is a screwdriver simpler than a power saw and all that? Of course. Now think about building a shed. Is it simpler to do with a screwdriver? No. It's much, much more complex. You have to develop complex processes to make that work, and it's not intuitive at all.

C is a language that already makes use of implicit control flow A LOT. I don't see defer being a problem. The irony is that if C just supported these use cases out-of-the-box, it would be simpler and easier. As a concrete example, consider polymorphism in C versus C++. Both languages can do it, but one provides the tools and one doesn't. In C++ I can go to definition, I can concretely define what polymorphism is allowed and what isn't, and the type system gives me the tools to make it safe. In C, none of that is true, so when we do polymorphism with function pointers, it's much harder to understand what's actually going on, or what could be going on.


It is not. The fact that `goto` makes things hard does not shift it into some unrelated category.

> The funny thing is that exceptions are just fancy goto

Not even close.

>I like to compare it to building a shed with power tools versus only a screwdriver.

That analogy is completely wrong in the context.

> it's not intuitive at all.

It is completely intuitive, while the "modern" languages with "helpful" magic are not.

>C is a language that already makes use of implicit control flow A LOT.

> implicit

I don't think it means what you think it means. At least the examples you mention do not justify it.


None of these were arguments. You just said "nuh uh" in more words, which is really just a waste of storage.

>which is really just a waste of storage.

It is, if you choose not to think about those nuh uhs.


That comment is saying to use C++, not to add destructors to C.

Modern Python is great :shrug:

The problem is notepad itself would download and execute bad stuff if you click the evil link. If you pasted that same link into a browser, you'd be OK.

And the problem is a notepad app is expected to be dead simple, have few features, and be hard to get wrong while implementing.


So Notepad will download and execute itself rather than launch an appropriate application to handle the URL? That was not clear to me.

Can we somehow get age verification without IDs? Age verification itself is OK as an idea. I’m happy to show ID to buy alcohol at the store… but the store clerk doesn’t take a photo of that ID and store it in logs somewhere forever.

Can we please get a law where kids won’t just take their parents’ IDs and upload them to random places?


You might like the Digital ID scheme. It uses Zero Knowledge Proofs, so that one of your 'IDs' could be a simple 'Is over 18' ZKP, without involving your name or any other detail. These are not tracked by government or possible to associate with your wider identity. This is one of the examples listed in the framework docs.

> "Unlike with a physical document, when using a digital identity, you can limit the amount of information you share to only what is necessary. For example, if you are asked to prove you are over 18, you could provide a simple yes or no response and avoid sharing any other personal details." (from https://www.gov.uk/guidance/digital-identity )

There's a huge amount of disinformation circulating about the digital ID scheme, and the government's messaging over it has been catastrophically clumsy. Which is a pity, because the system has clearly been designed with civil liberties in mind (ie defensively) and for citizens it's a serious improvement over the current system.


While great on paper, zero-knowledge-proof based systems unfortunately have a fatal flaw. Due to the fully anonymous nature of verification tokens, implementations must have safeguards in place to prevent users from intercepting them and passing them onto someone else; in practice, this will likely be accomplished by making both the authenticator and the target service mobile apps that rely on device integrity APIs. This would ultimately result in the same accessibility issues that currently plague the banking industry, where it is no longer possible to own a bank account in most countries without an unmodified, up-to-date phone and an Apple or Google account that did not get banned for redeeming a gift card.

Furthermore, if implementers are going to be required to verify users per-session rather than only once during signup, such a measure would end up killing desktop Linux (if not desktop PCs as a whole) by making it impossible for any non-locked-down platform to access the vast majority of the web.


I'm unsure how applicable these risks are here. The proofs appear to be bound to the app, which in turn is bound to the user's face/fingerprint (required to unlock it).


> if you are asked to prove you are over 18, you could provide a simple yes or no response and avoid sharing any other personal details

I can't imagine how that would operate, esp. given we're told this ID will not be a digital ID card you can "show".


It's an app, and data is submitted with a tap to approve. The data is just attribute / proof pairs (eg nationality:British / true), and the bundles assembled from these pairs will differ between use cases. Nightclub proof of age would just need the 'over 18' proof, while opening a bank account would need a photo, name, address, date of birth, nationality etc. In other words, there isn't a single Digital ID. The 'ID' is just a container for a specific use. They can be reused, but they will often be single purpose or generated from the attributes saved in your wallet the moment a service requests your data. The best way to think of this is that it gives you a way to pass on your citizen data with authority, and without having to overshare.


Thanks. I don't see that info on the Govt. explainer web page. https://www.gov.uk/government/publications/digital-id-scheme...


The major problem is that no one trusts government not to abuse it and use it to track everything people do. There will be some proportion of people who trust the current government, but will be paranoid that a future government will abuse it, and there will be a proportion of people that don't trust the current government to not abuse it.

You might be able to get more trust by the government assigning a third party to audit the systems to make sure they are working as advertised, and not being abused, but you would still get people being paranoid that either the third party could be corrupted to pretend that things are okay, or that a future government would just fire them and have the system changed to track everyone anyway.

No matter what you do, you will never convince a subset of people that a system that can potentially be used to track everyone won't be abused in that way. Unfortunately, those people are most likely correct. This is why we can't have nice things :(

For the record, I think it would be great to be able to have a trusted government issued digital ID for some purposes. I especially think it would be great to have an officially issued digital ID that could be used to sign electronic documents. My partner and I moved home recently, and it was not easy signing and exchanging legal documents electronically.


> You might be able to get more trust by the government assigning a third party to audit the systems to make sure they are working as advertised, and not being abused, but you would still get people being paranoid that either the third party could be corrupted to pretend that things are okay, or that a future government would just fire them and have the system changed to track everyone anyway.

The scheme is one step ahead of you: auditors are required [1]. Government's role in the scheme is limited to operating the APIs in front of its departments, which are read only and scattered (eg no central database), funding the auditors and trust registry (a Digital Verification Service public key store), and legislating. The verification work will all be done by private sector digital verification services - whichever is associated with the wallet app you've chosen. There were 227 of them last year already working for various services - we all benefit from the sector being brought under a formal regulatory framework.

The tracking you fear doesn't seem to be possible beyond what is already tracked when you open a bank account etc, but this is entirely outside the scope of the wallet's operation. It's been designed specifically to make the kind of abuse you fear impossible, at least in its current format, where government is out of the loop except as a passive reference, and the DV services are legally prevented from retaining any data without your consent. Of course that could alter in future, but as it stands the framework doesn't allow for what everyone fears it does.

[1] https://enablingdigitalidentity.blog.gov.uk/2024/10/24/how-a...

(The Enabling Digital Identity Blog has comprehensive information about every aspect of the framework.)


For weak bank logins, my guess is that reimbursing all account takeovers is cheaper than having a complex login process that would scare away non-technical customers. Or, well, I could see myself making that decision if I were more versed in finance than in computer science and I had a reasonable risk assessment in front of me to tell me how many account takeovers happen.


Banks aren't even liable for losses from account takeovers, at least if their system is compliant, regardless of whether that makes it secure. Their biggest incentive is customer satisfaction, which fraud does hurt.

It's credit cards that have to reimburse for fraud, but they charge the merchant for it, plus fees, so they have absolutely no incentive to prevent fraud, if not an incentive to outright encourage fraud. That would explain why their implementation of the already compromised EMV was further nerfed by a lack of a PIN in the US.


> Their biggest incentive is customer satisfaction

At a bank? No way. They are some of the most customer-hostile organizations I've interacted with. Dealing with payment accounts is a necessary evil for them, and they are very much aware of the effort required to switch to a different bank, and of the massive regulatory moat preventing consumer-friendly competition from popping up.

A bank doesn't care about screwing over a handful of customers. As long as it's not common enough to draw the attention of the press and/or a regulatory agency, they are not going to spend any money on improving.


Case in point: Wells Fargo foreclosure fraud. Case in point: Wells Fargo opening new accounts in customer names without direction from, approval by, or notification to said customers.

The primary incentive of a bank is to make money rather than customer satisfaction, security, or most other things. Sometimes other priorities suffer in the race to profit, sometimes including regulatory compliance and legality.


I found that moving between empty lines is the nicest way to navigate most code across all programming languages, markup languages and just regular text. I don’t have to think, I don’t have to count, I just move and select big chunks of text at a time… (not in vim, but I first saw someone have key bindings for this in vim)


https://vimhelp.org/motion.txt.html#%7B

    { [count] paragraphs backward.  exclusive motion.
    } [count] paragraphs forward.  exclusive motion.


What does exclusive motion mean here?


Motions can be inclusive or exclusive. It works like the different ways of annotating ranges: [0,1] and (0,1).

Consider the command `d` (delete) combined with the motions for `"`.

First we have `da"`: it deletes everything between the pair of `"` characters that surround my cursor, quotes included. Next, `di"` deletes just the contents of the `"` pair.

The movement `a"` is inclusive (think 'a quote') and `i"` is exclusive (think 'inside quote'). Combined with the command you get "delete a quote" and "delete inside quote" when the mnemonics are spelled out.

https://vimhelp.org/motion.txt.html#exclusive


Oh, wow, great info, thanks. I knew about the general concept from high school math (where it is called open and closed intervals) and also from Python ranges, but didn't know about it in connection with vim. Got it now.


Also, I love mnemonics. They make many topics easier to remember.

Related: Sanskrit has tons of them.

https://duckduckgo.com/?t=fpas&q=sanskrit+mnemonics&ia=web


This is the one thing I brought from my time of trying out vim.

I have now set all my editors to move by paragraph with ctrl+up/dn. It fits so well together with ctrl+left/right that I think it should be standard behaviour. I also set up ctrl+shift+up/dn to select, of course.


>I have now set all my editors to move by paragraph with ctrl+up/dn.

It's even simpler with Vim - just one keystroke - { or }.


On a US keyboard layout this is the same number of keys because { and } are Shift+[ and Shift+]


The insane behavior in the post is not that you get fancy completions, but that the completion does not match the preview. If the computer starts doing A when you asked it for B, it is equivalent to a trash can.


> I know picking the right defaults is hard

I think we understand that UX problem much better now than developers did back in the 70s. In general, not just for ss/lsof


How will this hit OSS projects which rely heavily on github actions? I’m thinking of projects like nixpkgs, which is the backbone of nixos and always has dozens of actions queued or running. (I am using nix as an example for scale, but I am not involved in the project and my description might be inaccurate. I’m also not familiar with nix’s financials at all.)


> Standard GitHub-hosted or self-hosted runner usage on public repositories will remain free. GitHub Enterprise Server pricing is not impacted by this change.


Unfortunately, those are 2 different problems. It’s easy to have servers store encryption keys to make https work. You only need to encrypt traffic between you and a server for 5 seconds at a time.

It’s hard for personal communications. The server shouldn’t know the keys, and they need to survive for decades.


Someone needs to design a super dumb and robust system where I can safely store all my keys on all devices where I use an account. The fact that WhatsApp, Signal and other platforms tend to have a primary device for keys is bonkers to me. A primary device that can randomly die, get stolen or fall in a lake.

I have lost chat histories more times than I can remember, and I have to be extra diligent about this these days.

I don’t even want to think about pgp when I have to manually take care of this problem. Not because of my own skills, but because I could never make it reliable for my family and friends on their side.


This is a difference in the threat model.

Signal's threat model is that everything around you is hostile to you, except the parties you interact with. You are an undercover rebel in a totalitarian sect which would sacrifice you to Cthulhu if they see your chat history. Losing it is much better than disclosing it.

Your threat model is likely random black hat hackers who would try to get into your communication channels and dig some dirt to blackmail you, or to impersonate you to scam your grandmother out of several thousand dollars. Signal protects quite well against it. But the chance of this happening even in an unencrypted channel is low enough. You don't mind making the security posture somehow weaker, but preserve the possibility to restore your chat history if your secure device is lost or destroyed.

I suppose the problem could be solved by an encrypted backup with a long key which you keep on a piece of paper in your wallet, and / or in a bank in a safe deposit box. Ideally it would be in the format that the `age` utility supports.

But there is no way around that paper with the long code. If this code is stored on your device, and can be copied, it will be copied by some exploit. No matter how inconspicuous a backdoor you are making, somebody will find it and sneak into it. Should it happen in a publicized case, the public opinion will be "XYZ is insecure, run away from it!".


> If this code is stored on your device, and can be copied, it will be copied by some exploit.

Yeah... We really need some key-management hardware where the secrets can be copied by some channel that is not the primary one. This used to be more common, before the IT companies started pushing everything into the cloud.

I have recently started to see computer boards with write protection for the UEFI data, which is a related thing that also went away, mostly because of Microsoft. So maybe things are changing back.


> I have lost chat histories more times than I can remember, and I have to be extra diligent about this these days.

As per Signal’s diehard proponents, losing chat history is a feature, not a bug (I’m not being facetious when saying this, and you can see comments of this kind in Signal related threads here).

Edited to add: I don’t agree with that premise and have long disliked losing chat history.


I know you are not being facetious. My problem is random Joe on the street sees it as a bug. He really does care more about actually being able to talk with his wife than Signal’s mathematically correct principles. He needs it to be reliable first, secure second.


> He needs it to be reliable first, secure second.

Then he should use something else. I need Signal to be secure first, second and third, with reliability in edge cases like this a distant fourth.


Perhaps it’s a marketing problem, then. Signal is marketed as a secure and full-featured alternative to things like WhatsApp and iMessage. Most people start reading that sentence after the word “secure”, and then are surprised and disappointed when a device replacement loses all their history.

I think it would be better if Signal more loudly communicated the drawbacks of its encryption approach up-front, warning away casual users before they get a nasty surprise after storing a lot of important data in Signal.

I’ve heard Signal lovers say the opposite—that getting burned with data loss is somehow educational for or deserved by casual users—and I think that’s asinine and misguided. It’s the equivalent of someone saying “ha! See? You were trading away privacy for convenience and relying on service-provider-readable message history as a record all along, don’t you feel dumb?”, to which most users will respond “no, now that you’ve explained the tradeoffs…that is exactly how I want it to work; you can use Signal, but I want iMessage”.

It shouldn’t take data loss to make that understood.


Or compare the nasty surprises lurking in Whatsapp.

We'll see it intentionally backdoored this decade. Signal can afford to, eg, tell the UK or EU to go fuck themselves. Meta won't.


You've been downvoted, but I think that's a fair take. There will always be tension between security and usability; it's difficult (impossible?) to do the absolute best in both metrics.

Signal's development team can decide that they prioritize security over usability to whatever degree they like, and that's their prerogative. That may result in fewer users, and a less than stellar reputation in the usability space, but that's up to them. And if we (the unpaying user base) don't like it, we are free to use something else that better meets our needs.


Maybe an answer is to have a per-message control that you can set to plain text, encrypted with a cloud-backed-up key, or encrypted with a key that exists only on this device. Then you could message "hi mum, running late" without complications while being able to hard-encrypt when you want?


Signal is already complication-free (at least until your phone falls in a lake), making the control useless.

(And you probably don't need to worry about losing the 'running late' message in the lake... The need for good encryption and reliable backup on any given message is likely somewhat correlated.)


Yeah, but if use proton for everything else and signal only for my secret world domination plans, traffic analysis will be so much easier…


Congrats on not being one of the people concerned about being targeted by their government, now or in the future.

Hundreds of millions are not so lucky.


(I am a security person who prioritizes security over usability, but) you missed the point a bit. If a privacy program is used only by people who have something to hide, it turns into a smoking gun. If you care about being targeted by the government, you should really hope regular people use Signal a lot, because the government absolutely has (or can procure) a list of people that use Signal.


Bro, my mom uses signal with her friends and she's like almost 70. Signal works well enough.


GP here. I agree. I should’ve stated that I don’t like losing chat history and have seen that as a problem with Signal.

I have edited my previous comment to reflect that I don’t like losing chat history.


My company recently really cut back on slack retention. At first I was frustrated, but we all quickly got over it and work carried on getting done at the same pace as before and nothing really got impacted like many of us imagined it might.


That bears little resemblance to the Signal concerns. The reason people are worried about losing their personal messages is not lost productivity.

It's also not even really the same situation. A more apt analogy would be, if switching work laptops sometimes meant you could no longer read any Slack history.


It's fine until you need evidence someone agreed to something months ago but all records have been deleted.


Yeah, mail is the primary source of this.

Once communication with my customers moved to Teams, I had a very hard time finding historical agreements and decisions.

I try very hard to maintain a robust system for ADR logging now, and not just for system architecture, but for all decisions and agreements in my projects and across changes.


Methinks the better solution here is to get better friends?


Well I don't think most people choose who they work with. Even if you like your team a lot, you might have a discussion with someone from another team or division, and that's where it's useful to have a good chat history haha.


Doesn't really work in an org with 100s of people and where emails are automatically deleted after 6 months.


I expect that some types of people (in middle management, especially) may see the lack of this as a good thing.


A certain type of person sees this as a feature, not a bug.


I'd hate this, slack is an extension of my memory and it being long lived and searchable can be a super power - you don't have to remember all the details of everything, just enough of the who, what, when to find the rest.


Signal has a backup service in beta, that you can use right now.


So, the requirement is a system that stores all your keys and can be duplicated as many times as you wish. That sounds like a local password manager, let's say KeePass. I use it and have copies of the encrypted db on every device of mine, plus the client to access the passwords. I don't know if it qualifies for dumbness but it feels pretty robust. It survived the fall-into-the-lake test (a river in my case.)

But I see every customer of mine using web based password managers, because they want to share and update passwords with all their team. Of course those password managers can use E2E encryption and many do, but my instinct is that if you are using somebody else's service for your data, you can be locked out from your data.

Anyway, it's the concept of having many passwords and having to manage them that's not dumb enough. The most that people do is letting the browser store and complete passwords. The password can be the same 1234pass on every single site.


Web-based password manager user here! It's worth noting that Bitwarden and 1Password (probably all the others too) let you export all of your data into an encrypted archive, so anyone who does this periodically won't be "locked out".

(Naturally, this requires extra effort on the users' part, so who knows how many are actually using this ability.)


Even better: I run Vaultwarden on the cheapest AWS instance, using Tailscale. This lets me make S3 backups of the disk easily.


I set up automatic backups of WhatsApp to my self-hosted Nextcloud once. Since you need 'tested backups', I tried to decrypt these WhatsApp backups independent of my phone, but this was not possible. You need the original device. There are some hacks online, but they are always out of date.

I am tending now to running Mautrix Whatsapp bridge and backing up my data through this.


Ask yourself: if you want things to be encrypted by default in the world, would a florist be able to self-host Nextcloud?


Agreed. I am still unhappy, but perhaps this is entirely my problem.


Apple/Google passkeys.


Two problems: Apple. And Google.


Indeed, passkeys would seem to represent a step forward from single-device to single-account.


Passkeys are often stored/locked per device?


But then Apple or Google can control your access to any account that uses those passkeys. We need a protocol where I can store the same passkey on multiple cloud providers


My proposal: a device like a YubiKey, but instead of YubiKey-style USB hardware,

it comes in the form of a ring or bracelet, small enough that it can be carried everywhere with you, all the time.

It uses NFC-like technology, works without a battery, and is fast and "secure enough" for 99% of people.

What if the device is stolen? We can add authorization like biometrics (fingerprint etc.) while touching the device, so we can be sure the real owner is "giving" auth.


The problem is not a personal hardware security module, as you noted we have them. The problem is that people want redundancy that undermines the point. If you can easily have a copy of your ring just in case, how do you know who has done that process and watches you all the time? Biometrics sounds like a solution yet they are implemented as a cosmetic security layer and this situation is pointless to fix since we leave them everywhere we go.


If people want to copy, then let them copy.

- how is that secure?

We would allow only 1 device to be active at a time.

If you think a secure enclave with biometric security is "weak", then no one is secure.

If you think a combination of (fingerprint, DNA, blood variance, retina, star time + position, mental memory, etc.) is not enough, then no one is enough.

(We are assuming this is a future where we can access all this technology) << this is an important point here.

Also, if this is not enough, pfft (I don't want to go here): a Neuralink device that lives under your skin.


Maybe I'm old but I never expect chat history to be a permanent thing. It's like talking to someone, it should be ephemeral.

If you need a record, use email. Recording and archiving every conversation with someone is just weird.

Thanks for listening, now you dang kids can get off my lawn


There is absolutely no reason not to store and index text chats since they are so little data.

