Hacker News | tgummerer's comments

Especially C++ compilers have undefined behavior


Yes. But are you going to use 3 to 5 or more C++ compilers on a given project?

The worst part of these frontend issues is that, ultimately, the UX can become inconsistent, if not a mess. Given that nearly everyone interfaces via the visual UI, you'd think we'd have this sorted out by now.


If your project is a library, I'd expect that people are gonna use it with, like, maybe not 5 but probably 2 or 3 different compilers, and they're gonna be less diligent about upgrading their compiler versions than mainstream browser users.


Thanks. I wasn't thinking about it that way. Good point.

That said, the browser's impact on UX is essential. A lot more people use browsers than compilers. You'd think the UX would matter by now.


Does a single compiler work exactly the same on every architecture? :D


You're using the words wrong. The code may have undefined behaviour.

For example, a data race is UB. The compiler won't invent locks the programmer didn't write into code using raw threads; as a result, concurrent writes can trash memory.


As mentioned in another comment, protocol v2 was implemented by a Google employee, and they decided to write a blog post about it. This is not an official git announcement.


It's because a Google employee implemented protocol v2, and wrote a post about it.


> It just works and during all these years my system was never left broken after an update

I find that very hard to believe. While I am still running Arch (gotta love those shiny new features), updates did fairly frequently break something minor due to occasional incompatibilities, e.g. a missing symbol in a shared object.

Luckily there was only one time where something major broke which took me a few hours to fix.


> updates did break something minor fairly frequently due to occasionally incompatible updates, e.g. a missing symbol in a shared object.

That should never happen with official packages and full updates. I've seen it happen maybe twice in 10 years and it was fixed within an hour.

If you shoot your foot off by performing a partial update (pacman -Sy foo bar baz) or forget to rebuild your AUR packages, then it might happen. But that is on you, not the distro.


Are any?


Facebook is the closest.


They may be, but they're not using their AI knowledge to drive better features or experiences for their users. It seems like they're just using it to deliver better advertising.


Have you seen the f8 video about real-time photo/video enhancements via AI? They'd very much like to roll that out as soon as it works on most phones.


I think Amazon and Microsoft are closer.


Really?

Amazon recommendations are really bad and Echo is just answering predefined sentences.


I'll say openai, nvidia, and facebook are the 3 closest. Amazon and Microsoft wouldn't be close unless they choose to prioritize it.


Microsoft is actually doing quite well, Kaiming He was at MSRA before he left.


I'm sure in certain niches there are companies and people outperforming Google on AI stuff. Overall? None come to mind.


I would add Baidu as well


Before this keynote (and maybe after it too?), Alexa could do way more things than Google Assistant can.


Integrations != capability. I use an Echo Dot and it sucks at voice recognition half the time. I am dumping it very soon and getting a Home.


It's part of capability. Siri's almost useless because it's tied in with Apple services. Google's quite similar. I don't want a walled-garden assistant or to pick a different assistant for every possible service that could help me.


The article goes into more details on that. Basically the fire extinguishers are not powerful enough against lithium-ion battery fires.


If you can upgrade you can use "git stash -p" with pathspecs as well, so you can stash some changes in a specific file without having to go through changes in other files.
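A sketch of what that looks like (file names made up; pathspec support in `git stash push` needs git 2.13 or newer). It sets up a throwaway repo so the commands run standalone; add `-p` to additionally pick hunks interactively within the given paths.

```shell
# Throwaway repo with changes in two files; we stash only foo.c's change.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you
echo one > foo.c; echo two > bar.c
git add .; git commit -qm init

echo one-edited > foo.c; echo two-edited > bar.c

# Pathspec-limited stash: only foo.c is stashed, bar.c stays dirty.
git stash push -m "wip on foo" -- foo.c

cat foo.c          # back to committed content; its change is stashed
cat bar.c          # still edited; untouched by the stash
git stash pop -q   # bring foo.c's change back
```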


You don't really have to roll your own crypto to create such an app. There's always openssl and the signal protocol, which you'd only need to implement without designing anything.

Sure that can go wrong as anything can, but it's far from rolling your own crypto and makes things a lot easier.


I feel like this still ignores most of what Linus said on why git isn't broken. In particular "it's fairly trivial to detect the fingerprints of using this attack" in his Google+ post. https://plus.google.com/+LinusTorvalds/posts/7tp2gYWQugL

And there are already patches on the mailing list for that.


I think the fingerprint argument is pretty weak actually. There is still a lot of unreadable content in git repos, including binary blobs in the kernel.


You don't understand the fingerprint argument. For the specific SHA-1 attack, it's possible to detect, while calculating the SHA-1 hash of an object, whether the bit pattern indicative of this specific attack is present. This is done automatically, without needing any human intervention. This is one of the things which Google released immediately as part of their announcement.

The other thing which people seem to miss is that it requires 6,500 years of GPU computation for the _first_ phase of the SHA-1 attack, and 110 years of GPU computation for the _second_ phase. You need to do both phases in order to successfully carry out this attack. And even if you do, Google released code so that someone can easily tell if the object they were hashing was one created using this particular attack, which required 6,500 + 110 years of GPU computation.

But alas, it's a lot more fun to run around screaming that the sky is falling.....


Thanks, I was wrong when saying "fingerprinting". The fingerprinting technique is actually quite reassuring. I was thinking of where he says:

"But if you use git for source control like in the kernel, the stuff you really care about is source code, which is very much a transparent medium. If somebody inserts random odd generated crud in the middle of your source code, you will absolutely notice."

which I still think is a very weak argument.

It might or might not be true for any particular developer, and his argument does not refute the claim that the SHA-1 integrity checks for that code are being rendered useless. I specifically recall that Linus previously described the hashed chain of commits as something which would prevent malicious insertion of code. And this has now, at least to some degree, been compromised.

He did provide some solid countermeasures and migration plans, but I think he could have been more acknowledging of all the people who predicted this attack. It would have been a good idea to prepare for changing the hash function eventually.


Keep in mind you have to commit the initial blob first and only later the malicious one (again: this is no pre-image attack; the initial blob has to contain a well-designed spot with random-looking jazz ready to be replaced).

You could just place a malicious one from the get-go and no one would know (or they would know just as much -- blobs do rely on virtually unconditional trust).


There is already danger in accepting unreadable content by itself.


True. But I thought that the point of the hashes was to ensure that something which you had already verified (through review or testing or whatever) could not be tampered with without the changes being brought to your attention. And this property no longer holds.


Yeah, but in your case you would just get the binary, verify it and push it yourself.

If you're using some weird way of getting a binary that you have already verified, but that could somehow differ, and you're hoping that git will catch the difference, you're doing it wrong to begin with.


He doesn't mean manually recognising the fingerprint


Correct. He's talking about the automated method used on shattered.io to detect files which use the attack. See: https://github.com/cr-marcstevens/sha1collisiondetection

They're basically building that into git so that if this specific collision attack is ever used, git will notice and throw a warning/error.


Thanks, I misread that. I meant that he says

"But if you use git for source control like in the kernel, the stuff you really care about is source code, which is very much a transparent medium. If somebody inserts random odd generated crud in the middle of your source code, you will absolutely notice. "

, which I think is a very weak argument.


Please PoC a Git exploit then?


Sure, just wire me 200k.


If you're working on such a massively important git repo with security measures and trust levels so poor that a $200k break-in is practical... yeah, you probably have bigger problems.


I'm glad you're willing to discuss things so freely /s

